WO2023127870A1 - Care support device, care support program, and care support method - Google Patents

Care support device, care support program, and care support method

Info

Publication number
WO2023127870A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
evaluation
user
unit
tool
Prior art date
Application number
PCT/JP2022/048136
Other languages
French (fr)
Japanese (ja)
Inventor
侑也 髙久
Original Assignee
株式会社Sportip
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Sportip
Publication of WO2023127870A1 publication Critical patent/WO2023127870A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services
    • G06Q 50/22: Social work or social welfare, e.g. community support activities or counselling services
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising

Definitions

  • The present disclosure relates to a care support device, a care support program, and a care support method.
  • The above-mentioned technology aims to present a suitable rehabilitation menu based on information about the equipment and personnel of nursing care facilities, as well as the care level and dementia level of the user receiving care.
  • However, users' conditions differ completely from individual to individual, and it is difficult to present an appropriate rehabilitation menu to each user based only on the above information.
  • The present disclosure has been made in view of the above problems, and its purpose is to provide a technology that can easily and accurately analyze a user's body movement and support nursing care work.
  • One aspect is a nursing care support device that supports nursing care work, comprising: an evaluation unit that evaluates physical exercise based on first image information including the user's physical exercise; an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents it to a supporter; and a document information generation unit that generates document information based on at least the state of implementation of the training menu by the user.
  • FIG. 2 is a diagram showing a hardware configuration example of the user terminal 10.
  • FIG. 3 is a diagram showing a software configuration example of the user terminal 10.
  • FIG. 4 is a diagram showing a configuration example of physical information stored in the physical information storage unit 130.
  • FIG. 5 is a diagram showing a configuration example of an evaluation request transmitted by the evaluation request transmission unit 112 to the server device 20.
  • FIG. 6 is a diagram showing a configuration example of evaluation information received by the evaluation information receiving unit 113 from the server device 20.
  • FIG. 7 is a diagram showing a configuration example of an improvement measure request that the improvement measure request transmission unit 116 transmits to the server device 20.
  • FIG. 8 is a diagram showing a configuration example of improvement measure information.
  • FIG. 9 is a diagram showing a hardware configuration example of the server device 20.
  • FIG. 10 is a diagram showing a software configuration example of the server device 20.
  • FIG. 11 is a diagram showing a configuration example of image information stored in the image data storage unit 231.
  • FIG. 12 is a diagram showing a configuration example of reference information stored in the reference information storage unit 232.
  • FIG. 13 is a diagram showing a configuration example of evaluation condition information stored in the evaluation condition information storage unit 233.
  • FIG. 14 is a diagram showing a configuration example of improvement measure information stored in the improvement measure information storage unit 234.
  • FIG. 15 is a diagram showing an example of a video evaluation screen in the weightlifting mode.
  • FIGS. 16 to 19 are other diagrams showing examples of the video evaluation screen in the weightlifting mode.
  • A care support device according to an embodiment of the present invention has the following configurations.
  • [Item 1] A care support device that supports care work, comprising: an evaluation unit that evaluates physical exercise based on first image information including the user's physical exercise; an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents it to a supporter; and a document information generation unit that generates document information based on at least the state of implementation of the training menu by the user.
  • [Item 2] The care support device according to Item 1, further comprising an implementation status information acquisition unit that acquires implementation status information of the training menu based on at least second image information of the user, wherein the document information generation unit generates the document information based on at least the implementation status information.
  • [Item 3] The care support device according to Item 2, wherein the improvement measure information transmission unit reselects the training menu based on the first image information and the second image information.
  • [Item 4] A care support program that supports care work, the program causing a processor to execute: an evaluation step of evaluating physical exercise based on first image information including the user's physical exercise; an improvement measure information transmission step of selecting a suitable training menu from the evaluation of the physical exercise and presenting it to a supporter; and a document information generation step of generating document information based on at least the state of implementation of the training menu by the user.
  • [Item 5] A nursing care support method for supporting nursing care work, wherein a processor performs: an evaluation step of evaluating physical exercise based on first image information including the user's physical exercise; an improvement measure information transmission step of selecting a suitable training menu from the evaluation of the physical exercise and presenting it to a supporter; and a document information generation step of generating document information based on at least the state of implementation of the training menu by the user.
  • In this embodiment, a care support device evaluates the physical exercise of a care service user (hereinafter referred to as a user) and presents a suitable training menu, such as training or rehabilitation.
  • The care support device of the present embodiment analyzes, for example, an image of the user exercising (either a still image or a moving image; in the present embodiment, a moving image). It should be noted that physical exercise also includes exercise performed using tools. Furthermore, physical exercise includes exercise performed while receiving support from a supporter.
  • The care support device of the present embodiment identifies body parts and evaluates their movement based on their relative positional relationships. The care support device of the present embodiment may also identify tools and parts of tools from an image of a user exercising with tools, and evaluate the body's motion based on the absolute positions of those parts, the relative positional relationships among a plurality of different parts, and the relative positional relationships between the tool parts and the body parts.
  • FIG. 1 is a diagram showing an example of the overall configuration of a care support device according to this embodiment.
  • The care support device of this embodiment includes a user terminal 10 and a server device 20.
  • An imaging terminal 30 may also be included.
  • The user terminal 10, the server device 20, and the imaging terminal 30 are connected via a communication network 40 so as to be able to communicate with one another.
  • The communication network 40 is, for example, the Internet or a LAN (Local Area Network), and is constructed from a public telephone line network, a dedicated telephone line network, a mobile telephone line network, Ethernet (registered trademark), a wireless communication path, and the like.
  • The user terminal 10 is a computer operated by a user who performs physical exercise or by the user's supporter. Such supporters may include not only the user's family members but also trainers, physical therapists, caregivers, and others who guide, instruct, explain to, and support users who exercise.
  • The user terminal 10 is, for example, a smartphone, a tablet computer, a personal computer, or the like.
  • The user terminal 10 is equipped with an imaging device such as a camera, which can capture an image of the user's body during exercise. In this embodiment, it is assumed that a moving image of the user's body during exercise is transmitted from the user terminal 10 to the server device 20.
  • The user terminal 10 may also serve as the imaging terminal 30. Although only one user terminal 10 is shown in FIG. 1, it goes without saying that a plurality of user terminals may be provided.
  • The user terminal 10 may also be a sensor worn by the user (a wearable sensor in the form of clothing or footwear, or a sensor attached to clothing or a part of the body). In addition to exercise measurement, such a sensor may sense the amount of activity, amount of conversation, sleep time, pulse, UV exposure, pulse interval (PPI), skin temperature, heartbeat, and the like, and those data may be sent to the server device 20 via the communication network 40.
  • The server device 20 is a computer that evaluates physical exercise.
  • The server device 20 is, for example, a workstation, a personal computer, a virtual computer logically implemented by cloud computing, or the like.
  • The server device 20 receives the moving image captured by the user terminal 10 or the imaging terminal 30, analyzes the received moving image, and evaluates the physical exercise.
  • The server device 20 also makes proposals regarding improvement measures for physical exercise. Details of the evaluation of physical exercise and the proposal of improvement measures will be described later.
  • The imaging terminal 30 is, for example, a smartphone, a tablet computer, a personal computer, or the like.
  • The imaging terminal 30 includes an imaging device such as a camera, which can capture an image of the user's body during physical exercise. In the present embodiment, it is assumed that a moving image obtained by imaging the user's body during exercise is transmitted from the imaging terminal 30 to the server device 20.
  • The data of the user's image stored in the imaging terminal 30 may be input directly to the server device 20 by the user, the user's supporter, or a business operator using the care support device, or may be input via the communication network 40.
  • The supporter terminal 50 is, for example, a smartphone, a tablet computer, a personal computer, or the like.
  • The supporter terminal 50 may also analyze an image obtained by capturing the user's body movement.
  • In this embodiment, an image captured by the user terminal 10 is processed by each processing unit and stored by each storage unit, but the image may instead be captured by the imaging terminal 30.
  • The supporter terminal 50 may have functions equivalent to those of the user terminal 10, and may capture the user's body movement and make an improvement measure request in place of the user.
  • The supporter may use these functions of the supporter terminal 50 to present evaluation results and improvement measures to the user. Further, all the processing that the server device 20 performs for the user terminal 10 may be performed for the supporter terminal 50, and all the processing that the user terminal 10 performs with respect to the server device 20 may be performed by the supporter terminal 50.
  • FIG. 2 is a diagram showing a hardware configuration example of the user terminal 10.
  • The user terminal 10 includes a CPU 101, a memory 102, a storage device 103, a communication interface 104, a touch panel display 105, and a camera 106.
  • The storage device 103 is, for example, a hard disk drive, a solid state drive, or a flash memory, and stores various data and programs.
  • The communication interface 104 is an interface for connecting to the communication network 40, and includes, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for wireless communication, and a USB (Universal Serial Bus) connector and an RS232C connector for serial communication.
  • The touch panel display 105 is a device that inputs and outputs data.
  • The user terminal 10 may further include input devices such as a keyboard, a mouse, buttons, and a microphone, and output devices such as a speaker and a printer.
  • FIG. 3 is a diagram showing a software configuration example of the user terminal 10.
  • The user terminal 10 includes the following functional units: an imaging unit 111, an evaluation request transmission unit 112, an evaluation information receiving unit 113, an evaluation display unit 114, a checkpoint display unit 115, an improvement measure request transmission unit 116, an improvement measure information receiving unit 117, and an improvement measure information display unit 118; and the following storage units: a physical information storage unit 130, an image storage unit 131, an evaluation information storage unit 132, and an improvement measure storage unit 133.
  • Each of the above functional units is implemented by the CPU 101 of the user terminal 10 reading a program stored in the storage device 103 into the memory 102 and executing it, and each of the above storage units is implemented as part of the storage area provided by the memory 102 and the storage device 103 of the user terminal 10.
  • The imaging unit 111 captures images, including moving images, while the user is exercising. By controlling the camera 106, the imaging unit 111 can obtain a moving image of the user's body movement.
  • The user or the user's supporter places the user terminal 10 on a flat surface or mounts it on a wall, directs the optical axis of the camera 106 toward the place where the user exercises, and instructs it to start video recording.
  • The imaging unit 111 may operate the camera 106 in response to this instruction to obtain a moving image.
  • The imaging unit 111 stores the acquired moving image in the image storage unit 131.
  • The image storage unit 131 stores images captured by the imaging unit 111.
  • Although the image is a moving image in this embodiment, it is not limited to this.
  • The image storage unit 131 can store moving images, for example, as files.
  • The imaging unit 111 may capture an image of the user's physical exercise either at the time of initial evaluation or at the time the user performs a training menu presented by the functions of the server device 20, which will be described later. In either case, the image captured by the imaging unit 111 is stored in the image storage unit 131.
  • The physical information storage unit 130 stores information (hereinafter referred to as physical information) regarding the user's body, physical ability, and factors affecting the training effect.
  • FIG. 4 is a diagram showing a configuration example of physical information stored in the physical information storage unit 130.
  • The physical information includes height, weight, gender, dominant hand, arm length, leg length, hand size, finger length, grip strength, muscle strength, flexibility, shoulder strength, and the like.
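A physical information record like the one above could be modeled as a simple data structure. The sketch below is illustrative only; the field names and units are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PhysicalInfo:
    """Illustrative model of the physical information of FIG. 4 (fields/units are assumed)."""
    height_cm: float
    weight_kg: float
    gender: str
    dominant_hand: str
    arm_length_cm: float
    leg_length_cm: float
    grip_strength_kg: float

# Example record for a hypothetical user.
info = PhysicalInfo(165.0, 60.0, "female", "right", 70.0, 85.0, 22.0)
print(info.dominant_hand)  # right
```

A dataclass keeps the record self-describing, which helps when the same fields are later packed into an evaluation request.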
  • The evaluation request transmission unit 112 transmits to the server device 20 a request to evaluate physical exercise (hereinafter referred to as an evaluation request) based on the image captured by the imaging unit 111.
  • FIG. 5 is a diagram showing a configuration example of an evaluation request that the evaluation request transmission unit 112 transmits to the server device 20.
  • The evaluation request includes a user ID, a mode, physical information, and image data.
  • The user ID is information that identifies the user.
  • The mode is information indicating the exercise performed by the user. Modes can be, for example, "strength training", "joint range of motion training", "gait rehabilitation", "improvement of frailty (improvement of specific symptoms and medical conditions)", and the like. It is assumed that the mode is selected from predetermined options.
  • The physical information is the physical information stored in the physical information storage unit 130.
  • The image data is the data of the moving image acquired by the imaging unit 111.
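Assembling an evaluation request like FIG. 5 can be sketched as a plain mapping. The key names and the mode check are assumptions made for illustration; the disclosure only states that the mode is selected from predetermined options.

```python
# Predetermined mode options (names taken from the examples in the text).
ALLOWED_MODES = {
    "strength training",
    "joint range of motion training",
    "gait rehabilitation",
    "improvement of frailty",
}

def build_evaluation_request(user_id, mode, physical_info, image_bytes):
    """Assemble an evaluation request (key names are hypothetical)."""
    if mode not in ALLOWED_MODES:  # the mode must come from the predetermined options
        raise ValueError(f"unknown mode: {mode}")
    return {
        "user_id": user_id,
        "mode": mode,
        "physical_info": physical_info,
        "image_data": image_bytes,
    }

req = build_evaluation_request("U001", "gait rehabilitation", {"height_cm": 165}, b"")
print(req["mode"])  # gait rehabilitation
```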
  • The evaluation information receiving unit 113 receives information on the evaluation of physical exercise using a tool (hereinafter referred to as evaluation information) returned from the server device 20 in response to the evaluation request.
  • The evaluation information receiving unit 113 registers the received evaluation information in the evaluation information storage unit 132.
  • FIG. 6 is a diagram showing a configuration example of evaluation information received by the evaluation information receiving unit 113 from the server device 20.
  • The evaluation information includes the mode, user ID, tool position information, body part position information, posture information, motion information, and checkpoint information.
  • The user ID and mode are the user ID and mode included in the evaluation request.
  • The photographed image shows the body of the user performing the exercise indicated by the mode.
  • The tool position information covers each part of the tool (for example, for a bat used in baseball: the entire bat, both ends, the grip, the place where the ball meets the bat, the center of gravity, and any other arbitrary part).
  • The tool position information includes each part of the tool and the position of that part in association with a point on the time axis of the moving image. Based on the tool position information, the movement of the tool and its relationship with parts of the body can be displayed. That is, for example, a figure (such as a circle) indicating the part can be superimposed on the image at the position indicated by the tool position information.
  • Tool position information need not be included for a portion that connects two portions (for example, the midpoint between the portions where the barbell is gripped with the right and left hands). In this case, by connecting a pair of marks (such as circles) indicating the two predetermined portions with a line, the portion connecting them can be represented.
  • The tool position information may be included for every frame constituting the moving image, for each key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for random time points. If position information is not included for every frame, the figure can be displayed based on the position information at the most recent past time.
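The fallback rule above (when a frame has no position sample, draw from the most recent past time) is a last-observation-carried-forward lookup. A minimal sketch, assuming positions arrive as time-sorted `(time_sec, (x, y))` pairs, which is not a structure stated in the disclosure:

```python
def position_at(samples, t):
    """Return the position at time t, falling back to the most recent past sample.

    `samples` is a list of (time_sec, (x, y)) pairs sorted by time; returns
    None before the first sample, mirroring the rule that a figure is drawn
    from the most recent past position when a frame has no sample of its own.
    """
    latest = None
    for ts, pos in samples:
        if ts > t:
            break  # all remaining samples are in the future
        latest = pos
    return latest

samples = [(0.0, (10, 20)), (0.5, (12, 22)), (1.0, (15, 25))]
print(position_at(samples, 0.7))  # (12, 22)
```

The same lookup works whether the samples come per key frame, per arbitrary frame interval, or at random time points.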
  • The body position information indicates the position in the image of each part of the body (e.g., head, shoulders, elbows, waist, knees, ankles, etc.).
  • The body position information includes a part and the position of that part in association with a point on the time axis of the moving image.
  • Based on the body position information, the state of the body's skeleton (bones) can be displayed. That is, for example, a figure (such as a circle) indicating the part can be superimposed on the image at the position indicated by the body position information. Note that multiple part positions may be included for one point in time.
  • Position information need not be included for a part connecting two parts (for example, a forearm connecting a wrist and an elbow, or a thigh connecting a waist and a knee).
  • In this case, by connecting a pair of figures (such as circles) indicating the two parts with a line, the part connecting them can be expressed.
  • Position information may be included for every frame that constitutes the moving image, for each key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for random time points. If position information is not included for every frame, bones can be displayed based on the position information at the most recent past time.
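The idea that a connecting part (forearm, thigh) needs no position of its own, because it is drawn as a line between two tracked parts, can be sketched as follows. The bone pairs and part names are illustrative assumptions, not taken from the disclosure.

```python
# Pairs of body parts whose connecting line represents a "bone" (names are illustrative).
BONES = [("wrist", "elbow"), ("elbow", "shoulder"), ("waist", "knee"), ("knee", "ankle")]

def bone_segments(keypoints):
    """Given {part: (x, y)} for one frame, return drawable line segments.

    Bones with a missing endpoint are skipped, matching the idea that the
    connecting part is expressed only via its two tracked endpoints.
    """
    segments = []
    for a, b in BONES:
        if a in keypoints and b in keypoints:
            segments.append((keypoints[a], keypoints[b]))
    return segments

kp = {"wrist": (100, 200), "elbow": (120, 150), "knee": (80, 300)}
print(bone_segments(kp))  # [((100, 200), (120, 150))]
```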
  • The tool orientation information is information on the orientation of the tool used by the user and the direction in which each part of the tool faces.
  • The tool orientation information includes the part of the tool to be evaluated, a tool orientation value, an evaluation rank, an evaluation comment, and so on, in association with a point on the time axis of the moving image.
  • The tool orientation value is a value representing the orientation of the tool.
  • A tool orientation value can be, for example, the distance from the ground to a part of the tool, the distance between two parts of the tool, or an angle formed by parts of the tool (for example, between the first part of the tool and the part where the user is holding the tool).
  • An evaluation rank is a value that expresses the evaluation as a rank.
  • The evaluation rank is expressed, for example, on a five-point scale from 1 to 5, or as A, B, C.
  • The evaluation comment is a comment related to the evaluation of the tool's orientation. For example, if the mode is "upright row" and the right and left ends of the barbell are at different distances from the ground, an evaluation comment such as "different forces are applied to the left and right" may be included.
  • The tool movement information is information on the movement of the tool used by the user.
  • The tool movement information includes the part of the tool to be evaluated, a list of tool orientation values, an evaluation rank, and an evaluation comment in association with a period on the time axis of the moving image.
  • The list of tool orientation values is a time series of tool orientation values within that period.
  • The evaluation comment is a comment related to the evaluation of the tool's movement. For example, if the mode is "upright row" and there is not enough up-and-down movement of the barbell, an evaluation comment such as "the barbell is not lifted enough" may be included.
  • The posture information is information on the posture of the user's body.
  • The posture information includes the part to be evaluated, a posture value, an evaluation rank, and an evaluation comment in association with a point on the time axis of the moving image.
  • A posture value is a value representing a posture.
  • The posture value can be, for example, the distance from the ground to a part, the distance between two parts, or the angle of a joint part (the angle formed by a straight line from the first end part to the joint part and a straight line from the second end part to the joint part), and so on.
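The joint-angle posture value described above is the standard two-vector angle at the joint. A minimal sketch, with coordinates and part choices (waist, knee, ankle) as illustrative assumptions:

```python
import math

def joint_angle_deg(end1, joint, end2):
    """Angle in degrees at `joint` between the lines joint->end1 and joint->end2.

    E.g. the knee angle computed from the waist, knee, and ankle positions.
    """
    v1 = (end1[0] - joint[0], end1[1] - joint[1])
    v2 = (end2[0] - joint[0], end2[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    cos_a = max(-1.0, min(1.0, dot / (math.hypot(*v1) * math.hypot(*v2))))
    return math.degrees(math.acos(cos_a))

# A fully extended leg: waist, knee, and ankle on one vertical line -> 180 degrees.
print(joint_angle_deg((0, 0), (0, 50), (0, 100)))  # 180.0
```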
  • An evaluation rank is a value that expresses the evaluation as a rank.
  • The evaluation rank is expressed, for example, on a five-point scale from 1 to 5, or as A, B, C.
  • The evaluation comment is a comment related to the evaluation of the posture.
  • The body movement information is information on the movement of the user's body.
  • The body movement information includes the part to be evaluated, a list of posture values, an evaluation rank, and an evaluation comment in association with a period on the time axis of the moving image.
  • The list of posture values is the time series of posture values within that period.
  • The evaluation comment is a comment related to the evaluation of the movement. For example, if the mode is "lifting" and the knee extension is not smooth, an evaluation comment such as "knee movement is not smooth" may be included.
  • The relationship information indicates one or more positional relationships between the tool and the body.
  • The relationship information includes, in association with a point on the time axis of the moving image, the relationship between information such as the position, orientation, and movement of tool parts and information such as body parts, posture, and movement. For example, based on the tool position information and the body position information, the positional relationship between the two can be displayed. That is, for example, a figure (such as a circle) indicating the tool part can be superimposed on the image at the position indicated by the tool position information, and a figure (such as a circle) indicating the body part can be displayed at the position indicated by the body position information. Note that a plurality of tool parts and body part positions may be included for one point in time.
  • Position information need not be included for a line connecting a tool part and a body part (for example, the tip of the bat and the center point of the part where the user grips the bat). In this case, by connecting a pair of figures (such as circles) with a line, the connection between them can be represented.
  • The relationship information may be included for every frame constituting the moving image, for each key frame, for each checkpoint (described later), for every arbitrary number of frames, or for random time points. If position information is not included for every frame, the figures indicating the parts and the lines connecting them can be displayed based on the position information at the most recent past time.
  • The checkpoint information indicates the points (hereinafter referred to as checkpoints) at which the orientation of the tool and the posture of the body should be checked in the movement of the tool used by the user or in the series of the user's body movements.
  • For example, in the weightlifting mode, checkpoints include when the barbell reaches its highest position, when it reaches its lowest position, and when it is lifted.
  • If the mode is "pitching", checkpoints include when the foot is lifted, when the lifted foot is put down and the weight is shifted, when the ball is released, and so on.
  • The checkpoint information stores information identifying a checkpoint (hereinafter referred to as a checkpoint ID) in association with a point on the time axis of the moving image. That is, it is possible to identify the frame (still image) of the moving image in which the checkpoint indicated by the checkpoint ID appears.
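Since each checkpoint ID is associated with a point on the video's time axis, locating its frame reduces to a time-to-frame-index conversion. A sketch under the assumption of a constant frame rate and a `{checkpoint_id: time_sec}` mapping (neither structure is stated in the disclosure):

```python
def checkpoint_frames(checkpoints, fps):
    """Map checkpoint IDs to frame indices in the moving image.

    `checkpoints` is {checkpoint_id: time_sec}; the frame holding a
    checkpoint is taken as the one nearest that time point.
    """
    return {cp_id: round(t * fps) for cp_id, t in checkpoints.items()}

# Hypothetical barbell checkpoints in a 30 fps weightlifting video.
cps = {"highest_position": 2.5, "lowest_position": 4.0}
print(checkpoint_frames(cps, 30))  # {'highest_position': 75, 'lowest_position': 120}
```

The checkpoint display unit 115 would then read the frame at that index and show it as a still image.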
  • the evaluation display unit 114 displays evaluation information. For example, the evaluation display unit 114 superimposes the tool position, tool orientation, tool movement information, body position, posture, body movement information, etc. included in the evaluation information on the animation, and displays the parts of the tool and the body. By displaying figures representing body parts (for example, circles representing the ends of tools, the center of gravity of the body, lines connecting them, etc.), movements of tool parts and body parts can be superimposed on videos. .
  • the evaluation display unit 114 can graphically display chronological changes in the position of the part of the tool, the direction of the tool, the movement of the tool, the position of the part of the body, the posture, the movement of the body, and the like. can.
  • the evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the display of the moving image based on the tool orientation information and the tool movement information included in the evaluation information. For example, the evaluation display unit 114 displays the tool orientation information when the playback time of the moving image reaches the vicinity of the time point included in the tool orientation information (for example, it can be set to any length such as around 5 seconds). You can view the rating rank and rating comments contained in the . Moreover, the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the tool movement information when the reproduction time of the moving image comes within the period included in the tool movement information. The evaluation display unit 114 can also display the posture value included in the posture information. In addition, the evaluation display unit 114 can graphically display changes in tool orientation values over time based on the list of tool orientation values included in the tool movement information.
  • the evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the display of the moving image based on the posture information and motion information included in the evaluation information. For example, the evaluation display unit 114 displays the playback time included in the posture information when the playback time of the moving image reaches the vicinity of the point in time included in the posture information (for example, it can be set to an arbitrary length such as around 5 seconds). You can view the evaluation rank and evaluation comments that have been submitted. Moreover, the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the motion information when the playback time of the moving image comes within the period included in the motion information. The evaluation display unit 114 can also display the posture value included in the posture information.
  • the evaluation display unit 114 can display chronological changes in posture values in a graph based on the list of posture values included in the motion information.
• these evaluation ranks and evaluation comments may be displayed together with the evaluation rank and the evaluation comment displayed in accordance with the display of the moving image based on the tool orientation information and the tool movement information described above.
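The timing rule described above (show a point-in-time comment when playback is near the annotated time, and a period comment while playback is inside the period) can be sketched as follows. This is an illustrative sketch only; the function and field names are hypothetical and not part of the disclosed system.

```python
def annotations_to_show(playback_t, point_annotations, period_annotations, window=5.0):
    """Return the (rank, comment) pairs that should be visible at playback_t.

    point_annotations: list of (time_point, rank, comment), e.g. posture info.
    period_annotations: list of (start, end, rank, comment), e.g. motion info.
    `window` is the "around 5 seconds" vicinity mentioned in the text.
    """
    visible = []
    for t, rank, comment in point_annotations:
        if abs(playback_t - t) <= window / 2:  # within the vicinity of the time point
            visible.append((rank, comment))
    for start, end, rank, comment in period_annotations:
        if start <= playback_t <= end:         # playback time falls inside the period
            visible.append((rank, comment))
    return visible
```

Both kinds of annotation are returned together, which matches the idea that point-based and period-based comments may be displayed simultaneously.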
  • the checkpoint display unit 115 can extract and display checkpoint images from the video.
  • the checkpoint display unit 115 can read a frame corresponding to the time point included in the checkpoint information from the moving image data stored in the image storage unit 131 and display it as a still image. Also, the checkpoint display unit 115 may, for example, extract and display only body parts from the read frames.
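Reading the frame corresponding to a checkpoint time amounts to mapping a time in seconds to a frame index. A minimal sketch of that mapping (with, e.g., OpenCV, one would then seek to the index via `CAP_PROP_POS_FRAMES` and read the frame); the function name is hypothetical:

```python
def checkpoint_frame_index(time_point_s, fps, frame_count):
    """Map a checkpoint time (seconds) to the frame index to read from the video.

    The frame nearest round(time_point * fps) is the still image to display;
    the index is clamped to the valid range of the clip.
    """
    idx = round(time_point_s * fps)
    return max(0, min(idx, frame_count - 1))
```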
• the remedy request transmission unit 116 transmits to the server device 20 a request for acquiring a remedy regarding tool handling or physical exercise (hereinafter referred to as a remedy request).
  • FIG. 7 is a diagram showing a configuration example of an improvement request.
  • the improvement request includes a user ID, mode, purpose, and the like.
• the purpose is the objective for which the user seeks improvement.
• the purpose can be, for example, "increase joint range of motion", "increase muscle strength", "stabilize walking", "improve symptoms/medical condition", and the like.
• the purpose may be selected from predetermined options, and the symptoms and medical conditions may likewise be selected from predetermined options.
  • the improvement plan information receiving unit 117 receives information about the improvement plan (hereinafter referred to as improvement plan information) transmitted from the server device 20 in response to the improvement plan request.
• the improvement plan information receiving unit 117 stores the received improvement plan information in the improvement plan storage unit 133.
  • FIG. 8 shows a configuration example of improvement measure information.
• the improvement plan information includes the purpose, advice, reference information, a training menu (the menu content, the appropriate intensity of training, the appropriate number of repetitions, a recommended posture such as standing or sitting, and the like, adapted to the individual's physical information), and the number of times performed.
  • the advice is assumed to be a character string representing the improvement measure, but may be content that presents the improvement measure using an image, video, or the like.
• the reference information is the suitable orientation and movement of the tool (the position, orientation, movement, speed, etc. of each portion) and the suitable body posture (the position, angle, etc. of each part).
• the improvement measure information receiving unit 117 may receive, even when no improvement measure request has been made, improvement measure information transmitted by the improvement measure information transmission unit 216 based on the evaluation result and the reference value.
  • the improvement plan information display unit 118 displays the improvement plan.
• the remedy information display unit 118 displays the advice included in the remedy information. Further, when the remedy information includes a suitable position and angle of a part, the remedy information display unit 118 may extract, from the moving image information, an image of a frame in which the part is at the suitable angle and display it on the user terminal 10.
  • FIG. 9 is a diagram showing a hardware configuration example of the server device 20.
• the server device 20 includes a CPU 201, a memory 202, a storage device 203, a communication interface 204, an input device 205, and an output device 206.
• the storage device 203 is, for example, a hard disk drive, solid state drive, or flash memory that stores various data and programs.
• the communication interface 204 is an interface for connecting to the communication network 40, and includes, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, a USB (Universal Serial Bus) connector, and an RS232C connector for serial communication.
  • the input device 205 is, for example, a keyboard, mouse, touch panel, button, microphone, etc. for inputting data.
  • the output device 206 is, for example, a display, printer, speaker, or the like that outputs data.
  • FIG. 10 is a diagram showing a software configuration example of the server device 20.
• the server device 20 includes an evaluation request reception unit 211, an image analysis unit 212, an evaluation unit 213, an evaluation information transmission unit 214, an improvement measure request reception unit 215, an improvement measure information transmission unit 216, an implementation status information acquisition unit 217, a document information generation unit 218, an image data storage unit 231, a reference information storage unit 232, an evaluation condition information storage unit 233, an improvement measure information storage unit 234, an implementation status information storage unit 235, a template storage unit 236, and a generated document storage unit 237.
• each of the functional units described above is implemented by the CPU 201 provided in the server device 20 reading a program stored in the storage device 203 into the memory 202 and executing the program, and each of the storage units is implemented by part of the storage area provided by the storage device 203.
  • the evaluation request receiving unit 211 receives an evaluation request transmitted from the user terminal 10.
• the evaluation request reception unit 211 registers information including the image data included in the received evaluation request (hereinafter referred to as image information) in the image data storage unit 231.
• FIG. 11 is a diagram showing a configuration example of image information stored in the image data storage unit 231. As shown in the figure, the image information includes the image data in association with a user ID indicating the user whose image was captured; the image data is that included in the evaluation request.
• the reference information storage unit 232 stores information (hereinafter referred to as reference information) including reference values related to tool positions, tool movements (tool orientations, movements, etc.), postures (positions and angles of parts, etc.), and body movements derived from the relationship between the tool and the body.
• FIG. 12 is a diagram showing a configuration example of reference information stored in the reference information storage unit 232.
• the reference information includes, but is not limited to, reference information on the absolute position of the tool and how the tool moves (moving speed, moving distance, direction of movement, etc.), reference information on the absolute position of a body part or its position relative to another part or other reference object (hereinafter referred to as position reference information), reference information on the angle formed by the straight lines connecting each of two parts to a joint part, for three parts including the joint part (hereinafter referred to as angle reference information), and information on the relationship between a portion of the tool and a part of the body.
  • the tool position reference information includes the part of the tool and the reference position of that part in association with the mode and checkpoint ID.
• the vertical position may be, for example, the height above the ground or the distance from either foot; the distance from a body part or from a line connecting body parts, such as the distance between the line connecting both shoulders and the shaft, may also be used. The horizontal position of the portion may be the distance from a predetermined reference object (for example, a mound plate or a mark on the floor) or the distance from a reference part such as the shoulder, chest, or leg. It is assumed that the tool position reference information is registered in advance.
• the tool movement reference information includes, in association with the mode and checkpoint ID, reference values for information such as the movement speed and movement distance of portions of the tool, the direction of movement at a certain point in time, and the trajectory of movement during a certain period of time.
• the position reference information includes, in association with the mode and checkpoint ID, the part and the reference position of the part; there may be multiple parts. As for the position, the vertical position may be, for example, the height from the ground or the distance from either foot, and the horizontal position may be the distance from a predetermined reference object (for example, a mound plate or a mark on the floor) or the distance from a reference part such as the shoulder, chest, or leg. It is assumed that the position reference information is registered in advance.
• the angle reference information includes, in association with the mode and checkpoint ID, two parts (part 1 and part 2), one joint part, and the reference value of the angle formed by the straight line connecting part 1 and the joint part and the straight line connecting part 2 and the joint part.
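The angle described by the angle reference information (the angle at the joint between the two straight lines) can be computed from the detected coordinates of the three parts. A minimal sketch, assuming 2-D image coordinates and a hypothetical function name:

```python
import math

def joint_angle_deg(part1, joint, part2):
    """Angle (degrees) at `joint` between the lines joint->part1 and joint->part2.

    Each argument is an (x, y) coordinate of a detected part, e.g.
    part1 = wrist, joint = elbow, part2 = shoulder for the elbow angle.
    """
    v1 = (part1[0] - joint[0], part1[1] - joint[1])
    v2 = (part2[0] - joint[0], part2[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp against rounding error
    return math.degrees(math.acos(cos))
```

The computed angle would then be compared with the reference value stored for the corresponding mode and checkpoint ID.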
  • the relational reference information includes information on the reference represented by the relation between the part of the tool and the part of the body in association with the mode and checkpoint ID.
  • the relational reference information includes information obtained from movement speed, movement distance, angle, etc. in one or more parts and regions in association with the mode and checkpoint ID. For example, when the mode is batting, the reference information includes the movement speed of the tip of the bat at the time of contact with the ball, the angle formed by the bat and the dominant arm holding the bat, and the like.
  • the evaluation condition information storage unit 233 stores information for evaluation (hereinafter referred to as evaluation condition information).
  • FIG. 13 is a diagram showing a configuration example of evaluation condition information stored in the evaluation condition information storage unit 233.
  • the evaluation condition information includes categories, conditions, evaluation ranks, and comments.
  • Category is the category of evaluation. Categories may include, for example, "muscle strength", “range of motion”, and "endurance”.
  • the conditions include the position, orientation, or movement of each part of the tool in the image (change in position in time series), or the position or movement of each part of the body (change in position in time series).
• for example, when analyzing a weightlifting movement, a checkpoint at the moment of lifting the barbell, conditions such as the elbow angle and the arm extension speed, and the shaft movement and up-and-down speed during the period of lifting and lowering the barbell can be set in the evaluation condition information. Also, when analyzing a pitching form, conditions such as the angle of the elbow and the line speed of the arm can be set in the evaluation condition information for the checkpoint of releasing the ball.
  • the evaluation rank is an evaluation value when the above conditions are satisfied.
  • the comment is a description of the body's posture and movement when the above conditions are met.
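The category/condition/rank/comment structure of Fig. 13 can be sketched as a simple lookup: each entry pairs a predicate over measured quantities with the rank and comment to return when the predicate holds. The field names and example values below are illustrative assumptions, not the actual schema:

```python
def evaluate(measured, evaluation_conditions):
    """Return (category, rank, comment) for every condition the measurement satisfies.

    measured: dict of measured quantities, e.g. {"elbow_angle": 168}.
    evaluation_conditions: list of dicts with "category", "condition" (a
    predicate over the measured dict), "rank", and "comment".
    """
    results = []
    for cond in evaluation_conditions:
        if cond["condition"](measured):
            results.append((cond["category"], cond["rank"], cond["comment"]))
    return results
```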
  • the image analysis unit 212 (part/part identification unit) analyzes the image data.
  • the image analysis unit 212 analyzes the image data, extracts the feature amount of each part of the tool and each part of the body, and specifies the position of each part and each part in the image. Also, the image analysis unit 212 analyzes the image data, extracts the feature amount of each part of the tool, and identifies the direction in which each part faces. It should be noted that the image analysis method by the image analysis unit 212 is assumed to employ a general method, and detailed description thereof will be omitted here.
• the image analysis unit 212 may analyze the image data for each frame or key frame, may analyze the image data for each checkpoint, or may analyze the image data at random timing.
• the image analysis unit 212 also compares the position of each part extracted from the image data with the position reference information stored in the reference information storage unit 232 for each checkpoint ID, and identifies the closest time as the checkpoint time.
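Identifying the checkpoint time as "the time whose detected position is closest to the reference" can be sketched as a nearest-match search over the per-frame positions of one tracked part. Names and the squared-distance metric are assumptions for illustration:

```python
def find_checkpoint_time(frames, reference_position):
    """Pick the frame whose detected part position is closest to the reference.

    frames: list of (time_s, (x, y)) pairs for one tracked part.
    reference_position: (x, y) from the position reference information.
    Returns the time of the best-matching frame, i.e. the checkpoint time.
    """
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    best_time, _ = min(frames, key=lambda f: dist2(f[1], reference_position))
    return best_time
```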
  • the user's physical exercise in the present embodiment also includes what the user does with the assistance of a supporter.
  • the image analysis unit 212 may capture the user's body part or the movement of the tool associated with the user's body movement from the image capturing such body movement.
  • the image analysis unit 212 may detect a person by analyzing image data and extracting a feature amount, and after dividing an area for each person, specify each part and each part of the user.
  • the image analysis unit 212 may perform posture estimation after estimating the region of each person in units of pixels using instance segmentation.
  • the image analysis unit 212 distinguishes between the user and the supporter and analyzes the posture of the user.
• the user may be determined to be the person who appears largest in the moving image, or the person appearing closer to the center of the moving image may be determined to be the user.
• the image analysis unit 212 may determine the user based on markers attached to the user's clothing, body surface, hair, etc.; conversely, the marker may be attached to the supporter so that the other person is determined to be the user.
  • the image analysis unit 212 may determine that a person holding or wearing a device used for training or the like is a user. Also, the image analysis unit 212 may identify the user using a general face authentication technique. In addition, the image analysis unit 212 may determine the user by recognizing the features of the user (for example, an elderly person, a care recipient, etc.). In this case, for example, the image analysis unit 212 generates a determination model that has learned the walking and movement characteristics of the elderly and people with problems in specific parts, and uses the model to determine the characteristics of the elderly and care recipients.
• the image analysis unit 212 may analyze the postures of all the people appearing in the moving image, then present the user terminal or the supporter terminal with a function for selecting the user, and accept a selection to identify the user. Note that the user determination methods described above are not limited to the case where both the user and the supporter are included in the image, and may also be used to determine the user in other cases.
  • the image analysis unit 212 identifies the parts of the supporter without mistaking them for the parts of the user even when the user and the supporter are included in the image and the two overlap on the image.
  • the image analysis unit 212 may, for example, output a plurality of joint point candidates for each part and group which candidates belong to the same person as other part candidates in post-processing. Further, the image analysis unit 212 may acquire depth using a depth sensor, and group joint points with those of the same person based on the depth information.
• the image analysis unit 212 may analyze images acquired from a plurality of imaging terminals, check whether there is any contradiction in the same part among the parts determined from each image, and integrate them to suppress false detection.
• the image analysis unit 212 may receive, from the user's terminal or the supporter's terminal, a manual designation by the user or the supporter of the detected joint points, and group them with those of the same person. In addition, the image analysis unit 212 may group the detected parts as those of the same person using physical characteristics such as the joint length between the specified joint points, the range of motion of the joints (restrictions on the joint angles), and the degree of bending of the waist. The image analysis unit 212 may also track each joint using time-series information (with the supporter kept away at the start of imaging). Furthermore, the user or supporter may wear a marker such as a glove on limbs that are likely to be detected incorrectly, and the movement may be imaged with the imaging terminal.
• the images captured in this way may be used by the image analysis unit 212 for grouping joint points or parts.
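One of the grouping approaches mentioned above, using depth information to assign joint points to the same person, can be sketched as a simple clustering by depth. The threshold and data layout are hypothetical assumptions for illustration:

```python
def group_joints_by_depth(joint_candidates, depth_gap=0.5):
    """Group detected joint points into per-person sets using depth.

    joint_candidates: list of (joint_name, depth_m). Candidates whose depths lie
    within `depth_gap` metres of a group's mean depth are assumed to belong to
    the same person (e.g. the user in front, the supporter behind).
    Returns a list of groups, each a list of (joint_name, depth_m).
    """
    groups = []
    for name, depth in sorted(joint_candidates, key=lambda c: c[1]):
        for group in groups:
            mean = sum(d for _, d in group) / len(group)
            if abs(depth - mean) <= depth_gap:
                group.append((name, depth))
                break
        else:
            groups.append([(name, depth)])  # no nearby group: start a new person
    return groups
```

In practice this would be combined with the other cues mentioned (bone-length constraints, joint range of motion, tracking over time) rather than used alone.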
  • the evaluation unit 213 evaluates the movement of the tool used by the user based on the image data.
• the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information including conditions satisfied by the position and motion of each portion of the tool specified from the image data, and if there is such evaluation condition information, acquires the evaluation rank and comment included in it. Note that the evaluation unit 213 may evaluate the movement of the tool and count the number of body movements.
  • the evaluation unit 213 evaluates the movement of the user's body based on the image data.
  • the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information including conditions satisfied by the position and movement of each part specified from the image data, and searches the evaluation condition information that satisfies the conditions. Get the rating rank and comments contained in the information, if any. Note that the evaluation unit 213 may evaluate the body movement and count the number of body movements.
  • the evaluation unit 213 evaluates the tool used by the user and the movement of the user's body based on the image data.
• the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information including conditions satisfied by the positions of each portion of the tool and each part of the body specified from the image data, and by the movement of, or relationship between, the portions and the parts; if there is evaluation condition information whose conditions are satisfied, the evaluation rank and comment included therein are acquired.
  • the evaluation unit 213 may count the number of physical exercises by evaluating the movement of the tool and the body.
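Counting physical exercises from the movement of a tool or body part, as mentioned above, can be sketched as threshold crossing with hysteresis on a one-dimensional position series. The thresholds and names are hypothetical:

```python
def count_repetitions(positions, low, high):
    """Count repetitions of an exercise from a 1-D position time series.

    One repetition is counted each time the tracked value (e.g. the height of
    the barbell shaft or of the wrist) rises above `high` after having been
    below `low`. The two thresholds act as hysteresis, so jitter around a
    single threshold is not counted as extra repetitions.
    """
    reps = 0
    armed = False  # becomes True once the value has dropped below `low`
    for value in positions:
        if value < low:
            armed = True
        elif value > high and armed:
            reps += 1
            armed = False
    return reps
```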
  • the evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10.
  • the evaluation information transmission unit 214 generates tool position information including the points in time on the time axis of the moving image specified by the image analysis unit 212 and the positions of each part of the tool.
• if the position or orientation of a portion of the tool satisfies a condition, tool orientation information including the time point, the portion, the tool orientation value, and the evaluation rank and comment acquired by the evaluation unit 213 is generated; if the movement of the portion (chronological position change) satisfies a condition, tool movement information including a list of time points, portions, and tool orientation values, together with the evaluation rank and comment, is generated.
  • the evaluation information transmission unit 214 generates checkpoint information including the time point corresponding to each checkpoint analyzed by the image analysis unit 212 and the checkpoint ID indicating the checkpoint.
• the evaluation information transmission unit 214 creates evaluation information including the generated tool position information, tool orientation information, tool movement information, and checkpoint information, and transmits the evaluation information to the user terminal 10.
  • the evaluation unit 213 and the evaluation information transmission unit 214 can correspond to the comment output unit of the present invention.
  • the evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10.
  • the evaluation information transmission unit 214 generates position information including the time point on the time axis of the moving image specified by the image analysis unit 212 and the position of each part.
• if the position of a part satisfies a condition, posture information including the time point, the part, the posture value, and the evaluation rank and comment is generated; if the movement of the part (chronological position change) satisfies a condition, motion information including a list of time points, parts, and posture values, together with the evaluation rank and comment, is generated.
  • the evaluation information transmission unit 214 generates checkpoint information including the time point corresponding to each checkpoint analyzed by the image analysis unit 212 and the checkpoint ID indicating the checkpoint.
• the evaluation information transmission unit 214 creates evaluation information including the generated position information, posture information, motion information, and checkpoint information, and transmits the evaluation information to the user terminal 10.
  • the evaluation unit 213 and the evaluation information transmission unit 214 can correspond to the comment output unit of the present invention.
  • the improvement plan information storage unit 234 stores information related to improvement plans (hereinafter referred to as improvement plan information).
  • FIG. 14 is a diagram showing a configuration example of improvement plan information stored in the improvement plan information storage unit 234.
  • the improvement plan information includes advice in association with purposes, categories and conditions.
• the conditions may be conditions for the tool itself (the weight of the barbell, etc.), for how the tool is used, for the body (flexibility, etc.), for the position, orientation, or movement of portions of the tool, or for the position or movement of parts of the body.
• the improvement request reception unit 215 receives the improvement request sent from the user terminal 10.
• the improvement request reception unit 215 may also receive the improvement request from the supporter terminal 50.
• the improvement measure information transmission unit 216 searches, among the improvement measure information corresponding to the mode and purpose included in the improvement request, for entries whose conditions, such as the position, orientation, and movement of each portion and each part, are satisfied by the user's physical information included in the evaluation request and by the items specified by the image analysis unit 212.
  • the improvement measure information transmission unit 216 acquires the advice of the searched improvement measure information, creates the improvement measure information in which the purpose and the advice are set, and responds to the user terminal 10 with the created improvement measure information.
• the improvement measure information transmission unit 216 also includes in the improvement measure information the position, orientation, speed, angle, and the like of each portion and each part included in the reference information. Note that the improvement measure information transmission unit 216 may search for an improvement measure based on the evaluation information and the reference value without an improvement measure request, and may transmit the improvement measure to the user terminal 10.
  • the improvement measure information transmission unit 216 may transmit to the user terminal 10 the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or purpose (desires, goals, needs, etc.).
• the improvement measure information transmission unit 216 may search, among the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (hopes, goals, needs, etc.), for entries whose conditions are satisfied by the user's physical information included in the evaluation request, in particular information such as ADL, joint range of motion, degree of support required, and degree of care required, or by the position, orientation, movement, etc. of each portion and each part specified by the image analysis unit 212.
• the improvement measure information transmission unit 216 can search, among the improvement measure information corresponding to the user's motor function, life function, and cognitive function, for entries whose conditions are satisfied by the user's physical information included in the evaluation request, in particular information such as ADL, joint range of motion, degree of support required, and degree of care required, and by the position, orientation, movement, etc. of each portion and each part specified by the image analysis unit 212.
• the improvement measure information transmission unit 216 may search, among the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (hopes, goals, needs, etc.) and to the user's motor function, life function, and cognitive function, for entries whose conditions are satisfied by the user's physical information included in the evaluation request, in particular information such as ADL, joint range of motion, degree of support required, and degree of care required, and by the position, orientation, and movement of each portion and each part specified by the image analysis unit 212.
• the improvement measure information transmission unit 216 may search, among the improvement measure information corresponding to the mode (symptoms, disease names, conditions, etc.) and/or the purpose (desires, goals, needs, etc.) and to low-evaluation items among the user's motor functions, life functions, and cognitive functions, for entries whose conditions are satisfied by the user's physical information included in the evaluation request, in particular information such as ADL, joint range of motion, degree of support required, and degree of care required, and by the position, orientation, and movement of each portion and each part specified by the image analysis unit 212.
• the implementation status information acquisition unit 217 acquires, from the user terminal 10, the implementation status of the improvement measures (a training menu such as a rehabilitation menu or an exercise menu) transmitted to the user terminal 10, and stores it in the implementation status information storage unit 235.
• the implementation status information acquisition unit 217 presents, on the user terminal 10, a form or the like for inputting the implementation date, the number of implementations, etc. for each training menu, and acquires the input information from the user and the supporter as implementation status information.
• the implementation status information acquisition unit 217 may acquire an image captured by the user terminal 10 and have the image analysis unit 212 analyze the image to determine whether or not the specified training menu has been implemented, and may acquire information on the number of implementations.
• the image analysis unit 212 stores, for each training menu, information on how each part of the body or each portion of the tool moves, and by comparing it with the parts obtained by analyzing the image of the user and the movement of those parts, can analyze whether or not the specified training menu has been implemented and count the number of times the movement of each portion and each part has been repeated. Further, the evaluation unit 213 may evaluate the image to determine whether or not the training was effective, and the evaluation information transmission unit 214 may display the evaluation result on the user terminal 10.
  • the document information generation unit 218 generates documents to be submitted to departments related to health and welfare established in each local government, etc., in order to receive the application of nursing care insurance and the payment of subsidies in the nursing care business.
• such documents include documents for applying for and giving notifications under long-term care insurance, including documents related to designation applications, documents related to remuneration claims, and documents related to instruction audits, as well as documents related to the ADL maintenance addition, the living function improvement cooperation addition, the scientific nursing care promotion addition, and the like, but are not limited to these.
• the documents may also include documents created and stored by the organization to which the supporter belongs, such as handovers, transfers, charts (the user's condition, monitoring, etc.), training implementation records, diaries, care plans, assessment results, various plans, care provision charts, and nursing care benefit statements (records leading to remuneration claims).
• the document information generation unit 218 may acquire information on the template of the document to be generated, which is stored in the template storage unit 236, pour into the template the corresponding information on the user, the content of care provided, and the training implementation status (implementation content, number of times, etc.) stored in the implementation status information storage unit 235, and output the result to the supporter terminal 50 as a Word file, PDF file, or the like.
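Pouring user and implementation-status fields into a document template can be sketched with standard string templating. The field names (`user_name`, `menu`, `count`) and the template text are illustrative assumptions, not the actual template fields of the system:

```python
from string import Template

def fill_document(template_text, user_info, implementation):
    """Pour user information and implementation-status fields into a document
    template, in the spirit of the document information generation unit 218.

    safe_substitute leaves any template field we do not supply untouched
    instead of raising, which suits partially filled documents.
    """
    return Template(template_text).safe_substitute(
        user_name=user_info["name"],
        menu=implementation["menu"],
        count=str(implementation["count"]),
    )
```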
• the document information generation unit 218 may cooperate with a management system or the like and store relevant information, such as the user information, implementation details, and number of implementations stored in the implementation status information storage unit 235, in the database of the management system.
  • FIG. 15 is a diagram showing an example of the flow of processing executed by the care support device of this embodiment.
  • the imaging unit 111 of the user terminal 10 receives mode input, images the user's body during exercise, and acquires video data (S321).
  • the evaluation request transmission unit 112 transmits to the server device 20 an evaluation request including the user ID indicating the user, the received mode, the physical information and the video data (S322).
  • the image analysis unit 212 analyzes the moving image data to extract feature amounts (S323), and specifies the position of each part and each part (S324).
• the image analysis unit 212 may specify the position on the image, or may use the physical information to specify the actual position (the height from the ground, the distance from a reference point such as the center of gravity of the body, etc.).
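Using the physical information to convert an image position into an actual position can be sketched by taking the user's known height as the scale. This is a hypothetical simplification assuming an upright user and negligible perspective distortion:

```python
def to_real_height(pixel_y, ground_pixel_y, user_pixel_height, user_height_m):
    """Convert a vertical image coordinate to an approximate height above
    the ground in metres, using the user's known height as the scale.

    pixel_y grows downward in image coordinates, so the height is measured
    upward from the ground line at ground_pixel_y.
    """
    metres_per_pixel = user_height_m / user_pixel_height
    return (ground_pixel_y - pixel_y) * metres_per_pixel
```

For example, a 1.7 m user spanning 340 pixels gives a scale of 0.005 m/pixel, so a point 200 pixels above the ground line is about 1.0 m high.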
  • the evaluation unit 213 acquires an evaluation rank and a comment from the evaluation condition information whose conditions are satisfied by the position of each part, the orientation of each part, and the movement of each part (change in position over time) (S325).
  • the evaluation information transmission unit 214 creates evaluation information and transmits it to the user terminal 10 (S326).
  • the evaluation display unit 114 of the user terminal 10 displays the position, orientation, movement, etc. of the tool on the video data based on the received evaluation information (S327).
  • the evaluation display unit 114 of the user terminal 10 may display the position (bone) of each part indicating the posture of the body, as well as the evaluation rank and comments (S327).
  • the evaluation display unit 114 may graphically display the position, orientation, movement, etc. of the part, and time-series changes in the position and movement of the part.
  • the checkpoint display unit 115 may extract and display images of checkpoints from the moving image.
  • the remedy request transmission unit 116 transmits the remedy request to the server device 20 according to the instruction from the user (S328).
  • the improvement measure information transmission unit 216 searches for improvement measure information that satisfies the conditions, acquires the advice included in that improvement measure information (S329), and creates improvement measure information including the acquired advice and transmits it to the user terminal 10 (S330).
  • the improvement measure information display unit 118 displays the advice included in the received improvement measure information, and can superimpose the preferred tool usage on the video data (S331).
  • the improvement measure information display unit 118 displays the advice included in the received improvement measure information, and also displays the preferred body posture in the form of bones (S331).
  • the imaging unit 111 of the user terminal 10 captures an image of the user's body while exercising, and acquires other moving image data.
  • the user terminal 10 may transmit, via the evaluation request transmission unit 112, an evaluation request including the user ID indicating the user, the received mode, the physical information, and the video data to the server device 20, or may transmit only the moving image data capturing the body movement to the server device 20 (S332).
  • the implementation status information acquisition unit 217 acquires implementation status information based on the analysis of the moving image data by the image analysis unit 212 and the evaluation by the evaluation unit 213 (S333).
  • the document information generation unit 218 generates document information based on at least the implementation status information and the document template stored in the server device 20 (S334).
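The rank-and-comment lookup in step S325 can be sketched as a simple rule table. The condition field (`max_bar_tilt_deg`), the thresholds, and the rank labels below are assumptions for illustration, not the actual contents of the evaluation condition information storage unit 233.

```python
# Illustrative evaluation-condition lookup (cf. evaluation unit 213, step S325).
# Conditions are ordered from strictest to most lenient.
EVALUATION_CONDITIONS = [
    {"max_bar_tilt_deg": 3.0, "rank": "A", "comment": "The barbell stays level."},
    {"max_bar_tilt_deg": 8.0, "rank": "B", "comment": "The barbell tilts slightly."},
    {"max_bar_tilt_deg": 180.0, "rank": "C", "comment": "Keep the barbell level."},
]

def evaluate(bar_tilt_deg):
    """Return the rank and comment of the first condition that is satisfied."""
    for cond in EVALUATION_CONDITIONS:
        if bar_tilt_deg <= cond["max_bar_tilt_deg"]:
            return cond["rank"], cond["comment"]
    return "C", "Keep the barbell level."

rank, comment = evaluate(5.2)
print(rank, comment)
```

A real condition table would combine several measured quantities (positions, orientations, and movements of multiple parts) rather than a single tilt angle.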
  • with the care support device of the present embodiment, it is possible to easily evaluate physical exercise. In particular, for physical exercise related to sports, the positional relationship and movement of each part of the tool and each part of the body can be evaluated. In addition, since the care support device of the present embodiment also provides comments and advice, the user can easily grasp the current situation and improvement measures.
  • FIG. 16 is a diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • FIG. 16 illustrates a case where a moving image is captured in the weightlifting mode.
  • the screen 41 displays a mark 411 indicating the position of the barbell shaft.
  • the movement of the barbell shaft is indicated by line 412 .
  • FIG. 17 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • FIG. 17 illustrates a case where a moving image is captured in the weightlifting mode.
  • as a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, the inclination of the barbell shaft, the movement distance of the shaft, the movement speed (line 421), and the like are displayed.
  • the reference value (line 422) is displayed.
  • the actual measurement result (line 423) may be displayed, and further, the difference from the reference value may be displayed numerically or graphically. The user can consider the movement and posture that should be corrected by referring to this.
  • FIG. 18 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • FIG. 18 illustrates a case where moving images are captured in the weightlifting mode.
  • reference values, such as the angle of a body joint at the lowest point of the barbell, are displayed together with the evaluation result (line 432).
  • the difference from the reference value may be displayed numerically, or may be displayed in a graph or the like.
  • the evaluation result based on the relationship between the body and the tool may be displayed. The user can consider the movement and posture that should be corrected by referring to this.
  • FIG. 19 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • FIG. 19 illustrates a case where a moving image is captured in the weightlifting mode.
  • line 441 shows a bone display formed by connecting predetermined positions of the tool parts and the body parts identified from the image. Note that the bones may be displayed superimposed on the captured image.
  • a line 442 represents the acceleration of each part of the body.
  • a count result such as how many times the weight has been lifted may be displayed.
  • the evaluation result and the next training result in accordance with the purpose or the like may be indicated.
  • FIG. 20 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • FIG. 20 illustrates a case where a moving image is captured in the weightlifting mode.
  • an evaluation (line 441) obtained by comparison with various reference values is displayed.
  • in the present embodiment, the server device 20 analyzes the image, but the present invention is not limited to this; the user terminal 10 may analyze the image and specify the positional relationship between each part of the tool and each part of the body.
  • in the present embodiment, the positions of tool parts and body parts are positions on a two-dimensional image, but they are not limited to this and may be three-dimensional positions.
  • the three-dimensional position of a tool part or body part can be specified based on the image from the camera 106 and a depth map from a depth camera.
  • the three-dimensional position may also be specified by estimating it from the two-dimensional image. It is also possible to provide a depth camera in place of the camera 106 and specify the three-dimensional position from the depth map alone. In this case, the depth map is transmitted from the user terminal 10 to the server device 20 together with, or instead of, the image data, and the image analysis unit 212 of the server device 20 analyzes the three-dimensional position.
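Combining a 2-D keypoint with a depth map as described above amounts to back-projecting the pixel through the camera model. The sketch below assumes a simple pinhole camera with known intrinsics (`fx`, `fy`, `cx`, `cy`); the toy depth map and intrinsic values are illustrative only.

```python
# Sketch: recover a 3-D position from a 2-D keypoint (u, v) and a depth map
# using the pinhole-camera model.
def to_3d(u, v, depth_map, fx, fy, cx, cy):
    z = depth_map[v][u]      # depth (metres) at the keypoint pixel
    x = (u - cx) * z / fx    # back-project through the intrinsics
    y = (v - cy) * z / fy
    return (x, y, z)

# 4x4 toy depth map with a constant depth of 2 m.
depth = [[2.0] * 4 for _ in range(4)]
p = to_3d(2, 1, depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(p)
```

In practice the depth map would come from the depth camera mentioned above, and the intrinsics from a camera calibration step.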
  • an image of the user's body during exercise using a tool is transmitted from the user terminal 10 to the server device 20.
  • a feature amount may be extracted from the image, and the feature amount may be transmitted to the server device 20.
  • alternatively, the user terminal 10 may estimate the tool parts or body parts based on the feature amounts, acquire the absolute position of each part (on the XY coordinates of the image, as an actual distance from a reference position such as the ground, the tip of the foot, the head, or the body's center of gravity, or in any other coordinate system) or the relative positional relationships among multiple tool parts, among multiple body parts, and between tool parts and body parts, and transmit these absolute positions and relative positional relationships to the server device 20.
  • content prepared on the server device 20 side is provided as the improvement measure information, but this is not restrictive.
  • Marks and bones representing movements and postures may be superimposed and displayed on moving images or still images extracted from moving images. This makes it possible to easily grasp what kind of movement and posture should be taken.
  • in the present embodiment, evaluation is performed on the position and orientation of the tool parts and the position or movement (change in position over time) of the body parts, but evaluation is not limited to this; the position of a tool worn by the user may also be specified and evaluated.
  • for example, the server device 20 stores the tool and a reference value for its size (length, etc.) in association with the user's physical information (height, weight, etc.), identifies the shape of the tool in use by extracting its feature amounts, estimates the size of the tool from the identified shape and the user's size (height, etc.) included in the physical information, and, if the difference between the estimated size and the reference value is greater than or equal to a predetermined threshold, recommends a tool having the reference size.
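The tool-size check above can be sketched as follows. The height-to-length reference table, the 0.05 m threshold, and the helper names are assumptions made for the sketch, not values stored by the actual server device 20.

```python
# Sketch of the tool-size check: estimate the tool length from the image and
# the user's height, and recommend the reference size when the gap is large.
REFERENCE_LENGTH_BY_HEIGHT = [
    # (minimum user height in m, recommended tool length in m)
    (0.0, 1.6),
    (1.6, 1.8),
    (1.8, 2.0),
]
THRESHOLD_M = 0.05  # assumed threshold for "too different"

def reference_length(height_m):
    """Look up the reference tool length for a user of this height."""
    ref = REFERENCE_LENGTH_BY_HEIGHT[0][1]
    for min_h, length in REFERENCE_LENGTH_BY_HEIGHT:
        if height_m >= min_h:
            ref = length
    return ref

def recommend(estimated_len_m, height_m):
    """Return a recommendation string, or None if the tool size is fine."""
    ref = reference_length(height_m)
    if abs(estimated_len_m - ref) >= THRESHOLD_M:
        return f"Recommend a tool of length {ref} m"
    return None

print(recommend(1.7, height_m=1.75))  # estimated 1.7 m vs reference 1.8 m
```

The estimated length itself would come from the image analysis (tool shape scaled by the user's known height), which is outside the scope of this sketch.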
  • content such as advice is provided as an improvement measure, but for example, the user may be suspended from exercising.
  • the server device 20 stores a reference value at which physical exercise should be interrupted in association with the user's physical information (purpose, height, weight, etc.).
  • when the number or speed of repetitions reaches the reference value (for example, the barbell is lifted extremely slowly, or too many repetitions are performed at one time), the user is prompted to discontinue the physical exercise.
  • as the method of prompting discontinuation, a comment asking the user to stop may be issued to the user terminal 10, the user may be notified by changing the display (for example, turning off the screen), an alert sound may be emitted, or the user may be notified by vibration.
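The interruption decision above can be sketched as a threshold check. The two reference values and the notification text are assumptions for illustration; in the device they would be stored in association with the user's physical information.

```python
# Sketch of the interruption check against stored reference values.
MIN_LIFT_SPEED_MPS = 0.15  # below this the lift is considered extremely slow
MAX_REPS_PER_SET = 30      # above this the set is considered too long

def should_interrupt(lift_speed_mps, reps):
    """True when measured speed or repetition count crosses a reference value."""
    return lift_speed_mps < MIN_LIFT_SPEED_MPS or reps > MAX_REPS_PER_SET

def notify(lift_speed_mps, reps):
    """Return the message to show on the user terminal 10, or None."""
    if should_interrupt(lift_speed_mps, reps):
        # In the device this could instead blank the screen, play an alert
        # sound, or vibrate the user terminal 10.
        return "Please stop exercising and rest."
    return None

print(notify(0.10, reps=12))
```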
  • content such as advice is provided for improvement measures, but for example, determination of illness and physical exercise for improvement may be presented.
  • for example, the server device 20 extracts candidates for diseases the user is assumed to have developed from the symptoms entered by the user in the physical information and from the evaluation information, and presents a screening test for narrowing down the disease. After the user has taken the screening test and the disease name has been narrowed down, the service recommends that the user receive a medical examination, and also recommends physical exercise for improvement, tools for that exercise, and items such as meals.
  • the server device 20 can estimate the speed, acceleration, movement distance, trajectory, etc. of the tool.
  • the server device 20 can estimate the number of patterns as the number of actions using the tool by extracting patterns of changes in the position of the tool in time series.
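Counting actions from the time-series pattern of the tool position, as described above, can be sketched as counting local peaks in the barbell's vertical position. The `min_rise` parameter (the minimum rise above the preceding trough for a movement to count as a lift) is an assumption for the sketch.

```python
# Sketch of rep counting: count peaks in the time series of the barbell's
# vertical position (metres), one rep per rise above the preceding trough.
def count_reps(heights, min_rise=0.05):
    """Count rises of at least `min_rise` above the most recent trough."""
    reps = 0
    trough = heights[0]
    rising = False
    for prev, cur in zip(heights, heights[1:]):
        if cur < trough:
            trough = cur
        if cur > prev and cur - trough >= min_rise:
            if not rising:       # start of a new lift
                reps += 1
                rising = True
        elif cur < prev:         # descending: reset for the next lift
            rising = False
            trough = cur
    return reps

# Two lifts: up to 0.5 m, down, up again, down.
series = [0.0, 0.2, 0.5, 0.3, 0.1, 0.3, 0.5, 0.2]
print(count_reps(series))
```

A production implementation would smooth the trajectory first; raw per-frame positions from image analysis are noisy.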
  • the server device 20 may store issues to be addressed, instead of evaluation comments, in association with one or a series of postures or movements, and output those issues.
  • the exercise is evaluated, but the present invention is not limited to this.
  • Contents for improving physical exercise may be presented according to the purpose, such as performance, or preparatory stages such as stretching, strength training, and posture.
  • in this case, it is sufficient for the server device 20 to store training content or the like, instead of evaluation comments, in association with one or a series of movements of a tool part, orientations of a tool part, body postures, or movements of a body part, and to output that content.
  • the exercise is evaluated, but it is not limited to this, and it is also possible to automatically detect the action performed by the user.
  • for example, the server device 20 stores, as reference information, the positions of the tool parts and the body posture (positions of each body part) for a predetermined action such as a shot or a pass. By comparing the positions of the tool parts and body parts analyzed from the image with the reference information, the action performed by the user in the image can be identified.
  • in the present embodiment, an image captured in the past is analyzed to evaluate the motion, but the present invention is not limited to this; the analysis processing may be performed in real time so that, when a predetermined motion is detected, the next posture to take, the parts and directions to be moved, the number of repetitions, and the like are recommended. In this case, correct postures and movements are stored, instead of evaluation comments, in association with postures or movements; the correct posture is computed in real time, the difference between the current posture and the correct posture is calculated, and the actions needed to close that difference are output.
  • the improvement measure information is obtained by extracting the improvement measures stored in the improvement measure information storage unit 234 based on the mode, purpose, physical information, or the evaluation result produced by the evaluation unit 213, and is presented to the user terminal 10.
  • the evaluation unit 213 evaluates images of training performed a plurality of times at different times and determines whether the evaluation result has improved.
  • the improvement measure information transmission unit 216 may present to the user terminal 10 a training menu different from the previous time or an increase/decrease in the number of times.
  • the improvement measure information storage unit 234 compares the result of the evaluation unit 213 evaluating the user's physical exercise based on the first image, which includes the physical exercise performed by the user, with the result of evaluating the user's physical exercise based on the second image, which includes physical exercise performed at a different, later time. It determines that the exercise has improved when the evaluation based on the second image is closer to the reference value than the evaluation based on the first image, and that there is no improvement when the evaluation based on the second image is farther from the reference value than the evaluation based on the first image.
  • the improvement measure information storage unit 234 determines that there is no change when the two evaluations are equally close to the reference value.
  • the improvement measure information storage unit 234 may re-extract improvement measures based on the mode, purpose, physical information, the evaluation results obtained by the evaluation unit 213, and the determination result (improvement, no improvement, or no change), and present them on the user terminal 10.
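The improvement judgment and menu re-selection described above can be sketched as follows. The numeric reference value, the menu fields, and the rule of adjusting repetitions by 2 are assumptions made for illustration only.

```python
# Sketch of the improvement judgment: compare how far the first and second
# evaluations are from a reference value, then adjust the training menu.
REFERENCE = 100.0  # assumed reference evaluation score

def judge(first_eval, second_eval):
    """'improved' / 'no improvement' / 'no change' per distance to reference."""
    d1 = abs(first_eval - REFERENCE)
    d2 = abs(second_eval - REFERENCE)
    if d2 < d1:
        return "improved"
    if d2 > d1:
        return "no improvement"
    return "no change"

def next_menu(current_menu, judgement):
    """Re-extract the menu, e.g. raise the repetition count on improvement."""
    menu = dict(current_menu)
    if judgement == "improved":
        menu["reps"] += 2
    elif judgement == "no improvement":
        menu["reps"] = max(1, menu["reps"] - 2)
    return menu

j = judge(80.0, 90.0)
print(j, next_menu({"name": "squat", "reps": 10}, j))
```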
  • in the present embodiment, the user terminal 10 or the server device 20 executes the predetermined functions and stores the information, but the functional units and the storage units may be provided in separate devices.


Abstract

[Solution] A care support device for supporting caregiving, the care support device being characterized by having an evaluation unit for evaluating a physical exercise on the basis of first image information that includes a physical exercise by a user, an improvement measure information transmission unit for selecting a suitable training menu from an evaluation of the physical exercise and presenting the training menu to a support person, and a document information generating unit for generating document information on the basis of at least the status of implementation of the training menu by the user.

Description

Care support device, care support program, and care support method

The present disclosure relates to a care support device, a care support program, and a care support method.

Technology for supporting nursing care services is known.

Japanese Patent Application No. 2003-185906

The above-mentioned technology aims to present a suitable rehabilitation menu based on information on the equipment and personnel of nursing care facilities and on the care level, dementia level, and so on of the user receiving care. In reality, however, users' conditions differ greatly from individual to individual, and it is difficult to present an appropriate rehabilitation menu to each user from the above information alone.

The present disclosure has been made in view of the above problems, and its purpose is to provide a technology that can easily and accurately analyze the user's body movements and support nursing care work.

According to the present disclosure, there is provided a care support device that supports nursing care work, comprising: an evaluation unit that evaluates physical exercise based on first image information including the user's physical exercise; an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents it to a supporter; and a document information generation unit that generates document information based on at least the state of implementation of the training menu by the user.

According to the present disclosure, it is possible to support nursing care work.

Other problems disclosed by the present application and their solutions will be clarified in the description of the embodiments and the drawings.
FIG. 1 is a diagram showing an example of the overall configuration of the care support system according to this embodiment.
FIG. 2 is a diagram showing an example of the hardware configuration of the user terminal 10.
FIG. 3 is a diagram showing an example of the software configuration of the user terminal 10.
FIG. 4 is a diagram showing a configuration example of the physical information stored in the physical information storage unit 130.
FIG. 5 is a diagram showing a configuration example of the evaluation request transmitted by the evaluation request transmission unit 112 to the server device 20.
FIG. 6 is a diagram showing a configuration example of the evaluation information received by the evaluation information receiving unit 113 from the server device 20.
FIG. 7 is a diagram showing a configuration example of the improvement measure request transmitted by the improvement measure request transmission unit 116 to the server device 20.
FIG. 8 is a diagram showing a configuration example of improvement measure information.
FIG. 9 is a diagram showing an example of the hardware configuration of the server device 20.
FIG. 10 is a diagram showing an example of the software configuration of the server device 20.
FIG. 11 is a diagram showing a configuration example of the image information stored in the image data storage unit 231.
FIG. 12 is a diagram showing a configuration example of the reference information stored in the reference information storage unit 232.
FIG. 13 is a diagram showing a configuration example of the evaluation condition information stored in the evaluation condition information storage unit 233.
FIG. 14 is a diagram showing a configuration example of the improvement measure information stored in the improvement measure information storage unit 234.
FIG. 15 is a diagram showing the flow of processing executed by the care support device of this embodiment.
FIGS. 16 to 20 are diagrams each showing an example of a video evaluation screen in the weightlifting mode.
The contents of the embodiments of the present invention will be listed and explained. A care support device according to an embodiment of the present invention has the following configuration.
[Item 1]
A care support device that supports nursing care work, comprising:
an evaluation unit that evaluates physical exercise based on first image information including a user's physical exercise;
an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents it to a supporter; and
a document information generation unit that generates document information based on at least the state of implementation of the training menu by the user.
[Item 2]
The care support device according to item 1, further comprising:
an implementation status information acquisition unit that acquires implementation status information of the training menu based on at least second image information of the user,
wherein the document information generation unit generates the document information based on at least the implementation status information.
[Item 3]
The care support device according to item 1 or 2, wherein the improvement measure information transmission unit reselects the training menu based on the first image information and the second image information.
[Item 4]
A care support program that supports nursing care work, the program causing a processor to execute:
an evaluation step of evaluating physical exercise based on first image information including a user's physical exercise;
an improvement measure information transmission step of selecting a suitable training menu from the evaluation of the physical exercise and presenting it to a supporter; and
a document information generation step of generating document information based on at least the state of implementation of the training menu by the user.
[Item 5]
A care support method for supporting nursing care work, wherein a processor performs:
an evaluation step of evaluating physical exercise based on first image information including a user's physical exercise;
an improvement measure information transmission step of selecting a suitable training menu from the evaluation of the physical exercise and presenting it to a supporter; and
a document information generation step of generating document information based on at least the state of implementation of the training menu by the user.
A care support device according to an embodiment of the present invention evaluates the physical exercise of a care service user (hereinafter referred to as a user) and presents a suitable training menu, such as training or rehabilitation.

The care support device of the present embodiment works, for example, from an image of the user performing physical exercise (the image may be a still image or a moving image; in the present embodiment it is assumed to be a moving image). Physical exercise includes exercise performed using tools, and also exercise performed while receiving support from a supporter. The care support device of the present embodiment identifies body parts and evaluates their movement based on, for example, the relative positional relationships of those body parts. The care support device may also identify a tool and its parts from an image of the user exercising with the tool, and evaluate the body movement based on the absolute positions of those parts, the relative positional relationships among different parts, and the relative positional relationships between tool parts and body parts.

FIG. 1 is a diagram showing an example of the overall configuration of the care support device according to this embodiment. As shown in the figure, the care support device of this embodiment includes a user terminal 10 and a server device 20, and may further include an imaging terminal 30. The user terminal 10, the server device 20, and the imaging terminal 30 are connected to each other via a communication network 40 so as to be able to communicate with each other. The communication network 40 is, for example, the Internet or a LAN (Local Area Network), and is constructed from a public telephone network, a dedicated telephone network, a mobile telephone network, Ethernet (registered trademark), wireless communication paths, and the like.
== User terminal 10 ==
The user terminal 10 is a computer operated by a user who performs physical exercise or by a supporter of that user. Supporters include not only the user's family members but also trainers, physical therapists, caregivers, and other people who play a role in guiding, instructing, explaining to, and supporting users who exercise. The user terminal 10 is, for example, a smartphone, tablet computer, or personal computer. It is equipped with an imaging device such as a camera, with which it can capture images of the user's body during exercise. In this embodiment, it is assumed that a moving image of the user's body during exercise is transmitted from the user terminal 10 to the server device 20. Note that the user terminal 10 may also serve as the imaging terminal 30, and although only one user terminal 10 is shown in FIG. 1, it goes without saying that there may be more than one. The user terminal 10 may also be a sensor worn by the user (a wearable sensor in the form of clothing or footwear, or a sensor attached to clothing or a part of the body); in addition to measuring body movement with an acceleration sensor, it may sense activity level, conversation amount, sleep time, pulse, UV exposure, pulse-to-pulse interval (PPI), skin temperature, heart rate, and the like, and transmit those data to the server device 20 via the communication network 40.
== Server device 20 ==
The server device 20 is a computer that evaluates physical exercise. The server device 20 is, for example, a workstation, a personal computer, or a virtual computer logically realized by cloud computing. The server device 20 receives the moving image captured by the user terminal 10 or the imaging terminal 30, analyzes the received moving image, and evaluates the physical exercise. The server device 20 also makes proposals for improvement measures for the physical exercise. The details of the evaluation of physical exercise and the proposal of improvement measures will be described later.
== Imaging terminal 30 ==
The imaging terminal 30 is a computer, or a camera with a communication function, that captures images of the user and is installed or mounted at the place where the user performs physical exercise. The imaging terminal 30 is, for example, a smartphone, tablet computer, or personal computer. It is equipped with an imaging device such as a camera, with which it can capture images of the user's body during physical exercise. In the present embodiment, it is assumed that a moving image of the user's body during exercise is transmitted from the imaging terminal 30 to the server device 20. The image data of the user stored in the imaging terminal 30 may be input to the server device 20 directly by the user, the user's supporter, or a business operator that runs a business using the care support device, or may be input via the communication network 40.
== Supporter terminal 50 ==
The supporter terminal 50 is a computer operated by a person (supporter), such as a trainer, physical therapist, or caregiver, whose role is to provide guidance, instruction, explanation, support, and the like to users who perform physical exercise. The supporter terminal 50 is, for example, a smartphone, a tablet computer, or a personal computer.
 The care support device according to the present embodiment analyzes an image of the user's physical exercise; the image may be one captured by the user terminal 10 or by a separately prepared imaging terminal 30. In the following description, each processing unit processes, and each storage unit stores, an image captured by the user terminal 10, but the image may instead be one captured by the imaging terminal 30.
 In the care support device according to the present embodiment, the supporter terminal 50 may have functions equivalent to those of the user terminal 10 and may, on behalf of the user, capture images of the user's physical exercise, request improvement measures, and so on. The supporter may also use those functions of the supporter terminal 50 to present evaluation results and improvement measures to the user. Furthermore, all the processing that the server device 20 performs for the user terminal 10 may be performed for the supporter terminal 50, and all the processing that the user terminal 10 performs toward the server device 20 may be performed by the supporter terminal 50.
 FIG. 2 is a diagram showing a hardware configuration example of the user terminal 10. The user terminal 10 includes a CPU 101, a memory 102, a storage device 103, a communication interface 104, a touch panel display 105, and a camera 106. The storage device 103 stores various data and programs and is, for example, a hard disk drive, a solid state drive, or a flash memory. The communication interface 104 is an interface for connecting to the communication network 40, such as an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, or a USB (Universal Serial Bus) connector or RS-232C connector for serial communication. The touch panel display 105 is a device for inputting and outputting data. The user terminal 10 may further include input devices such as a keyboard, a mouse, buttons, and a microphone, and output devices such as a speaker and a printer.
 FIG. 3 is a diagram showing a software configuration example of the user terminal 10. The user terminal 10 includes an imaging unit 111, an evaluation request transmission unit 112, an evaluation information reception unit 113, an evaluation display unit 114, a checkpoint display unit 115, an improvement measure request transmission unit 116, an improvement measure information reception unit 117, and an improvement measure information display unit 118 as functional units, and a physical information storage unit 130, an image storage unit 131, an evaluation information storage unit 132, and an improvement measure storage unit 133 as storage units.
 Each of the above functional units is implemented by the CPU 101 of the user terminal 10 reading a program stored in the storage device 103 into the memory 102 and executing it, and each of the above storage units is implemented as part of the storage areas provided by the memory 102 and the storage device 103 of the user terminal 10.
 The imaging unit 111 captures images, including moving images, of the user during physical exercise. By controlling the camera 106, the imaging unit 111 can acquire a moving image of the user's physical exercise. The user or the user's supporter may place the user terminal 10 on a flat surface or against a wall, point the optical axis of the camera 106 at the place where the user exercises, and instruct the start of video recording; in response, the imaging unit 111 operates the camera 106 to acquire the moving image. The imaging unit 111 stores the acquired moving image in the image storage unit 131.
 The image storage unit 131 stores the images captured by the imaging unit 111. In the present embodiment the images are moving images, but they are not limited to these. The image storage unit 131 can store moving images, for example, as files.
 The imaging unit 111 captures images of the user's physical exercise, either at the time of the initial evaluation or at the time the user performs a training menu presented by a function of the server device 20 described later. In either case, the images captured by the imaging unit 111 are stored in the image storage unit 131.
 The physical information storage unit 130 stores information about the user's body, physical abilities, factors affecting training effectiveness, and the like (hereinafter referred to as physical information). FIG. 4 is a diagram showing a configuration example of the physical information stored in the physical information storage unit 130. As shown in the figure, the physical information includes, but is not limited to, height, weight, sex, dominant hand, arm length, leg length, hand size, finger length, grip strength, muscle strength, flexibility, shoulder strength, genome, epigenome, genetic polymorphisms, intestinal flora, diet, joint range of motion, medical history, diagnosed diseases, current medical condition, mental and physical state, required-support level, required-care level, dementia level, ADL, IADL, motor function, daily living function, and cognitive function.
 The evaluation request transmission unit 112 transmits to the server device 20 a request (hereinafter referred to as an evaluation request) to evaluate the physical exercise based on the images captured by the imaging unit 111.
 FIG. 5 is a diagram showing a configuration example of the evaluation request that the evaluation request transmission unit 112 transmits to the server device 20. As shown in the figure, the evaluation request includes a user ID, a mode, physical information, and image data. The user ID is information that identifies the user. The mode is information indicating the exercise performed by the user, and can be, for example, "strength training", "joint range of motion training", "gait rehabilitation", or "improvement of frailty (improvement of a specific symptom or medical condition)". The mode is selected from predetermined options. The physical information is the physical information stored in the physical information storage unit 130. The image data is data of the moving image acquired by the imaging unit 111.
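As an illustrative sketch only (the field and function names below are our own; FIG. 5 merely enumerates the items), the evaluation request could be modeled as a simple record whose mode is restricted to the predetermined options:

```python
from dataclasses import dataclass

# Hypothetical option list; the specification names these as examples
# of modes but does not define an exhaustive set.
ALLOWED_MODES = {
    "strength training",
    "joint range of motion training",
    "gait rehabilitation",
    "improvement of frailty",
}

@dataclass
class EvaluationRequest:
    user_id: str        # information identifying the user
    mode: str           # exercise selected from predetermined options
    physical_info: dict # contents of the physical information storage unit 130
    image_data: bytes   # moving image acquired by the imaging unit 111

def make_request(user_id, mode, physical_info, image_data):
    # The mode must be one of the predetermined options.
    if mode not in ALLOWED_MODES:
        raise ValueError(f"unknown mode: {mode}")
    return EvaluationRequest(user_id, mode, physical_info, image_data)
```

The actual wire format (JSON, binary, etc.) is not specified; the record above only mirrors the listed items.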
 The evaluation information reception unit 113 receives information on the evaluation of physical exercise using a tool (hereinafter referred to as evaluation information), which the server device 20 returns in response to the evaluation request. The evaluation information reception unit 113 registers the received evaluation information in the evaluation information storage unit 132.
 FIG. 6 is a diagram showing a configuration example of the evaluation information that the evaluation information reception unit 113 receives from the server device 20. As shown in the figure, the evaluation information includes a mode, a user ID, tool position information, body part position information, posture information, motion information, and checkpoint information.
 The user ID and the mode are those included in the evaluation request. They indicate that the captured image shows the user's body performing the exercise indicated by the mode.
 The tool position information indicates the position in the image of each part of the tool (for example, for a bat used in baseball: the whole bat, both ends, the grip, the spot where the ball made contact, the center of gravity, or any other part; for a barbell: the whole barbell, the shaft, the plates (the weights), the gripped portions, the center of the shaft, the center of gravity, or any other part; the whole tool is also included). The tool position information contains each part of the tool and the position of that part, associated with a point on the time axis of the moving image. Based on the tool position information, the movement of the tool and its relationship to body parts can be displayed; for example, a figure indicating a part (such as a circle) can be displayed superimposed on the image at the position indicated by the tool position information. The positions of multiple parts may be included for a single time point. For a portion connecting two parts (for example, the midpoint between the spots where the barbell is gripped by the right and left hands), tool position information need not be included; in that case, such a connecting portion can be represented by drawing a line between a pair of marks (such as circles) indicating the two predetermined parts.
The tool position information may be included for every frame of the moving image, for each key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for random time points. When position information is not included for every frame, the figures can be displayed based on the position information of the most recent past time point.
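The fallback described above, using the most recent past sample when a frame carries no position information of its own, can be sketched as follows (a minimal illustration, assuming time-sorted samples; not code from the specification):

```python
import bisect

def position_at(samples, t):
    """Return the position for time t, falling back to the most recent
    past sample, as described for frames without their own data.

    samples: list of (time, position) pairs sorted by time.
    """
    times = [s[0] for s in samples]
    i = bisect.bisect_right(times, t)  # count of samples with time <= t
    if i == 0:
        return None                    # no past sample exists yet
    return samples[i - 1][1]
```

The same lookup would apply equally to body position information and relationship information, which use the identical fallback rule.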
 The body position information indicates the position in the image of each part of the body (for example, the head, shoulders, elbows, waist, knees, and ankles). The body position information contains a body part and the position of that part, associated with a point on the time axis of the moving image. Based on the body position information, the state of the body's skeleton ("bones") can be displayed; for example, a figure indicating a part (such as a circle) can be displayed superimposed on the image at the position indicated by the body position information. The positions of multiple parts may be included for a single time point. For a part connecting two other parts (for example, the forearm connecting the wrist and elbow, or the thigh connecting the waist and knee), position information need not be included; in that case, such a connecting part can be represented by drawing a line between a pair of marks (such as circles) indicating the two predetermined parts. The position information may be included for every frame of the moving image, for each key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for random time points. When position information is not included for every frame, the bones can be displayed based on the position information of the most recent past time point.
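A minimal sketch of how the "bones" could be derived from the body position information: detected keypoints give the circles, and predetermined pairs of parts give the connecting lines. The part names and pairs below are illustrative assumptions, not taken from the specification.

```python
# Hypothetical pairs of parts whose marks are joined by a line
# (e.g. wrist-elbow for the forearm, hip-knee for the thigh).
BONE_PAIRS = [
    ("shoulder", "elbow"),
    ("elbow", "wrist"),
    ("hip", "knee"),
    ("knee", "ankle"),
]

def bone_segments(keypoints):
    """keypoints: {part_name: (x, y)} for one time point.
    Returns the line segments to draw; a pair is skipped when either
    endpoint was not detected at this time point."""
    return [
        (keypoints[a], keypoints[b])
        for a, b in BONE_PAIRS
        if a in keypoints and b in keypoints
    ]
```

A renderer would then draw a circle at each keypoint and a line for each returned segment, superimposed on the video frame.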
 The tool orientation information is information about the orientation of the tool used by the user, the direction in which a part of the tool is facing, and the like. The tool orientation information contains the part of the tool to be evaluated, a tool orientation value, an evaluation rank, an evaluation comment, and the like, associated with a point on the time axis of the moving image. The tool orientation value is a value representing the orientation of the tool, such as the distance from the ground to a certain part of the tool, the distance between two parts of the tool, the angle of a part (for example, the angle formed by a straight line from a first part of the tool to the part where the user grips the tool and a straight line from a second part of the tool to that gripped part), or the movement of a certain part of the tool. The evaluation rank is a value expressing the evaluation as a rank, for example on a five-level scale of 1 to 5 or as A, B, C. The evaluation comment is a comment on the evaluation. For example, when the mode is "upright row" and the distances from the ground to the right and left ends of the barbell differ, an evaluation comment such as "Different forces are being applied to the left and right" may be included.
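Two of the tool orientation values named above can be sketched on image coordinates (y axis pointing down, as is usual for images; the ground line and barbell-end keypoints are assumed inputs, not items defined by the specification):

```python
import math

def height_above_ground(part_y, ground_y):
    """Distance from the ground line to a tool part, in pixels.
    Smaller part_y means higher in the image, hence the subtraction order."""
    return ground_y - part_y

def tilt_degrees(left_end, right_end):
    """Angle of the line through two tool parts (e.g. barbell ends);
    0 degrees means the ends are level, as in the upright-row example."""
    dx = right_end[0] - left_end[0]
    dy = right_end[1] - left_end[1]
    return math.degrees(math.atan2(dy, dx))
```

In the "upright row" example above, unequal `height_above_ground` values for the two barbell ends (equivalently, a nonzero tilt) would trigger the left/right imbalance comment.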
 The tool motion information is information about the movement of the tool used by the user. The tool motion information contains the part of the tool to be evaluated, a list of the aforementioned tool orientation values, an evaluation rank, and an evaluation comment, associated with a period on the time axis of the moving image. The list of tool orientation values is a time series of tool orientation values within the period. The evaluation comment is a comment on the evaluation of the tool's movement. For example, when the mode is "upright row" and the up-and-down movement of the barbell is insufficient, an evaluation comment such as "The barbell is not being lifted high enough" may be included.
 The posture information is information about the posture of the user's body. The posture information contains the body part to be evaluated, a posture value, an evaluation rank, and an evaluation comment, associated with a point on the time axis of the moving image. The posture value is a value representing the posture, such as the distance from the ground to a part, the distance between two parts, or the angle of a joint part (the angle formed by a straight line from a first end part to the joint part and a straight line from a second end part to the joint part). The evaluation rank is a value expressing the evaluation as a rank, for example on a five-level scale of 1 to 5 or as A, B, C. The evaluation comment is a comment on the evaluation of the posture. For example, when the mode is "lifting" and the flexion is insufficient, an evaluation comment such as "The hips are not lowered" may be included. (Flexibility such as joint range of motion, expressed numerically as angles and distances, also affects the advice.)
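The joint-angle posture value described above, the angle formed at a joint by the straight lines to two end parts, can be sketched from 2D image coordinates (an illustration only; the specification does not prescribe a formula):

```python
import math

def joint_angle(end1, joint, end2):
    """Angle in degrees at `joint` between the line to `end1` and the
    line to `end2` (e.g. the knee angle from the hip and ankle positions)."""
    v1 = (end1[0] - joint[0], end1[1] - joint[1])
    v2 = (end2[0] - joint[0], end2[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp rounding error
    return math.degrees(math.acos(cos_a))
```

A fully extended limb gives roughly 180 degrees; in the "lifting" example, an insufficiently small hip or knee angle would trigger the "hips are not lowered" comment.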
 The body motion information is information about the movement of the user's body. The body motion information contains the body part to be evaluated, a list of posture values, an evaluation rank, and an evaluation comment, associated with a period on the time axis of the moving image. The list of posture values is a time series of posture values within the period. The evaluation comment is a comment on the evaluation of the movement. For example, when the mode is "lifting" and the extension of the knee is not smooth, an evaluation comment such as "The knee movement is not smooth" may be included.
 The relationship information indicates one or more positional relationships between the tool and the body. The relationship information contains, associated with a point on the time axis of the moving image, the relationship between information such as the position, orientation, and movement of a part of the tool and information such as the position, posture, and movement of a part of the body. For example, based on the tool position information and the body position information, the positional relationship between the two can be displayed: a figure indicating a tool part (such as a circle) can be superimposed on the image at the position indicated by the tool position information, and a figure indicating a body part (such as a circle) can be displayed at the position indicated by the body position information. The positions of multiple tool parts and body parts may be included for a single time point. For a span connecting a tool part and a body part (for example, between the tip of the bat and the center of the part where the user grips the bat), position information need not be included; in that case, such a span can be represented by drawing a line between a pair of figures (such as circles) indicating the two predetermined parts.
The relationship information may be included for every frame of the moving image, for each key frame, for each checkpoint (described in detail later), for every arbitrary number of frames, or for random time points. When position information is not included for every frame, the figures indicating the parts and the lines connecting them can be displayed based on the position information of the most recent past time point.
 The checkpoint information is information indicating the points (hereinafter referred to as checkpoints) in the movement of the tool used by the user, or in the sequence of the user's body movements, at which the orientation of the tool, the posture of the body, and the like should be checked. For example, when the mode is "weightlifting", checkpoints include the moment the barbell reaches its highest position, the moment it reaches its lowest position, and the moment it is lifted off. When the mode is "pitching", they include the moment the leg is raised, the moment the raised leg is lowered and the weight is shifted, and the moment the ball is released. The checkpoint information stores information identifying each checkpoint (hereinafter referred to as a checkpoint ID) in association with a point on the time axis of the moving image. The frame (still image) in the moving image in which the checkpoint indicated by a checkpoint ID appears can thus be identified.
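The mapping from a checkpoint's time point to a still frame can be sketched as follows, assuming a known, constant frame rate (the fps value and checkpoint IDs below are illustrative assumptions, not values from the specification):

```python
def frame_index(time_sec, fps=30.0):
    """Frame number on the video's time axis for a given time point."""
    return int(round(time_sec * fps))

def checkpoint_frames(checkpoint_info, fps=30.0):
    """checkpoint_info: list of (time_sec, checkpoint_id) pairs.
    Returns {checkpoint_id: frame_index}, i.e. which still image
    to extract for each checkpoint."""
    return {cp_id: frame_index(t, fps) for t, cp_id in checkpoint_info}
```

The checkpoint display unit 115 described below would then read the frame at that index from the stored moving image and show it as a still image.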
 The evaluation display unit 114 displays the evaluation information. For example, based on the tool position, tool orientation, and tool motion information and the body position, posture, and body motion information included in the evaluation information, the evaluation display unit 114 can superimpose on the moving image figures representing tool parts and body parts (for example, circles representing the ends of the tool or the body's center of gravity, and lines connecting them), thereby overlaying the movements of the tool parts and body parts on the moving image. The evaluation display unit 114 can also graph chronological changes in, for example, the positions of tool parts, the orientation of the tool, the movement of the tool, the positions of body parts, the posture, and the movement of the body.
 The evaluation display unit 114 can also display the evaluation rank and the evaluation comment along with the moving image, based on the tool orientation information and the tool motion information included in the evaluation information. For example, when the playback time of the moving image comes near the time point included in the tool orientation information (within an arbitrary margin, such as about 5 seconds before or after), the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the tool orientation information. When the playback time of the moving image falls within the period included in the tool motion information, the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the tool motion information. The evaluation display unit 114 can also display the posture value included in the posture information, and can graph the chronological change of the tool orientation values based on the list of tool orientation values included in the tool motion information.
 The evaluation display unit 114 can likewise display the evaluation rank and the evaluation comment along with the moving image, based on the posture information and the motion information included in the evaluation information. For example, when the playback time of the moving image comes near the time point included in the posture information (within an arbitrary margin, such as about 5 seconds before or after), the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the posture information. When the playback time of the moving image falls within the period included in the motion information, the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the motion information. The evaluation display unit 114 can also display the posture value included in the posture information, and can graph the chronological change of the posture values based on the list of posture values included in the motion information. These evaluation ranks and evaluation comments may be displayed together with the evaluation ranks and evaluation comments displayed along with the moving image based on the tool orientation information and the tool motion information described in the preceding paragraph.
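The timing check underlying the displays described above, showing a rank and comment when playback comes within a margin of the stored time point, can be sketched as a small helper (names and the 5-second default are our own, mirroring the example margin in the text):

```python
def comments_to_show(playback_t, entries, window=5.0):
    """Return the (rank, comment) pairs due at the current playback time.

    entries: list of (time_point, rank, comment) tuples taken from the
    posture or tool orientation information.
    window: margin in seconds before/after the stored time point.
    """
    return [
        (rank, comment)
        for t, rank, comment in entries
        if abs(playback_t - t) <= window
    ]
```

For period-based items (tool motion and body motion information), the condition would instead test whether the playback time lies inside the stored period.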
 The checkpoint display unit 115 can extract checkpoint images from the moving image and display them. The checkpoint display unit 115 can read, from the moving image data stored in the image storage unit 131, the frame corresponding to a time point included in the checkpoint information and display it as a still image. The checkpoint display unit 115 may also, for example, extract and display only the body portion of the read frame.
 The improvement measure request transmission unit 116 transmits to the server device 20 a request for obtaining improvement measures concerning the handling of the tool or the physical exercise (hereinafter referred to as an improvement measure request). FIG. 7 is a diagram showing a configuration example of the improvement measure request. As shown in the figure, the improvement measure request includes a user ID, a mode, a purpose, and the like. The purpose is the objective for which the user seeks improvement, and can be, for example, "widen the joint range of motion", "increase muscle strength", "stabilize walking", or "improve a symptom or medical condition". The purpose is also selected from predetermined options, and symptoms and medical conditions may likewise be selected from predetermined options.
 The improvement measure information reception unit 117 receives information on improvement measures (hereinafter referred to as improvement measure information) transmitted from the server device 20 in response to the improvement measure request, and stores the received improvement measure information in the improvement measure storage unit 133. FIG. 8 shows a configuration example of the improvement measure information. As shown in the figure, the improvement measure information includes a purpose, advice, reference information, a training menu (the training menu itself, an appropriate training intensity, an appropriate number of repetitions, and a recommended posture for performing it such as standing or sitting, tailored to the individual's physical information), a number of times performed, and the like. In the present embodiment, the advice is assumed to be a character string describing the improvement measure, but it may be content that presents the improvement measure through images, videos, or the like. The reference information describes the preferred orientation and movement of the tool (the position, orientation, movement, speed, and so on of each part) and the preferred posture of the body (the position, angle, and so on of each part). The improvement measure information reception unit 117 may also receive, even without an improvement measure request, improvement measure information that the improvement measure information transmission unit 216 transmits based on the evaluation result and reference values.
 The improvement measure information display unit 118 displays the improvement measures. The improvement measure information display unit 118 displays the advice included in the improvement measure information. When the improvement measure information includes a preferred position or angle of a body part, the improvement measure information display unit 118 may extract from the moving image information the image of a frame in which that part is at the preferred angle and display it on the user terminal 10.
 FIG. 9 is a diagram showing a hardware configuration example of the server device 20. The server device 20 includes a CPU 201, a memory 202, a storage device 203, a communication interface 204, an input device 205, and an output device 206. The storage device 203 stores various data and programs and is, for example, a hard disk drive, a solid state drive, or a flash memory. The communication interface 204 is an interface for connecting to the communication network 40 and is, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, or a USB (Universal Serial Bus) connector or RS-232C connector for serial communication. The input device 205 inputs data and is, for example, a keyboard, a mouse, a touch panel, buttons, or a microphone. The output device 206 outputs data and is, for example, a display, a printer, or a speaker.
 FIG. 10 is a diagram showing a software configuration example of the server device 20. As shown in the figure, the server device 20 includes the following processing units: an evaluation request receiving unit 211, an image analysis unit 212, an evaluation unit 213, an evaluation information transmitting unit 214, an improvement plan request receiving unit 215, an improvement plan information transmitting unit 216, an implementation status information acquisition unit 217, and a document information generating unit 218; and the following storage units: an image data storage unit 231, a reference information storage unit 232, an evaluation condition information storage unit 233, an improvement plan information storage unit 234, an implementation status information storage unit 235, a template storage unit 236, and a generated document storage unit 237.
 Note that each of the functional units described above is realized by the CPU 201 of the server device 20 reading a program stored in the storage device 203 into the memory 202 and executing it, and each of the storage units described above is realized as part of the storage areas provided by the memory 202 and the storage device 203 of the server device 20.
 The evaluation request receiving unit 211 receives evaluation requests transmitted from the user terminal 10, and registers information including the image data contained in a received evaluation request (hereinafter referred to as image information) in the image data storage unit 231. FIG. 11 is a diagram showing a configuration example of the image information stored in the image data storage unit 231. As shown in the figure, the image information contains image data associated with the user ID of the user who was captured. The image data is the data that was included in the evaluation request.
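The registration of image information keyed by user ID can be sketched minimally as follows. The storage-unit interface (`ImageDataStore`, `register`, `images_for`) is a hypothetical assumption used only to illustrate the user-ID-to-image-data association of FIG. 11.

```python
class ImageDataStore:
    """Toy stand-in for the image data storage unit 231."""

    def __init__(self):
        self._records = []

    def register(self, user_id: str, image_data: bytes) -> None:
        # Each record associates the captured user's ID with the image
        # data that was included in the evaluation request.
        self._records.append({"user_id": user_id, "image": image_data})

    def images_for(self, user_id: str):
        # Retrieve all image data registered for a given user.
        return [r["image"] for r in self._records if r["user_id"] == user_id]

store = ImageDataStore()
store.register("user-001", b"\x89PNG...")
```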
 The reference information storage unit 232 stores information (hereinafter referred to as reference information) containing reference values relating to body movement, such as tool position, tool movement (the orientation, motion, etc. of the tool), posture (the positions, angles, etc. of body parts), and relationships derived from the relation between the tool and the body. FIG. 12 is a diagram showing a configuration example of the reference information stored in the reference information storage unit 232. As shown in the figure, the reference information includes, but is not limited to: information on the absolute position of each part of the tool and on how each part of the tool moved (movement speed, movement distance, direction of movement, etc.) when a body exercise using the tool was performed; reference information on the absolute position of each body part or its position relative to another body part or to another reference object (hereinafter referred to as position reference information); reference information on the angle formed, for three body parts including a joint part, by the straight lines connecting each of the other two parts to the joint part (hereinafter referred to as angle reference information); and information on the relationship between parts of the tool and parts of the body.
 The tool position reference information contains, in association with a mode and a checkpoint ID, a part of the tool and the reference position for that part. There may be more than one part. As for the position of a part, the vertical position may be, for example, the height above the ground or the distance from either foot. Alternatively, when the mode is "weightlifting", for example, it may be the distance from a body part or from a line connecting body parts, such as the distance between the shaft and the line connecting both shoulders. The horizontal position of a part may be the distance from a predetermined reference object (for example, a mound plate or a mark on the floor) or the distance from a reference body part such as the shoulder, chest, or foot. The position reference information is assumed to be registered in advance.
 The tool movement reference information contains, in association with a mode and a checkpoint ID, reference values for information such as the movement speed and movement distance of a part of the tool, the direction of movement at a given point in time, and the trajectory of movement over a given period.
 The position reference information contains, in association with a mode and a checkpoint ID, a body part and the reference position for that part. There may be more than one part. As for the position, the vertical position may be, for example, the height above the ground or the distance from either foot. The horizontal position may be the distance from a predetermined reference object (for example, a mound plate or a mark on the floor) or the distance from a reference body part such as the shoulder, chest, or foot. The position reference information is assumed to be registered in advance.
 The angle reference information contains, in association with a mode and a checkpoint ID, two body parts (part 1 and part 2), one joint part, and a reference value for the angle between the straight line connecting part 1 to the joint part and the straight line connecting part 2 to the joint part.
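The angle that the angle reference information refers to, formed at a joint part by the lines to two other parts, can be computed from 2D coordinates as below. This is a minimal sketch; the coordinate representation is an assumption, and the example points (shoulder, elbow, wrist) are illustrative.

```python
import math

def joint_angle(part1, joint, part2):
    """Angle in degrees formed at `joint` by the straight lines
    joint->part1 and joint->part2 (each point is an (x, y) tuple)."""
    v1 = (part1[0] - joint[0], part1[1] - joint[1])
    v2 = (part2[0] - joint[0], part2[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to avoid domain errors from floating-point rounding.
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))

# E.g. shoulder-elbow-wrist held at a right angle:
angle = joint_angle((0, 1), (0, 0), (1, 0))
```

The computed angle would then be compared against the reference value stored for the corresponding mode and checkpoint ID.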
 The relationship reference information contains, in association with a mode and a checkpoint ID, information on references expressed as relationships between parts of the tool and parts of the body, that is, information obtained from the movement speed, movement distance, angle, and the like of one or more tool parts and body parts. For example, when the mode is batting, the relationship reference information includes, as references, the movement speed of the tip of the bat at the moment the ball is met and the angle formed by the bat and the dominant arm holding it.
 The evaluation condition information storage unit 233 stores information for performing evaluations (hereinafter referred to as evaluation condition information). FIG. 13 is a diagram showing a configuration example of the evaluation condition information stored in the evaluation condition information storage unit 233. The evaluation condition information includes a category, a condition, an evaluation rank, and a comment. The category is the category of the evaluation, for example "muscle strength", "range of motion", or "endurance". The condition is a condition on the position, orientation, or movement (change of position over time) of each part of the tool in the image, or on the position or movement (change of position over time) of each part of the body. For example, when analyzing a weightlifting movement, conditions on the elbow angle and the speed of arm extension at the checkpoint of the moment the barbell is lifted, or conditions on the movement of the shaft and its vertical speed during the period in which the barbell is raised and lowered, can be set in the evaluation condition information. Likewise, when analyzing a pitching form, conditions on the elbow angle and the rotational speed of the arm at the checkpoint of releasing the ball can be set in the evaluation condition information. The evaluation rank is the evaluation value assigned when the condition is satisfied. The comment is a description of the body's posture or movement when the condition is satisfied.
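The lookup of an evaluation rank and comment against the evaluation condition information can be sketched as below. The representation of a condition as a Python predicate over measured values is an illustrative assumption; the entries themselves (elbow-angle thresholds, ranks "A"/"C") are hypothetical examples, not values from the patent.

```python
# Toy stand-in for entries of the evaluation condition information (FIG. 13).
EVALUATION_CONDITIONS = [
    {"category": "range of motion",
     "condition": lambda m: m.get("elbow_angle_deg", 0) >= 160,
     "rank": "A",
     "comment": "The arm is fully extended at release."},
    {"category": "range of motion",
     "condition": lambda m: m.get("elbow_angle_deg", 0) < 160,
     "rank": "C",
     "comment": "The elbow stays bent; extend the arm further."},
]

def evaluate(measurements: dict):
    """Return (rank, comment) of the first satisfied condition, else None."""
    for entry in EVALUATION_CONDITIONS:
        if entry["condition"](measurements):
            return entry["rank"], entry["comment"]
    return None

result = evaluate({"elbow_angle_deg": 170})
```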
 The image analysis unit 212 (part identification unit) analyzes the image data. The image analysis unit 212 analyzes the image data to extract feature values for each part of the tool and each part of the body, and identifies the position of each of those parts in the image. The image analysis unit 212 also analyzes the image data to extract feature values for each part of the tool and identifies the direction in which each part is facing. Since the image analysis unit 212 can employ common image analysis techniques, a detailed description is omitted here. The image analysis unit 212 may analyze the image data frame by frame or key frame by key frame, may analyze it checkpoint by checkpoint, or may analyze it at random timings.
 For each checkpoint ID, the image analysis unit 212 also compares the position of each body part extracted from the image data with the position reference information stored in the reference information storage unit 232, and identifies the time point at which they are closest as the time point of that checkpoint.
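The checkpoint identification above can be sketched as a nearest-match search over the analyzed frames. The frame format and the squared-distance metric are illustrative assumptions; any distance measure between extracted positions and the position reference information would serve.

```python
def find_checkpoint_frame(frames, reference):
    """frames: list of (time, {part: (x, y)}) extracted from the video.
    reference: {part: (x, y)} from the position reference information.
    Returns the time whose positions minimize total squared distance."""
    def distance(positions):
        return sum(
            (positions[p][0] - r[0]) ** 2 + (positions[p][1] - r[1]) ** 2
            for p, r in reference.items()
        )
    best_time, _ = min(((t, distance(pos)) for t, pos in frames),
                       key=lambda tp: tp[1])
    return best_time

# The frame at t=0.5 is closest to the reference wrist position (10, 10):
frames = [(0.0, {"wrist": (0, 0)}),
          (0.5, {"wrist": (9, 10)}),
          (1.0, {"wrist": (20, 20)})]
checkpoint_time = find_checkpoint_frame(frames, {"wrist": (10, 10)})
```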
 Here, the user's body exercise in the present embodiment also includes exercise that the user performs with the assistance of a supporter. From an image capturing such exercise, the image analysis unit 212 need only capture the user's body parts or the movement of the tool accompanying the exercise the user performs. The image analysis unit 212 may detect persons by, for example, analyzing the image data and extracting feature values, divide the image into a region for each person, and then identify each body part and each tool part of the user. Alternatively, the image analysis unit 212 may use instance segmentation to estimate each person's region at the pixel level and then perform posture estimation.
 When both the user and a supporter appear in the image, the image analysis unit 212 distinguishes the user from the supporter and analyzes the user's posture. As methods by which the image analysis unit 212 distinguishes the user from the supporter, for example, the person appearing largest in the video may be determined to be the user, or the person appearing closer to the center of the video may be determined to be the user. The image analysis unit 212 may also determine the user on the basis of a marker attached to the user's clothing, body surface, hair, or the like; conversely, a marker may be attached to the supporter so that the supporter is identified by the marker and a person who is not a supporter is determined to be the user. The image analysis unit 212 may also determine that a person holding or wearing a device used for training or the like is the user. The image analysis unit 212 may also identify the user by using common face recognition techniques. The image analysis unit 212 may also determine the user by recognizing characteristics of the user (for example, an elderly person or a care recipient). In that case, for example, the image analysis unit 212 may generate a determination model trained on the walking and movement characteristics of elderly people or of people with problems in specific body parts, and use the model to determine that a person exhibiting the characteristics of an elderly person or a care recipient is the user. Furthermore, the image analysis unit 212 may analyze the postures of all the people appearing in the video, present the user terminal or the supporter terminal with a function for selecting the user, accept the selection from the user or the supporter, and thereby identify the user. Note that the user determination methods described above are not limited to the case in which the user and a supporter appear in the image; they may also be used for the image analysis unit 212 to determine the user when the user and people other than the user appear in the video.
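One of the simplest discrimination methods mentioned above, treating the person appearing largest in the frame as the user, can be sketched as follows. The detection output format (a `bbox` of x, y, width, height per person) is an illustrative assumption about the upstream person detector.

```python
def pick_user(detections):
    """detections: list of dicts, each with a 'bbox' of (x, y, w, h).
    Returns the detection whose bounding box has the largest area,
    i.e. the person appearing largest in the video frame."""
    return max(detections, key=lambda d: d["bbox"][2] * d["bbox"][3])

people = [
    {"id": "A", "bbox": (10, 10, 50, 120)},  # smaller figure (supporter)
    {"id": "B", "bbox": (80, 5, 90, 200)},   # largest figure in frame
]
user = pick_user(people)
```

The center-proximity variant would differ only in the key function, ranking detections by distance from the frame center instead of by area.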
 Even when the user and a supporter both appear in the image and overlap each other in the image, the image analysis unit 212 identifies the supporter's body parts without mistaking them for the user's. For example, the image analysis unit 212 may output multiple joint-point candidates for each body part and, in post-processing, group together the candidates that belong to the same person as the candidates for other parts. The image analysis unit 212 may also acquire depth using a depth sensor and, on the basis of the depth information, group joint points as belonging to the same person. The image analysis unit 212 may also analyze images acquired from multiple imaging terminals, check whether the same part determined from each image is free of contradictions, and integrate the results to suppress false detections. The image analysis unit 212 may also accept, from the user terminal or the supporter terminal, a manual designation by the user or the supporter for the detected joint points and group them as belonging to the same person. The image analysis unit 212 may also group detected parts as belonging to the same person by using physical characteristics such as the joint lengths between the identified joint points, the joint ranges of motion (constraints on joint angles), and the degree of bending of the waist. The image analysis unit 212 may also track each joint using time-series information (with the supporter staying away from the user when recording starts). Alternatively, using the imaging terminal, the user or the supporter may wear something that serves as a marker, such as gloves, on limbs or other parts prone to false detection before the movement is captured, and the detected parts may be grouped as belonging to the same person by means of the marker. Note that the methods described above for grouping detected joint points, parts, and the like as belonging to the same person are not limited to the case in which the user and a supporter appear in the image; they may also be used for the image analysis unit 212 to group joint points and parts when the user and people other than the user appear in the video.
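The depth-based grouping mentioned above can be sketched as a toy clustering step: joint-point candidates whose sensor depths lie within a tolerance are assumed to belong to the same person. Real systems use learned part-affinity fields or instance masks; the candidate format and the 0.3 m tolerance here are illustrative assumptions.

```python
def group_by_depth(candidates, tolerance=0.3):
    """candidates: list of (joint_name, depth_in_meters) from a depth
    sensor. Returns lists of candidates grouped as the same person:
    sorted by depth, a candidate joins a group if it is within
    `tolerance` of that group's last-added candidate."""
    groups = []
    for joint, depth in sorted(candidates, key=lambda c: c[1]):
        for group in groups:
            if abs(group[-1][1] - depth) <= tolerance:
                group.append((joint, depth))
                break
        else:
            groups.append([(joint, depth)])
    return groups

# Two overlapping people: user at about 2 m, supporter at about 3 m.
cands = [("wrist", 2.0), ("elbow", 2.1), ("wrist", 3.0), ("elbow", 3.05)]
groups = group_by_depth(cands)
```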
 The evaluation unit 213 evaluates the movement of the tool used by the user on the basis of the image data. In the present embodiment, the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information containing conditions satisfied by the position and movement of each part of the tool identified from the image data, and if there is evaluation condition information whose conditions are satisfied, acquires the evaluation rank and comment contained in it. Note that the evaluation unit 213 may evaluate the movement of the tool and count the number of body movements.
 The evaluation unit 213 also evaluates the movement of the user's body on the basis of the image data. In the present embodiment, the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information containing conditions satisfied by the position and movement of each body part identified from the image data, and if there is evaluation condition information whose conditions are satisfied, acquires the evaluation rank and comment contained in it. Note that the evaluation unit 213 may evaluate the body movement and count the number of body movements.
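The counting of body movements mentioned above can be sketched as detecting repetitions in a tracked value over time, for example a joint angle crossing an "extended" threshold and returning below a "flexed" one. The threshold values and the angle series are illustrative assumptions, not values from the patent.

```python
def count_reps(angle_series, high=150.0, low=60.0):
    """Count repetitions in a time series of joint angles (degrees):
    one rep is recorded each time the angle rises to at least `high`
    (the extended phase) and then falls to at most `low` (back to the
    flexed phase)."""
    reps = 0
    extended = False
    for angle in angle_series:
        if not extended and angle >= high:
            extended = True       # reached the extended phase
        elif extended and angle <= low:
            extended = False      # returned to the flexed phase
            reps += 1
    return reps

# Two full extend-and-flex cycles:
series = [50, 100, 160, 100, 50, 90, 155, 80, 40]
reps = count_reps(series)
```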
 The evaluation unit 213 further evaluates the tool used by the user and the user's body movement together on the basis of the image data. In the present embodiment, the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information containing conditions satisfied by the positions of each part of the tool and each part of the body identified from the image data and by the movement of, or relationship between, those parts, and if there is evaluation condition information whose conditions are satisfied, acquires the evaluation rank and comment contained in it. Note that the evaluation unit 213 may evaluate the movement of the tool and the body and count the number of body movements.
 The evaluation information transmitting unit 214 transmits evaluation information to the user terminal 10. The evaluation information transmitting unit 214 generates tool position information containing the time points on the time axis of the video identified by the image analysis unit 212 and the positions of each part of the tool. Regarding the evaluation rank and comment acquired by the evaluation unit 213, when the position of a part of the tool satisfies a condition, the evaluation information transmitting unit 214 generates tool orientation information containing the time point, the part, and the tool orientation value together with the evaluation rank and comment; when the movement of a part (the change of its position over time) satisfies a condition, it generates tool movement information containing a list of time points, parts, and tool orientation values together with the evaluation rank and comment. The evaluation information transmitting unit 214 also generates checkpoint information containing the time point corresponding to each checkpoint analyzed by the image analysis unit 212 and the checkpoint ID indicating that checkpoint. The evaluation information transmitting unit 214 then creates evaluation information containing the generated tool position information, tool orientation information, tool movement information, and checkpoint information, and transmits it to the user terminal 10. Note that the evaluation unit 213 and the evaluation information transmitting unit 214 can correspond to the comment output unit of the present invention.
 Similarly, for the body, the evaluation information transmitting unit 214 generates position information containing the time points on the time axis of the video identified by the image analysis unit 212 and the positions of each body part. Regarding the evaluation rank and comment acquired by the evaluation unit 213, when the position of a body part satisfies a condition, the evaluation information transmitting unit 214 generates posture information containing the time point, the part, and the posture value together with the evaluation rank and comment; when the movement of a part (the change of its position over time) satisfies a condition, it generates movement information containing a list of time points, parts, and posture values together with the evaluation rank and comment. The evaluation information transmitting unit 214 also generates checkpoint information containing the time point corresponding to each checkpoint analyzed by the image analysis unit 212 and the checkpoint ID indicating that checkpoint. The evaluation information transmitting unit 214 then creates evaluation information containing the generated position information, posture information, movement information, and checkpoint information, and transmits it to the user terminal 10. Note that the evaluation unit 213 and the evaluation information transmitting unit 214 can correspond to the comment output unit of the present invention.
 The improvement plan information storage unit 234 stores information relating to improvement plans (hereinafter referred to as improvement plan information). FIG. 14 is a diagram showing a configuration example of the improvement plan information stored in the improvement plan information storage unit 234. As shown in the figure, the improvement plan information contains advice associated with a purpose, a category, and a condition. The condition may be a condition on the tool itself (the weight of the barbell, etc.), on how the tool is used, or on physical attributes (flexibility, etc.); it may also be a condition on the position, orientation, or movement of a part of the tool, or on the position or movement of a part of the body.
 The improvement plan request receiving unit 215 receives improvement plan requests transmitted from the user terminal 10. Note that the improvement plan request receiving unit 215 may also receive improvement plan requests from the supporter terminal 50.
 The improvement plan information transmitting unit 216 searches, among the improvement plan information corresponding to the mode and purpose contained in the improvement plan request, for entries whose conditions are satisfied by the user's physical information contained in the evaluation request and by the positions, orientations, movements, and the like of each tool part and each body part identified by the image analysis unit 212. The improvement plan information transmitting unit 216 acquires the advice of the retrieved improvement plan information, creates improvement plan information in which the purpose and the advice are set, and returns the created improvement plan information to the user terminal 10. The improvement plan information transmitting unit 216 also includes in the transmitted improvement plan information the positions, orientations, speeds, angles, and the like of each part contained in the reference information. Note that even without an improvement plan request, the improvement plan information transmitting unit 216 may search for improvement plans on the basis of the evaluation information and the reference values and transmit them to the user terminal 10.
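The search described above can be sketched as filtering improvement plan entries by mode and purpose and then by whether their condition is satisfied. The record layout loosely mirrors FIG. 14, but the field names, the predicate representation, and the example entries are all hypothetical assumptions.

```python
# Toy stand-in for entries of the improvement plan information (FIG. 14).
IMPROVEMENT_PLANS = [
    {"mode": "weightlifting", "purpose": "strength",
     "condition": lambda ctx: ctx["elbow_angle_deg"] < 160,
     "advice": "Lighten the barbell and extend the arms fully."},
    {"mode": "weightlifting", "purpose": "strength",
     "condition": lambda ctx: ctx["elbow_angle_deg"] >= 160,
     "advice": "Form is good; increase the load gradually."},
]

def search_plans(mode, purpose, context):
    """Return the advice of every entry matching the requested mode and
    purpose whose condition is satisfied by the analyzed values in
    `context` (e.g. physical information and measured angles)."""
    return [p["advice"] for p in IMPROVEMENT_PLANS
            if p["mode"] == mode and p["purpose"] == purpose
            and p["condition"](context)]

advice = search_plans("weightlifting", "strength", {"elbow_angle_deg": 140})
```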
 The improvement plan information transmitting unit 216 may transmit to the user terminal 10 improvement plan information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (wishes, goals, needs, etc.).
 Among the improvement plan information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (wishes, goals, needs, etc.), the improvement plan information transmitting unit 216 may search for entries whose conditions are satisfied by the user's physical information contained in the evaluation request, in particular information such as ADL, joint range of motion, required-support level, and required-care level, or by the positions, orientations, movements, and the like of each tool part and each body part identified by the image analysis unit 212.
 Among the improvement plan information corresponding to the user's motor function, daily-living function, and cognitive function, the improvement plan information transmitting unit 216 may search for entries whose conditions are satisfied by the user's physical information contained in the evaluation request, in particular information such as ADL, joint range of motion, required-support level, and required-care level, and by the positions, orientations, movements, and the like of each tool part and each body part identified by the image analysis unit 212.
 Among the improvement plan information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (wishes, goals, needs, etc.) and to the user's motor function, daily-living function, and cognitive function, the improvement plan information transmitting unit 216 may search for entries whose conditions are satisfied by the user's physical information contained in the evaluation request, in particular information such as ADL, joint range of motion, required-support level, and required-care level, and by the positions, orientations, movements, and the like of each tool part and each body part identified by the image analysis unit 212.
 Among the improvement plan information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (wishes, goals, needs, etc.) and to the items with low evaluations among the user's motor function, daily-living function, and cognitive function, the improvement plan information transmitting unit 216 may search for entries whose conditions are satisfied by the user's physical information contained in the evaluation request, in particular information such as ADL, joint range of motion, required-support level, and required-care level, and by the positions, orientations, movements, and the like of each tool part and each body part identified by the image analysis unit 212.
 The implementation status information acquisition unit 217 acquires, from the user terminal 10, the implementation status of the improvement measures (training menus such as rehabilitation menus and exercise menus) that were transmitted to the user terminal 10, and stores it in the implementation status information storage unit 235. The implementation status information acquisition unit 217 presents on the user terminal 10 a form for entering the implementation date, the number of sessions, and the like for each training menu, and acquires the information entered by the user or the user's supporter as implementation status information. The implementation status information acquisition unit 217 may also acquire images captured by the user terminal 10 and have the image analysis unit 212 analyze them to obtain information on whether the specified training menu was performed and how many times it was performed. In that case, the image analysis unit 212 compares stored information describing how each body site or tool part moves in each training menu against the movements of the sites and parts obtained by analyzing the user's images; a match indicates that the specified menu was performed, and the count can be taken from how many times the movement of each site or part was repeated. Furthermore, the evaluation unit 213 may evaluate the images to judge whether the training was performed effectively, and the evaluation information transmission unit 214 may display that evaluation result on the user terminal 10.
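The repetition counting described above can be illustrated with a small sketch. This is a minimal example, not the patented implementation: the function name, the use of a single joint-height trace, and the hysteresis thresholds are all assumptions made for illustration.

```python
def count_repetitions(heights, low, high):
    """Count repetitions in a 1-D time series of a joint's height.

    A repetition is counted each time the trace rises above `high` after
    having dropped below `low`; the hysteresis suppresses jitter around
    a single threshold.
    """
    reps = 0
    armed = False  # set once the joint has dropped below `low`
    for h in heights:
        if h < low:
            armed = True
        elif h > high and armed:
            reps += 1
            armed = False
    return reps

# Squat-like trace of hip height (metres), sampled over time
trace = [1.0, 0.8, 0.5, 0.4, 0.6, 0.9, 1.0, 0.7, 0.4, 0.8, 1.0]
print(count_repetitions(trace, low=0.5, high=0.9))  # → 2
```

In practice the per-menu movement patterns stored by the image analysis unit 212 would determine which joint trace and thresholds to use for each training menu.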
 The document information generation unit 218 generates documents to be submitted to the health and welfare departments of local governments and the like in order to receive coverage under long-term care insurance and the payment of subsidies in the care business. Such documents include applications and notifications for long-term care insurance, such as designation application documents, remuneration claim documents, and guidance/audit documents; examples include, but are not limited to, the documents required for additional payments such as the individual function training addition, the motor function improvement addition, the ADL maintenance addition, the living function improvement cooperation addition, and the scientific care promotion addition. The documents may also include those created and stored by the organization to which the supporter belongs, such as handovers, shift reports, charts (the user's condition, monitoring, etc.), training implementation records, diaries, care plans, assessment results, various plan documents, care provision schedules, and long-term care benefit statements (records leading to remuneration claims). The document information generation unit 218 acquires the template of the document to be generated from the template storage unit 236, fills the template with the corresponding information stored in the implementation status information storage unit 235 (user information, the content of care provided, and the training implementation status, such as what was performed and how many times), and may output the result to the supporter terminal 50 as a Word file, PDF file, or the like. The document information generation unit 218 may also cooperate with a management system and store the corresponding information held in the implementation status information storage unit 235, such as user information, implementation content, and the number of sessions, in the database of that management system.
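Pouring stored implementation data into a document template, as described for the document information generation unit 218, can be sketched with Python's standard `string.Template`. The field names and template text are invented for the example; real templates would come from the template storage unit 236.

```python
from string import Template

# Illustrative template; real templates are held in the template storage unit 236.
form = Template("User: $name / Training menu: $menu / Sessions performed: $count")

record = {"name": "Taro Yamada", "menu": "gait training", "count": 12}
document = form.substitute(record)
print(document)  # → User: Taro Yamada / Training menu: gait training / Sessions performed: 12
```

The filled-in string would then be rendered into a Word or PDF file for the supporter terminal 50, or written to an external management system's database.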
 FIG. 15 is a diagram showing an example of the flow of processing executed by the care support device of this embodiment.
 In the user terminal 10, the imaging unit 111 accepts an input of the mode, images the user's body during exercise, and acquires video data (S321). The evaluation request transmission unit 112 transmits to the server device 20 an evaluation request containing the user ID identifying the user, the accepted mode, the physical information, and the video data (S322).
 In the server device 20, when the evaluation request reception unit 211 receives the evaluation request, the image analysis unit 212 analyzes the video data to extract feature amounts (S323) and identifies the position of each part and each body site (S324). Here, the image analysis unit 212 may identify positions on the image, or may use the physical information to identify actual-size positions (height above the ground, distance from a reference point such as the body's center of gravity, etc.). The evaluation unit 213 acquires an evaluation rank and a comment from the evaluation condition information whose conditions are satisfied by the positions of the parts and sites and by their movements (time-series changes in position) (S325). The evaluation information transmission unit 214 creates evaluation information and transmits it to the user terminal 10 (S326).
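As a rough illustration of step S325, matching a measured value against evaluation condition information to obtain a rank and comment might look like the following. The knee-angle ranges, ranks, and comment strings are invented examples, not values from this disclosure.

```python
def evaluate(knee_angle_deg, conditions):
    """Return (rank, comment) from the first condition whose angle range
    contains the measured knee angle; each condition is (lo, hi, rank, comment)."""
    for lo, hi, rank, comment in conditions:
        if lo <= knee_angle_deg < hi:
            return rank, comment
    return None  # no condition matched

conditions = [
    (85, 95, "A", "good squat depth"),
    (95, 120, "B", "slightly shallow; bend the knees further"),
]
print(evaluate(90, conditions))   # → ('A', 'good squat depth')
print(evaluate(100, conditions))  # → ('B', 'slightly shallow; bend the knees further')
```

Real evaluation condition information would combine several such conditions over the positions and time-series movements of tool parts and body sites.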
 In the user terminal 10, the evaluation display unit 114 displays the position, orientation, movement, and so on of the tool over the video data based on the received evaluation information (S327). The evaluation display unit 114 may also display the positions of the body parts indicating the posture (a bone overlay), together with the evaluation rank and comments (S327). Here, the evaluation display unit 114 may graph the positions, orientations, and movements of tool parts, as well as time-series changes in the positions and movements of body sites. The checkpoint display unit 115 may also extract checkpoint images from the video and display them. The improvement measure request transmission unit 116 transmits an improvement measure request to the server device 20 in response to an instruction from the user (S328).
 In the server device 20, when the improvement measure request reception unit 215 receives the improvement measure request transmitted from the user terminal 10, the improvement measure information transmission unit 216 searches for improvement measure information whose conditions are satisfied, acquires the advice contained in the retrieved improvement measure information (S329), creates improvement measure information containing the acquired advice, and transmits it to the user terminal 10 (S330).
 When the improvement measure information reception unit 117 in the user terminal 10 receives the improvement measure information, the improvement measure information display unit 118 can display the advice contained in it and superimpose a preferred way of using the tool on the video data (S331).
 Alternatively, when the improvement measure information reception unit 117 receives the improvement measure information, the improvement measure information display unit 118 can display the advice contained in it and superimpose a preferred body posture, in the form of bones, on the video data (S331).
 In the user terminal 10, the imaging unit 111 then images the user's body during exercise and acquires further video data. Through the evaluation request transmission unit 112, the user terminal 10 may transmit to the server device 20 an evaluation request containing the user ID identifying the user, the accepted mode, the physical information, and the video data, or may transmit only the video data capturing the body movement (S332). Upon receiving the video data, the implementation status information acquisition unit 217 acquires implementation status information based on the analysis of the video data by the image analysis unit 212 and the evaluation by the evaluation unit 213 (S333). The document information generation unit 218 generates document information based on at least this implementation status information and a document template stored in the server device 20 (S334).
 As described above, the care support device of this embodiment makes it easy to evaluate physical exercise. In particular, for physical exercise related to sports, the positional relationships and movements of each part of the tool and each site of the body can be evaluated, which readily leads to concrete improvement efforts and can be expected to improve performance. Since the care support device of this embodiment also provides comments and advice, the user can easily grasp both the current situation and the measures for improvement.
 FIG. 16 is a diagram showing an example of a screen displaying an evaluation of physical exercise using a tool, for video captured in the weightlifting mode. As shown in FIG. 16, the screen 41 displays a mark 411 indicating the position of the barbell shaft, and the movement of the shaft is indicated by a line 412.
 FIG. 17 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool, again for video captured in the weightlifting mode. As results of the evaluation performed by the evaluation unit 213 in the weightlifting mode, the figure displays, for example, the inclination of the barbell shaft and the shaft's travel distance and speed (line 421). FIG. 17 also displays a reference value (line 422) and the actual measurement result (line 423); the difference from the reference value may additionally be shown numerically or as a graph. The user can refer to this display to consider which movements and postures to correct.
 FIG. 18 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool, again for video captured in the weightlifting mode. As results of the evaluation performed by the evaluation unit 213 in the weightlifting mode, the figure displays, for example, a reference value (line 431), such as the angle of a body joint at the lowest point of the barbell, together with the evaluation result (line 432). The difference from the reference value may be shown numerically or as a graph, and an evaluation based on the relationship between the body and the tool may also be displayed. The user can refer to this display to consider which movements and postures to correct.
 FIG. 19 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool, again for video captured in the weightlifting mode. As a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, line 441 shows a bone overlay produced by connecting predetermined positions of the tool parts and body sites identified from the image; the bones may be superimposed on the captured image. Line 442 shows the acceleration of each body site. A count, such as how many times the weight was lifted, may also be displayed. In addition, as shown at line 443, the next training suited to the evaluation result and the purpose may be presented.
 FIG. 20 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool, again for video captured in the weightlifting mode. As a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, an evaluation obtained by comparison with various reference values is displayed (line 441).
 Although the present embodiment has been described above, the embodiment is intended to facilitate understanding of the present invention and not to limit its interpretation. The present invention may be modified and improved without departing from its spirit, and equivalents are also encompassed by it.
 For example, in this embodiment the server device 20 analyzes the images, but this is not limiting: the user terminal 10 may analyze the images and identify the positional relationships of the parts and sites.
 In this embodiment the positions of parts and sites were assumed to be positions on a two-dimensional image, but three-dimensional positions may be used instead. For example, when the user terminal 10 has a depth camera in addition to the camera 106, the three-dimensional positions of parts and sites can be identified from the image from the camera 106 and the depth map from the depth camera. Alternatively, three dimensions may be estimated from a two-dimensional image to identify the three-dimensional positions. A depth camera may also be provided in place of the camera 106, and the three-dimensional positions identified from the depth map alone. In that case, the depth map may be transmitted from the user terminal 10 to the server device 20 together with, or instead of, the image data, and the image analysis unit 212 of the server device 20 may analyze the three-dimensional positions.
 In this embodiment an image of the user's body during exercise with a tool is transmitted from the user terminal 10 to the server device 20, but this is not limiting. The user terminal 10 may extract feature amounts from the image and transmit the feature amounts to the server device 20. Alternatively, the user terminal 10 may estimate the tool parts and body sites from the feature amounts and transmit to the server device 20 their absolute positions (positions in the XY coordinates of the image, actual-size distances from a reference position such as the ground, the toes, the head, or the body's center of gravity, or positions in any other coordinate system) or the relative positional relationships between parts, between sites, or between parts and sites.
 In this embodiment the improvement measure information consists of content prepared on the server device 20 side, but this is not limiting: for example, a reference value may be included, and marks or bones representing the correct movement or posture based on the reference value (the positions and orientations of tool parts, the positions and angles of body sites, etc.) may be superimposed on the video or on a still image extracted from it. This makes it easy to grasp what movement or posture should be adopted.
 In this embodiment the evaluation covers the parts and orientation of the tool and the positions or movements (positions over time) of body sites, but this is not limiting: the position of a tool worn by the user may also be identified and evaluated.
 In this embodiment, content such as advice is provided as the improvement measure, but, for example, tool recommendations may also be made. In that case, the server device 20 stores, in association with the user's physical information (height, weight, etc.), a tool and a reference value for the tool's size (length, etc.). It extracts the feature amounts of the tool the user is using from the image data to identify the tool's shape, estimates the tool's size from that shape and the user's measurements (for example, height) included in the physical information, and, if the difference between the estimated size and the reference value is at least a predetermined threshold, recommends a tool of the reference size. Furthermore, a tool suited to the purpose may be recommended based on information such as conditions on the tool itself (the weight of the barbell, etc.), how the tool is used, physical conditions (flexibility, etc.), and the positions, orientations, and movements of the tool's parts.
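The size-based recommendation above can be reduced to a lookup against a reference table plus a threshold test. This sketch assumes a table of height bands mapped to a reference cane length; the band boundaries, lengths, and the 5 cm threshold are invented for illustration.

```python
def recommend_length(height_cm, measured_cm, reference_table, threshold_cm=5.0):
    """Recommend the reference tool length for the user's height band when
    the measured tool deviates from it by at least threshold_cm."""
    for lo, hi, ref_cm in reference_table:
        if lo <= height_cm < hi:
            if abs(measured_cm - ref_cm) >= threshold_cm:
                return ref_cm    # recommend a tool of the reference size
            return None          # the current tool is close enough
    return None                  # no reference stored for this height

# Height band (cm) -> reference cane length (cm); values are invented
bands = [(150, 160, 78.0), (160, 170, 82.0), (170, 180, 87.0)]
print(recommend_length(165, 90.0, bands))  # → 82.0
```

The measured length would in practice come from the tool shape estimated by the image analysis, scaled by the user's known height.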
 In this embodiment, content such as advice is provided as the improvement measure, but, for example, the physical exercise being performed may be interrupted. In that case, the server device 20 stores, in association with the user's physical information (purpose, height, weight, etc.), reference values at which physical exercise should be interrupted, and interrupts the exercise when the number of repetitions, the speed, and so on determined from the image data deviate from the reference values (for example, when the speed of lifting the barbell drops sharply, or too many repetitions are performed at once). In that case, a comment telling the user to stop may be issued to the user terminal 10, the user may be notified by changing the display (for example, by blanking the screen), a sound such as an alert tone may be emitted, or the user may be notified by vibration.
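One way to state the "speed has dropped sharply" criterion is to compare the latest lift against the first one. This is a minimal sketch under the assumption that lift speed can be approximated as the inverse of each lift's duration; the 0.6 ratio threshold is an invented example value.

```python
def should_interrupt(lift_durations, min_speed_ratio=0.6):
    """Signal an interruption when the latest lift is markedly slower than
    the first (speed is approximated as the inverse of each lift's duration)."""
    if len(lift_durations) < 2:
        return False
    return lift_durations[0] / lift_durations[-1] < min_speed_ratio

# Durations (seconds) of successive barbell lifts
print(should_interrupt([1.2, 1.3, 2.4]))  # → True: speed fell to half of the first lift
```

A repetition-count limit could be tested the same way, against a per-user reference value.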
 In this embodiment, content such as advice is provided as the improvement measure, but, for example, a disease determination and physical exercise aimed at improvement may be presented. In that case, the server device 20 extracts candidate diseases the user may have developed from the symptoms the user entered in the physical information and from the evaluation information, and presents screening tests for narrowing them down. Once the user has taken the screening tests and the candidate diseases have been narrowed down, the device may recommend seeing a doctor, physical exercise aimed at improvement, tools for performing that exercise, or goods such as meals.
 By estimating the positions of the tool's parts, the server device 20 can also estimate the tool's speed, acceleration, travel distance, trajectory, and so on. Furthermore, by extracting patterns in the time-series changes of the tool's position, the server device 20 can estimate the number of occurrences of the pattern as the number of actions performed with the tool.
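Estimating speed and acceleration from the time series of a tool part's position reduces to finite differences. The sampling interval and position values below are illustrative only.

```python
def kinematics(positions, dt):
    """Finite-difference estimates of speed and acceleration from
    1-D positions sampled every dt seconds."""
    speeds = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    accels = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    return speeds, accels

pos = [0.0, 0.1, 0.3, 0.6]          # barbell height in metres, every 0.1 s
v, a = kinematics(pos, dt=0.1)
print([round(x, 6) for x in v])     # → [1.0, 2.0, 3.0]
print([round(x, 6) for x in a])     # → [10.0, 10.0]
```

Travel distance and trajectory follow directly by accumulating the per-sample displacements.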
 In this embodiment the exercise is evaluated, but this is not limiting: when a certain posture or movement is detected, a task for that movement may be proposed. In that case, the server device 20 stores tasks, instead of evaluation comments, in association with a single posture or movement or a series of them, and outputs the relevant task.
 Likewise, when a certain tool movement, tool orientation, posture, or body-site movement is detected, content for improving the physical exercise according to the purpose may be presented, such as training to be performed, rehabilitation, performance, or preparatory steps such as stretching, strength training, and posture work. In that case, the server device 20 stores the training or other content to be performed, instead of evaluation comments, in association with a single movement of a tool part, orientation of a tool part, body posture, or movement of a body site, or a series of them, and outputs that content.
 In this embodiment the exercise is evaluated, but the movement the user performed may also be detected automatically. In that case, the server device 20 stores, as reference information, the positions and orientations of each part of the tool and the positions of each body site for predetermined movements such as a shot or a pass, and identifies the movement performed by the user in the image by comparing the positions of the tool parts and body sites analyzed from the image against the reference information.
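Comparing an observed pose against stored reference poses, as in the automatic movement detection just described, can be sketched as a nearest-neighbour lookup. The pose encoding (a flat tuple of joint coordinates), the joint choice, and the reference values are all invented for this example.

```python
def detect_action(pose, references):
    """Return the name of the stored reference pose closest (Euclidean
    distance) to the observed pose; poses are flat tuples of coordinates."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return min(references, key=lambda name: dist(pose, references[name]))

# Invented reference poses: (shoulder_x, shoulder_y, wrist_x, wrist_y)
references = {"shoot": (0.0, 1.8, 0.4, 1.6), "pass": (0.0, 1.2, 0.6, 1.2)}
print(detect_action((0.1, 1.7, 0.4, 1.5), references))  # → shoot
```

A production system would compare sequences of poses over time rather than a single frame.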
 In this embodiment, previously captured images are analyzed to evaluate the exercise, but this is not limiting: analysis may instead be performed in real time, and when a predetermined movement is detected, the posture or stance to adopt next, the sites and directions to move, the number of repetitions, and so on may be recommended. In that case, correct postures and stances are stored in association with postures or movements, instead of evaluation comments; the correct posture or stance and the difference between it and the posture at that moment are computed in real time, and the movement needed to close that difference is output.
 In this embodiment, the improvement measure information is extracted from the measures stored in the improvement measure information storage unit 234 based on the mode, the purpose, the physical information, or the evaluation results produced by the evaluation unit 213, and is presented on the user terminal 10. Alternatively, for example, the evaluation unit 213 may evaluate images of training performed several times at different points in time and judge whether the evaluation results have improved, and based on that judgment the improvement measure information transmission unit 216 may present on the user terminal 10 a training menu different from the previous one, or an increase or decrease in the number of repetitions. In this case, the improvement measure information storage unit 234 compares the result of the evaluation unit 213 evaluating the user's physical exercise from a first image containing that exercise with the result of evaluating the user's physical exercise from a second image captured at a later point in time, and judges that the user has improved when the evaluation based on the second image is closer to the reference value than the evaluation based on the first image. The improvement measure information storage unit 234 judges that the user has not improved when the evaluation based on the second image is farther from the reference value than the evaluation based on the first image, and judges that there is no change when the two evaluations differ by no more than a predetermined value. The improvement measure information storage unit 234 may then re-extract improvement measures based on the mode, the purpose, the physical information, and the evaluation results of the evaluation unit 213, together with the judgment of improved, not improved, or unchanged, and present them on the user terminal 10.
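The three-way improved / not improved / unchanged judgment described here amounts to comparing the two evaluations' distances to the reference value. This is a minimal sketch; the score scale, the reference value, and the `eps` tolerance are invented for illustration.

```python
def judge_progress(first_score, second_score, reference, eps=0.5):
    """'improved' if the later evaluation is closer to the reference value,
    'declined' if farther, 'unchanged' if the distances differ by < eps."""
    d1 = abs(first_score - reference)
    d2 = abs(second_score - reference)
    if abs(d1 - d2) < eps:
        return "unchanged"
    return "improved" if d2 < d1 else "declined"

print(judge_progress(70, 85, reference=90))  # → improved
```

The resulting label would then feed back into the search conditions used to re-extract improvement measures.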
 In this embodiment, the predetermined functions are executed and the information is stored in the user terminal 10 or the server device 20, but this is not limiting: either one of the devices alone may execute the functions and store the information. Alternatively, the functional units and storage units may be distributed in a form different from this embodiment.
  10  User terminal
  20  Server device
  30  Imaging terminal
  40  Communication network
  50  Supporter terminal
  111 Imaging unit
  112 Evaluation request transmission unit
  113 Evaluation information reception unit
  114 Evaluation display unit
  115 Checkpoint display unit
  116 Improvement measure request transmission unit
  117 Improvement measure information reception unit
  118 Improvement measure information display unit
  130 Physical information storage unit
  131 Image storage unit
  132 Evaluation information storage unit
  133 Improvement measure storage unit
  211 Evaluation request reception unit
  212 Image analysis unit
  213 Evaluation unit
  214 Evaluation information transmission unit
  215 Improvement measure request reception unit
  216 Improvement measure information transmission unit
  217 Implementation status information acquisition unit
  218 Document information generation unit
  231 Image data storage unit
  232 Reference information storage unit
  233 Evaluation condition information storage unit
  234 Improvement measure information storage unit
  235 Group analysis information storage unit

Claims (5)

  1.  A care support device that supports care work, comprising:
     an evaluation unit that evaluates a user's physical exercise based on first image information including the physical exercise;
     an improvement measure information transmission unit that selects a suitable training menu based on the evaluation of the physical exercise and presents it to a supporter; and
     a document information generation unit that generates document information based on at least the state of implementation of the training menu by the user.
  2.  The care support device according to claim 1, further comprising:
     an implementation status information acquisition unit that acquires implementation status information of the training menu based on at least second image information of the user,
     wherein the document information generation unit generates the document information based on at least the implementation status information.
  3.  The care support device according to claim 1 or 2, wherein the improvement measure information transmission unit reselects the training menu based on the first image information and the second image information.
  4.  A care support program that supports care work, the program causing a processor to execute:
     an evaluation step of evaluating a user's physical exercise based on first image information including the physical exercise;
     an improvement measure information transmission step of selecting a suitable training menu based on the evaluation of the physical exercise and presenting it to a supporter; and
     a document information generation step of generating document information based on at least the state of implementation of the training menu by the user.
  5.  A care support method for supporting care work, wherein a processor performs:
     an evaluation step of evaluating a user's physical exercise based on first image information including the physical exercise;
     an improvement measure information transmission step of selecting a suitable training menu based on the evaluation of the physical exercise and presenting it to a supporter; and
     a document information generation step of generating document information based on at least the state of implementation of the training menu by the user.

PCT/JP2022/048136 2021-12-28 2022-12-27 Care support device, care support program, and care support method WO2023127870A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021213725A JP2023097545A (en) 2021-12-28 2021-12-28 Care support device, care support program, and care support method
JP2021-213725 2021-12-28

Publications (1)

Publication Number Publication Date
WO2023127870A1 true WO2023127870A1 (en) 2023-07-06

Family

ID=86999037

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/048136 WO2023127870A1 (en) 2021-12-28 2022-12-27 Care support device, care support program, and care support method

Country Status (2)

Country Link
JP (1) JP2023097545A (en)
WO (1) WO2023127870A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019008771A1 (en) * 2017-07-07 2019-01-10 りか 高木 Guidance process management system for treatment and/or exercise, and program, computer device and method for managing guidance process for treatment and/or exercise
WO2019022102A1 (en) * 2017-07-25 2019-01-31 パナソニックIpマネジメント株式会社 Activity assistant method, program, and activity assistant system
WO2020107097A1 (en) * 2018-11-27 2020-06-04 Bodybuddy Algorithms Inc. Systems and methods for providing personalized workout and diet plans
JP2021117553A (en) * 2020-01-22 2021-08-10 株式会社ジェイテクト Exercise evaluation system and server system
JP2021529368A (en) * 2018-06-21 2021-10-28 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Virtual environment for physiotherapy

Also Published As

Publication number Publication date
JP2023097545A (en) 2023-07-10

Similar Documents

Publication Publication Date Title
US10973439B2 (en) Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10314536B2 (en) Method and system for delivering biomechanical feedback to human and object motion
US11037369B2 (en) Virtual or augmented reality rehabilitation
US9700242B2 (en) Motion information processing apparatus and method
CN104274183B (en) Action message processing unit
US9510789B2 (en) Motion analysis method
CN101489481A (en) Health management device
Dolatabadi et al. The toronto rehab stroke pose dataset to detect compensation during stroke rehabilitation therapy
JP7008342B2 (en) Exercise evaluation system
JP2020174910A (en) Exercise support system
Olugbade et al. Human observer and automatic assessment of movement related self-efficacy in chronic pain: from exercise to functional activity
JP7492722B2 (en) Exercise evaluation system
CN111883229B (en) Intelligent movement guidance method and system based on visual AI
Shi et al. A VR-based user interface for the upper limb rehabilitation
Gauthier et al. Human movement quantification using Kinect for in-home physical exercise monitoring
JP2016035651A (en) Home rehabilitation system
WO2021261529A1 (en) Physical exercise assistance system
JP6439106B2 (en) Body strain checker, body strain check method and program
WO2022030619A1 (en) Guidance support system
WO2023127870A1 (en) Care support device, care support program, and care support method
Durve et al. Machine learning approach for physiotherapy assessment
JP2021068069A (en) Providing method for unmanned training
WO2023007930A1 (en) Determination method, determination device, and determination system
US20240046510A1 (en) Approaches to independently detecting presence and estimating pose of body parts in digital images and systems for implementing the same
Diwyanjalee et al. Predicting Player’s Healthiness Using Machine Learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22916093

Country of ref document: EP

Kind code of ref document: A1