WO2022107355A1 - Program, method, information processing device, and system - Google Patents


Info

Publication number
WO2022107355A1
WO2022107355A1 (PCT/JP2021/015226)
Authority
WO
WIPO (PCT)
Prior art keywords
person
information
measured
dimensional model
movement
Prior art date
Application number
PCT/JP2021/015226
Other languages
French (fr)
Japanese (ja)
Inventor
Shunji Sugaya (菅谷 俊二)
Original Assignee
OPTiM Corporation (株式会社オプティム)
Priority date
Filing date
Publication date
Application filed by OPTiM Corporation (株式会社オプティム)
Publication of WO2022107355A1 publication Critical patent/WO2022107355A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 69/00 Training appliances or apparatus for special sports
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion

Definitions

  • This disclosure relates to programs, methods, information processing devices, and systems.
  • Patent Document 1 describes a motion evaluation device that generates, from a plurality of continuously captured images of a moving subject, a composite image in which changes in the subject's movement can be identified in time series, and that displays the composite image in association with motion data output from a sensor attached to the subject, the motion data being displayed in a time series corresponding to the subject's movement.
  • With the device of Patent Document 1, analyzing the movements of a plurality of people requires storing moving images and sensing data for all of those people, which in turn requires preparing large-capacity storage. Further, when analyzing a movement of the same person that differs from a previous one, the same processing must be executed again, which increases the processing load.
  • Therefore, an object of the present disclosure is to analyze a person's movement using a photographing device, a sensor, and the like, with a reduced processing load.
  • According to one embodiment, a program is provided that is executed by a computer including a processor and a memory. The program causes the processor to execute: a step of acquiring personal information of a person to be measured; a step of creating a three-dimensional model based on the acquired personal information; a step of acquiring motion information of the person to be measured and sensing information; a step of moving the three-dimensional model based on the motion information and the sensing information; a step of storing the movement of the three-dimensional model in a storage means; and a step of analyzing the stored movement.
  • FIG. 1 is a diagram showing the overall configuration of the system 1. FIG. 2 is a diagram showing an example of sensing the body of the person to be measured in the system 1. FIG. 3 is a diagram showing the functional configuration of the server 20. FIG. 4 is a diagram showing the data structures of the patient information database 2021, the operation log information database 2022, and the medical care information database 2023 stored by the server 20. FIG. 5 is a diagram showing an overview of the devices constituting the system 1. FIG. 6 is a flowchart showing the process in which the server 20 creates a three-dimensional model based on the acquired personal information of the person to be measured.
  • FIG. 7 is a flowchart showing a process in which the server 20 acquires motion information of the person to be measured and sensing information from the motion sensor, and moves the three-dimensional model based on the acquired information. FIG. 8 is a flowchart showing the process in which the server 20 stores the movement of the three-dimensional model in a storage means, analyzes the stored movement, and transmits the analysis result to the user's terminal device. Also shown is an example of a screen presenting the result of comparing the current movement of the three-dimensional model of the person to be measured with its movement at a predetermined time in the past.
  • Further shown are an example of a screen presenting the result of comparing the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of another patient who suffered from the same disease and underwent the same treatment, and an example of a screen presenting to the user the result of comparing the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of a patient who suffered from the same disease, underwent a different treatment, and had a good course.
  • System 1 creates a three-dimensional model based on the personal information of the person to be measured.
  • the personal information includes, for example, information on the body of the person to be measured, such as height, weight, and physique of the person to be measured.
  • the system 1 moves the created three-dimensional model based on the sensing information sensed by a sensor attached to a part of the body of the person to be measured and the motion information obtained by photographing the person to be measured with the photographing apparatus.
  • System 1 analyzes the operation log produced when the three-dimensional model is moved. As a result, the system 1 can analyze the movement of the person to be measured with a small processing load and a small amount of communication.
  • System 1 can be installed in, for example, a medical facility such as a hospital.
  • For example, a doctor or the like may introduce the system 1 in order to efficiently manage a patient's rehabilitation progress, future treatment policy, and the like. The system 1 may also be installed in a sports facility such as a fitness gym.
  • FIG. 1 is a diagram showing the overall configuration of the system 1.
  • the system 1 includes a server 20, an edge server 30, a photographing device 40, and a sensor 50.
  • the server 20 and the edge server 30 communicate with each other via the network 80.
  • the edge server 30 is connected to the photographing device 40 and the sensor 50.
  • the photographing device 40 and the sensor 50 are transmission / reception devices based on a communication standard used in a short-range communication system between information devices.
  • the photographing device 40 and the sensor 50 use the 2.4 GHz band, for example with a Bluetooth (registered trademark) module, to receive beacon signals from other information devices equipped with a Bluetooth (registered trademark) module.
  • the edge server 30 acquires the information transmitted from the photographing device 40 and the sensor 50 based on the beacon signal, using this short-range communication. In this way, the photographing device 40 and the sensor 50 transmit the acquired motion information of the person to be measured to the edge server 30 by short-range communication without going through the network 80.
  • the edge server 30 may communicate with the photographing device 40 and the sensor 50 via the network 80.
  • the server 20 manages the information of each patient.
  • the patient information includes, for example, personal information of the patient, information on the operation log of the three-dimensional model, medical information, and the like.
  • the server 20 creates a three-dimensional model based on the personal information of the person to be measured acquired from the user.
  • the server 20 moves the created three-dimensional model based on the information acquired from the edge server 30, and analyzes the movement.
  • the server 20 shown in FIG. 1 has a communication IF 22, an input / output IF 23, a memory 25, a storage 26, and a processor 29.
  • the communication IF 22 is an interface for inputting and outputting signals so that the server 20 can communicate with external devices.
  • the input / output IF 23 functions as an interface with an input device for receiving an input operation from the user and an output device for presenting information to the user.
  • the memory 25 is for temporarily storing a program, data processed by the program, or the like, and is, for example, a volatile memory such as a DRAM (Dynamic Random Access Memory).
  • the storage 26 is a storage device for storing data, for example, a flash memory or an HDD (Hard Disk Drive).
  • the processor 29 is hardware for executing an instruction set described in a program, and is composed of an arithmetic unit, registers, peripheral circuits, and the like.
  • each device can be regarded as an information processing device. That is, an aggregate of the devices can be regarded as one "information processing device", and the system 1 may be formed as an aggregate of a plurality of devices.
  • how the plurality of functions required to realize the system 1 according to the present embodiment are allocated to one or more pieces of hardware can be determined as appropriate, in consideration of the processing capacity of each piece of hardware and/or the specifications required of the system 1.
  • the edge server 30 receives the information transmitted from the photographing device 40 and the sensor 50, and transmits the received information to the server 20. Further, the edge server 30 transmits the information acquired from the server 20 to the photographing device 40 and the sensor 50.
  • the information acquired from the server 20 includes, for example, information for updating the settings of the photographing device 40 or the sensor 50.
  • the photographing device 40 is a device that receives light with a light-receiving element and outputs it as a captured image.
  • as the photographing device 40, for example, any of the following devices is assumed: a visible light camera, an infrared camera, an ultraviolet camera, an ultrasonic sensor, an RGB-D camera, or LiDAR (Light Detection and Ranging).
  • the photographing device 40 may extract motion information from the moving image by photographing the movement of the body of the person to be measured and analyzing the captured moving image.
  • the photographing device 40 uses the trained model to extract motion information from a moving image in which the state of the person to be measured is constantly photographed, and transmits the motion information to the edge server 30.
  • the trained model may be created, for example, as follows. After the photographing device 40 captures a moving image of the movement of a patient serving as the person to be measured, a doctor or the like analyzes the acquired moving image and specifies motion information from it. For example, a doctor or the like creates learning data by associating information such as a vector representing the movement with the movement of the portion of the captured moving image to which the sensor 50 is attached.
  • the trained model is created by having the machine learning model perform machine learning according to the model learning program based on the training data.
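  • The learning-data step described above can be sketched as pairing image-derived features with the motion information a doctor associated with each frame. The feature values and the linear stand-in model below are illustrative assumptions, not the publication's actual machine learning model:

```python
import numpy as np

# Hypothetical training pairs: features extracted from video frames around the
# marker (the mounted sensor 50) vs. the motion value a doctor associated
# with that frame when creating the learning data.
frame_features = np.array([[0.0, 0.1], [0.2, 0.1], [0.4, 0.3], [0.6, 0.5]])
motion_vectors = np.array([[0.0], [0.2], [0.5], [0.8]])  # e.g. displacement

# Fit a minimal linear model W such that features @ W ≈ motion, standing in
# for the machine learning performed according to the model learning program.
W, *_ = np.linalg.lstsq(frame_features, motion_vectors, rcond=None)

def predict_motion(features: np.ndarray) -> np.ndarray:
    """Infer motion information from new frame features (the trained model's role)."""
    return features @ W
```

  • In use, the camera-side software would call `predict_motion` on features of each captured frame to obtain the motion information sent to the edge server; any real deployment would replace the linear fit with the trained model the publication refers to.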
  • the photographing device 40 may photograph the person to be measured so as to include the sensor 50 attached to the person, and acquire the motion information of the person to be measured using the sensor 50 appearing in the image as a marker. Further, the photographing device 40 may transmit a moving image of the person to be measured to the edge server 30.
  • the image analysis may be performed by the edge server 30.
  • the trained model is stored in, for example, the edge server 30.
  • the edge server 30 acquires a moving image taken by the photographing device 40.
  • the edge server 30 performs image analysis using the trained model on the acquired moving image, and acquires motion information.
  • the edge server 30 transmits the acquired operation information to the server 20.
  • the sensor 50 is a motion sensor that is attached to a part of the body of the person to be measured and detects information about the movement of that body. For example, the sensor 50 detects information such as the inclination, movement speed, and movement direction of the body of the person to be measured.
  • the sensor 50 is realized by, for example, a gyro sensor, an acceleration sensor, or the like.
  • the sensor 50 transmits various detected sensing information to the edge server 30.
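  • As an illustrative sketch (not part of the publication), the sensing information that the sensor 50 transmits to the edge server 30 could be structured and serialized as follows; all field names, units, and values are assumptions:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class SensorReading:
    """One sample from a body-mounted motion sensor (hypothetical schema)."""
    sensor_id: str           # which sensor 50 on the body, e.g. "knee-front"
    timestamp: float         # seconds since epoch
    angular_velocity: tuple  # (x, y, z) in deg/s, from the gyro sensor
    acceleration: tuple      # (x, y, z) in mm/s^2, from the acceleration sensor

def encode_reading(r: SensorReading) -> str:
    """Serialize one reading for short-range transmission to the edge server."""
    return json.dumps(asdict(r))

reading = SensorReading("knee-front", 1_700_000_000.0,
                        (1.2, -0.4, 0.0), (30.0, 5.0, -9800.0))
payload = encode_reading(reading)
```

  • The edge server would decode such payloads and forward them to the server 20; the concrete wire format used over Bluetooth is not specified in the publication.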
  • FIG. 2 is a schematic diagram showing an installation example of the photographing device 40 and the sensor 50.
  • the sensor 50 is attached to, for example, a part of the body of the subject.
  • for example, a plurality of sensors 50 are attached around the knee.
  • the attachment site is, for example, any of the medial thigh, anterior thigh, lateral thigh, anterior knee, medial lower leg, anterior lower leg, lateral lower leg, and the like.
  • the photographing device 40 is installed at a position in a predetermined facility from which the person to be measured can be photographed. For example, it is desirable to install the photographing device 40 in a high place, where the view of the person to be measured is less likely to be blocked.
  • FIG. 3 is a diagram showing a functional configuration of the server 20. As shown in FIG. 3, the server 20 functions as a communication unit 201, a storage unit 202, and a control unit 203.
  • the communication unit 201 performs processing for the server 20 to communicate with an external device such as an edge server 30.
  • the storage unit 202 stores data and programs used by the server 20.
  • the storage unit 202 stores the patient information database 2021, the operation log information database 2022, the medical care information database 2023, and the like.
  • the patient information database 2021 stores information on each patient sensed by the system 1. Details will be described later.
  • the operation log information database 2022 stores information on the operation logs generated when the server 20 operates the three-dimensional model based on the information acquired from the sensor 50. Details will be described later.
  • the medical care information database 2023 stores information on the medical treatment results of each patient. Details will be described later.
  • the control unit 203 exerts the functions shown as various modules by the processor of the server 20 performing processing according to the program.
  • the reception control module 2031 controls the process in which the server 20 receives a signal from an external device according to a communication protocol.
  • the transmission control module 2032 controls a process in which the server 20 transmits a signal to an external device according to a communication protocol.
  • the model creation module 2033 creates a three-dimensional model based on the personal information of the person to be measured input by the user.
  • the three-dimensional model is provided, for example, so that each part can be freely moved.
  • the model creation module 2033 may create a three-dimensional model based on an image of the person to be measured taken by the photographing apparatus 40. That is, the model creation module 2033 may analyze a moving image of the person to be measured taken by the photographing device 40, identify the height, physique, and the like of the person to be measured, and create a three-dimensional model based on the identified information.
  • the first acquisition module 2034 acquires motion information regarding the body movement of the person to be measured, which is photographed by the photographing device 40.
  • the second acquisition module 2035 acquires the sensing information sensed by the sensor 50.
  • the model operation module 2036 operates the created three-dimensional model based on the acquired motion information and sensing information.
  • the model operation module 2036 acquires the large movements of the portion where the sensor 50 is mounted, based on the motion information acquired from the photographing device 40.
  • the model operation module 2036 acquires the fine movements of the portion where the sensor 50 is mounted, based on the sensing information acquired by the sensor 50.
  • the model operation module 2036 reflects the acquired large and fine movements in the three-dimensional model and thereby operates its movement.
  • for the portion where the sensor 50 is mounted, the model operation module 2036 can reflect fine movements, in addition to large movements, in the three-dimensional model. Therefore, the model operation module 2036 can reproduce complicated movements of the human body, such as how the body is moved and how muscles are used, with the three-dimensional model.
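  • One way to read this combination of large (camera-derived) and fine (sensor-derived) movements is as a coarse position update plus a small correction integrated from the sensor data. The update rule below is an illustrative assumption, not the module's actual algorithm:

```python
# Sketch: update the 3D-model position of a sensor-mounted part by combining
# the coarse displacement seen by the camera with a fine correction integrated
# from one acceleration sample. All numbers are illustrative.

def integrate_fine_motion(accel_mm_s2, dt_s):
    """Rough displacement (mm) from one acceleration sample over dt (assumes v0 = 0)."""
    return tuple(0.5 * a * dt_s * dt_s for a in accel_mm_s2)

def update_part_position(pos, coarse_delta, accel_mm_s2, dt_s):
    """Apply the camera-derived coarse motion, then the sensor-derived fine motion."""
    fine = integrate_fine_motion(accel_mm_s2, dt_s)
    return tuple(p + c + f for p, c, f in zip(pos, coarse_delta, fine))

knee = (0.0, 450.0, 0.0)  # model-space position of the knee part, mm
knee = update_part_position(knee, (2.0, -1.0, 0.0), (100.0, 0.0, 0.0), 0.01)
```

  • A real implementation would also apply the gyro's angular velocity to joint rotations and clamp the result to the person's range of motion; this sketch shows only the coarse-plus-fine composition.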
  • the analysis module 2037 analyzes the movement of the person to be measured based on the information in the operation log of the three-dimensional model. Specifically, for example, based on movements grasped from the operation log of the three-dimensional model, such as displacement, load, velocity, and acceleration, the analysis module 2037 evaluates the range of motion of the target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance, and the like.
  • the analysis module 2037 may compare the current movement of the person to be measured with the movement at a predetermined time in the past and evaluate the effect of rehabilitation and the like. As a result, the analysis module 2037 can easily evaluate the effect of rehabilitation based on the past movements of the person to be measured, which can be expected to improve the motivation of the person to be measured for rehabilitation.
  • the analysis module 2037 may evaluate the effect of rehabilitation or the like based on the movement of a three-dimensional model of another patient suffering from the same disease. Specifically, for example, the analysis module 2037 compares the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of a patient who suffered from the same disease and for whom the treatment was highly effective. As another example, the analysis module 2037 compares the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of a patient who suffered from the same disease but received a different treatment.
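  • As an illustrative sketch of such an evaluation, the range of motion of a target site can be estimated from logged joint angles and compared between a past and a current session. The log format and the angle values are assumptions, not the publication's actual data:

```python
def range_of_motion(angles_deg):
    """Range of motion of a joint: max minus min logged angle over a session."""
    return max(angles_deg) - min(angles_deg)

def rehab_progress(past_angles, current_angles):
    """Positive result means the range of motion improved versus the past session."""
    return range_of_motion(current_angles) - range_of_motion(past_angles)

# Hypothetical knee-angle logs (degrees) from two operation-log sessions.
past = [10, 20, 35, 50, 40, 15]
now = [8, 25, 45, 70, 55, 12]
gain = rehab_progress(past, now)  # degrees of improvement
```

  • The same comparison could be run against the angle log of another patient's three-dimensional model instead of the same person's past session, which corresponds to the cross-patient evaluations described above.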
  • the analysis module 2037 may present the analysis result to the patient.
  • a terminal owned by the patient (not shown) displays the analysis result on its display. This enables the patient to understand the effect of his or her own rehabilitation.
  • the analysis module 2037 may make each person to be measured recognizable from the appearance of the three-dimensional model by pasting an image of the person taken by the photographing apparatus 40 onto the surface of the created three-dimensional model.
  • FIG. 4 is a diagram showing the data structures of the patient information database 2021, the operation log information database 2022, and the medical care information database 2023 stored in the server 20.
  • the patient information database 2021 includes an item "patient ID", an item "name", an item "age", an item "gender", an item "address", an item "height", an item "weight", and the like.
  • the item "patient ID" is information that identifies each patient who is a person to be measured.
  • the item "name" is information indicating the name of each patient. For example, the name of the patient with patient ID "U001" is "A".
  • the item "age" is information indicating the age of each patient. For example, the age of the patient with patient ID "U001" is "20".
  • the item "gender" is information indicating the gender of each patient. For example, the gender of the patient with patient ID "U001" is "male".
  • the item "address" is information indicating the address of each patient. For example, the address of the patient with patient ID "U001" is "B city, A prefecture".
  • the item "height" is information indicating the height of each patient. For example, the height of the patient with patient ID "U001" is "166 cm".
  • the item "weight" is information indicating the weight of each patient. For example, the weight of the patient with patient ID "U001" is "70 kg".
  • the operation log information database 2022 includes an item "log ID", an item "patient ID", an item "acquisition date and time", an item "position information", and the like.
  • the item "log ID" is information that identifies each piece of operation log information. Specifically, it identifies an operation log of the person to be measured that the server 20 acquired via the photographing device 40 and the sensor 50.
  • the item "patient ID" indicates the patient information associated with the operation log. For example, the patient associated with log ID "L001" is "U001".
  • the item "acquisition date and time" indicates the date and time when the server 20 acquired the operation log. For example, the acquisition date and time of log ID "L001" is "20/06/01 10:00:00".
  • the item "position information" indicates information on the positions of the three-dimensional model operated by the model operation module 2036.
  • the position information is obtained by associating the operation log, in which the model operation module 2036 operates the three-dimensional model based on the motion information acquired by the photographing device 40 and the sensing information acquired by the sensor 50, with the corresponding positions on the three-dimensional model.
  • for example, the position information of the three-dimensional model in log ID "L001" is "position information 1".
  • the method of defining the positions of the three-dimensional model may be any existing method.
  • the position information may include, as internal information, sensing information acquired by the sensor 50, for example, information on the angular velocity, acceleration, and the like of the body movement.
  • in the operation log information database 2022 shown in FIG. 4, only the data for the patient with patient ID "U001" is displayed, but in reality, data for a plurality of patients are accumulated.
  • the medical care information database 2023 includes an item "patient ID", an item "medical treatment ID", an item "medical treatment date and time", an item "doctor in charge", an item "disease", an item "medical treatment content", an item "prescription drug", an item "progress information", and the like.
  • the item "patient ID" indicates information that identifies each patient.
  • the item "medical treatment ID" indicates information that identifies each piece of medical treatment information associated with a patient ID. For example, the patient ID "U001" is associated with the medical treatment ID "T001".
  • the item "medical treatment date and time" indicates the date and time when each patient received medical treatment from a doctor.
  • the item "doctor in charge" indicates information that identifies the doctor who treated the patient. For example, the doctor in charge for medical treatment ID "T001" and medical treatment date "20/06/01" is "A".
  • the item "disease" indicates the disease diagnosed by the doctor on the date when each patient received medical treatment. For example, for medical treatment ID "T001", the disease "knee joint pain" was diagnosed by the doctor in charge "A" on the medical treatment date "20/06/01". The server 20 may store different disease information for each medical treatment date and time.
  • the item "medical treatment content" indicates the content of the medical treatment by the doctor on the date when each patient received medical treatment. For example, for the medical treatment date "20/01/01" of medical treatment ID "T001", the medical treatment content is "surgery". The server 20 may store different medical treatment content for each medical treatment date and time.
  • the item "prescription drug" indicates the drug prescribed by the doctor on the date when each patient received medical treatment.
  • the prescribed drug is not limited to a prescription drug and may be an over-the-counter drug.
  • for example, the prescription drug is an "analgesic".
  • the item "progress information" indicates the progress of treatment as of the date when each patient received medical treatment. For example, for the medical treatment date "20/01/01" of medical treatment ID "T001", the progress information is "next rehabilitation".
  • FIG. 5 is a diagram showing an outline of the system 1.
  • a sensor 50 (for example, a motion sensor) is attached to the body of the person to be measured.
  • the photographing device 40 photographs the movement of the person to be measured wearing the sensor 50.
  • the photographing device 40 stores the trained model, and acquires the motion information of the person to be measured from the captured image by using the trained model.
  • the photographing device 40 transmits the acquired operation information to the edge server 30.
  • the sensor 50 transmits the sensing information when the person to be measured operates to the edge server 30.
  • the edge server 30 transmits the received operation information and sensing information of the person to be measured to the server 20.
  • the server 20 creates a three-dimensional model of the person to be measured based on the personal information acquired from the user.
  • the server 20 operates the three-dimensional model based on the information acquired from the edge server 30.
  • the server 20 analyzes the movement information of the person to be measured based on the information of the operation log in which the three-dimensional model is operated.
  • since the server 20 analyzes the movement of the person to be measured using the three-dimensional model, the amount of data required to obtain the analysis result can be suppressed. Further, since it is sufficient to transmit only data capable of operating the three-dimensional model, communication with a reduced data volume becomes possible. Because the amount of data to be transmitted can be small, the movement of the person to be measured can be analyzed at a remote place using the three-dimensional model. That is, even when the doctor is away from the patient, the doctor can recognize the effect of rehabilitation and give instructions on the future rehabilitation policy.
  • FIG. 6 is a flowchart showing a process in which the server 20 creates a three-dimensional model based on the acquired personal information of the person to be measured.
  • the server 20 acquires the personal information of the person to be measured. Specifically, the server 20 acquires, for example, the personal information of the person to be measured transmitted from the terminal device of the user.
  • the personal information includes, for example, physical information, age, gender, etc. of the person to be measured, such as height, weight, and physique of the person to be measured.
  • the personal information may include information on muscle strength acquired by measuring muscle strength or the like.
  • the server 20 may acquire a moving image taken by the photographing device 40 as personal information of the person to be measured. For example, based on the moving image taken by the photographing device 40, the server 20 acquires data on the range of motion of the body of the person to be measured, or on external features such as arm and leg lengths.
  • the server 20 may acquire information on the body of the person to be measured as point cloud data based on the moving image taken by the photographing device 40.
  • the server 20 creates a three-dimensional model of the person to be measured based on the personal information of the person to be measured acquired from the user.
  • the server 20 creates, for example, a three-dimensional model imitating the person to be measured by adjusting a three-dimensional model provided by default based on the acquired personal information.
  • the server 20 sets the parameters of the default three-dimensional model based on the acquired personal information, such as the sex, age, height, weight, and range of motion of the body of the person to be measured, and thereby creates the three-dimensional model.
  • the server 20 may create a three-dimensional model based on the acquired point cloud data.
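  • A minimal sketch of adjusting a default three-dimensional model from personal information might scale template segment lengths by the measured height and attach range-of-motion limits; the template values and the scaling rule are assumptions, not the publication's actual parameters:

```python
# Default (template) segment lengths in mm for a 170 cm reference body.
# These reference values are illustrative assumptions.
DEFAULT_SEGMENTS = {"thigh": 420.0, "lower_leg": 400.0, "torso": 520.0}
REFERENCE_HEIGHT_CM = 170.0

def build_model(height_cm, weight_kg, range_of_motion_deg=None):
    """Create a per-person model by scaling the default segments to the
    measured height; range-of-motion limits are attached if provided."""
    scale = height_cm / REFERENCE_HEIGHT_CM
    segments = {name: length * scale for name, length in DEFAULT_SEGMENTS.items()}
    return {"segments": segments,
            "weight_kg": weight_kg,
            "rom_deg": range_of_motion_deg or {}}

# Using the example patient U001 from the patient information database.
model = build_model(166, 70, {"knee": (0, 130)})
```

  • Re-running `build_model` with updated personal information corresponds to the periodic modification of the model suggested below, when the model has drifted from the patient's actual body.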
  • the three-dimensional model may become inconsistent with the patient as time passes after it is created.
  • the server 20 may present, for example, a suggestion to modify the three-dimensional model to a user such as a doctor or a patient each time a preset period elapses.
  • the server 20 modifies the three-dimensional model based on the input personal information.
  • the server 20 may newly create a three-dimensional model based on the input personal information.
  • FIG. 7 is a flowchart showing the process in which the server 20 acquires the motion information of the person to be measured taken by the photographing device 40 and the sensing information detected by the sensor 50, and operates the three-dimensional model based on the acquired information.
  • step S701 the server 20 acquires the operation information obtained based on the image captured by the photographing device 40 and the sensing information detected by the sensor 50.
  • the photographing device 40 photographs the person to be measured so as to include the sensor 50 attached to the person to be measured.
  • the photographing device 40 acquires the motion information of the person to be measured by analyzing the photographed image using the trained model.
  • the server 20 receives the operation information acquired by the photographing device 40 via the edge server 30.
  • the server 20 receives, via the edge server 30, the sensing information detected by the sensor 50, such as the angular velocity (deg/s) and the acceleration (mm/s²).
  • the server 20 may acquire a moving image taken by the photographing device 40.
  • the server 20 performs image analysis using the trained model on the moving image taken by the photographing device 40.
  • the server 20 acquires operation information by performing image analysis on the moving image.
  • the server 20 moves the three-dimensional model based on the acquired operation information and sensing information. Specifically, for example, based on the information acquired in step S702, the server 20 moves the area of the created three-dimensional model corresponding to the portion where the sensor 50 is mounted: the corresponding area is moved coarsely based on the operation information, and its details are moved based on the sensing information, such as the angular velocity and acceleration acquired by the sensor 50. At this time, the server 20 moves the three-dimensional model while reflecting parameters such as the range of motion of the person to be measured.
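The coarse/fine division of labor above could be sketched per joint as follows. The joint-angle representation, the simple one-step refinement from angular velocity, and the clamping to the range of motion are all illustrative assumptions, not the patent's specified method:

```python
def update_joint_angle(coarse_target_deg: float,
                       angular_velocity_dps: float,
                       dt_s: float,
                       rom_deg: float) -> float:
    """Coarse placement comes from the image-based operation information;
    the detail is refined with the sensor's angular velocity over dt_s,
    then clamped to the person's range of motion (a model parameter)."""
    refined = coarse_target_deg + angular_velocity_dps * dt_s
    return max(0.0, min(rom_deg, refined))

# 50° coarse target, 30 deg/s sensed for 0.1 s, ROM 150° -> 53.0°
print(update_joint_angle(50.0, 30.0, 0.1, 150.0))  # 53.0
```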
  • the server 20 stores the movement of the three-dimensional model in the storage means.
  • the server 20 stores the acquired position information of each part of the three-dimensional model as an operation log in the operation log information database 2022.
  • the server 20 can reflect, in the three-dimensional model, detailed movements of the body that cannot be grasped from the moving image captured by the photographing device 40 alone.
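The operation log mentioned above (the position information of each part of the three-dimensional model at each time) might be stored as records like the following. The field names and the JSON encoding are assumptions for illustration; the patent only states that the position information of each part is stored as an operation log in the operation log information database 2022:

```python
import json
import time

def make_operation_log_entry(patient_id: str, part_positions: dict) -> dict:
    """One operation-log record: timestamped positions of each model part."""
    return {
        "patient_id": patient_id,
        "timestamp": time.time(),
        "parts": part_positions,  # e.g. {"right_wrist": [x, y, z], ...}
    }

entry = make_operation_log_entry("P001", {"right_wrist": [0.1, 0.8, 0.3]})
record = json.dumps(entry)  # one row of the operation log information database 2022
```

Replaying such records in timestamp order is what "reflecting the operation log in the three-dimensional model in chronological order" amounts to in this sketch.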
  • FIG. 8 is a flowchart showing a process in which the server 20 analyzes the movement of the three-dimensional model stored in the storage means and transmits the analysis result to the user's terminal device.
  • in step S801, the server 20 analyzes the movement of the three-dimensional model based on the operation log information database 2022 stored in the storage unit 202. Specifically, for example, a user such as a doctor operates an information processing terminal (not shown) such as a PC, specifies the time point at which the patient's movement should be analyzed, and inputs an instruction to perform the analysis.
  • when the server 20 receives the analysis instruction, it reads the operation log for a predetermined period including the specified time point from the operation log information database 2022.
  • the server 20 reflects the read operation log in the three-dimensional model in chronological order.
  • the server 20 evaluates the degree of the patient's movement, for example, the displacement, load, velocity, acceleration, and the like, based on the movements grasped from the motion log of the three-dimensional model.
  • in the analysis, the server 20 compares the current movement of the patient with the movement at a predetermined time in the past, and evaluates the effect of rehabilitation and the like.
  • the server 20 reads, for example, from the operation log information database 2022, the operation log for a predetermined period including the specified time point, and the operation log for a predetermined period including a time point earlier than the designated time point by a preset period.
  • the server 20 reflects the two read operation logs in the three-dimensional model of the patient in chronological order. At this time, the server 20 may set, as the past time point, the date closest to the designated time point, or an arbitrary date designated by a user such as a doctor.
  • the server 20 compares the movements grasped from the motion logs of the three-dimensional models, and evaluates changes in the range of motion of the patient's target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance movement, and the like.
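The range-of-motion comparison between a past session and the current one could be sketched like this. Representing each session as a list of observed joint angles is an illustrative assumption; the patent does not fix a representation:

```python
def range_of_motion(angle_log_deg: list) -> float:
    """Range of motion over one session: max minus min observed joint angle."""
    return max(angle_log_deg) - min(angle_log_deg)

def rom_change(past_log_deg: list, current_log_deg: list) -> float:
    """Positive result: the range of motion improved since the past session."""
    return range_of_motion(current_log_deg) - range_of_motion(past_log_deg)

# Past session spans 10°..60° (ROM 50°), current spans 5°..75° (ROM 70°).
print(rom_change([10.0, 60.0, 40.0], [5.0, 75.0, 30.0]))  # 20.0
```

Analogous per-metric deltas (muscle mass, coordination, balance) would follow the same compare-two-logs pattern.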
  • the server 20 evaluates the effect of rehabilitation and the like based on the movement of the three-dimensional model of another patient suffering from the same disease. Specifically, the server 20 extracts, for example, from the medical information database 2023, among the other patients who suffer from the same disease as the target patient and have experienced the same treatment, a patient who had a good course. The server 20 reads, for example, from the operation log information database 2022, the operation log for a predetermined period including the specified time point and the operation log for a predetermined period of the extracted patient. At this time, the server 20 reads out the operation logs for predetermined periods including the time point designated by the user and the corresponding time point for the other patient. The corresponding time point represents, for example, a time point at which the postoperative elapsed period is about the same as that recorded at the time point specified by the user.
  • the server 20 reflects the two types of operation logs read out in two different three-dimensional models in chronological order.
  • One 3D model is a 3D model created for the target patient
  • the other 3D model is a 3D model created for the other patient.
  • the server 20 compares the movements grasped from the motion logs of the three-dimensional models, and evaluates differences in the range of motion of the patient's target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance movement, and the like.
  • the server 20 extracts, for example, from the medical information database 2023, among other patients who suffer from the same disease as the target patient and have experienced different treatments, the patients who have a good course.
  • the server 20 reads, for example, from the operation log information database 2022, the operation log for a predetermined period including the specified time point and the operation log for a predetermined period of the extracted patient.
  • the server 20 reflects the read two types of operation logs in two different three-dimensional models in chronological order.
  • the server 20 compares the movements grasped from the motion logs of the three-dimensional models, and evaluates differences in the range of motion of the patient's target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance movement, and the like.
  • the analysis process by the server 20 is not limited to the one performed in response to the analysis instruction input from the user.
  • the server 20 may perform the analysis process at a predetermined timing.
  • the predetermined timing is a timing according to the progress of rehabilitation, and is, for example, as follows:
    ・the timing when rehabilitation is finished
    ・when the rehabilitation includes multiple trainings, the timing when at least one training is finished
  • the server 20 determines whether or not the predetermined timing has arrived based on, for example, the following:
    ・a preset time has been reached
    ・a notification indicating the end has been received
    ・no motion has occurred for a preset period since the last motion occurred
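The example conditions above can be combined into a simple predicate; the parameter names and the "any condition triggers analysis" combination are illustrative assumptions:

```python
def is_analysis_timing(now_s: float,
                       scheduled_at_s: float,
                       end_notified: bool,
                       last_motion_at_s: float,
                       idle_threshold_s: float) -> bool:
    """True when a preset time has been reached, an end notification has
    been received, or no motion has occurred for the preset idle period."""
    return (
        now_s >= scheduled_at_s
        or end_notified
        or (now_s - last_motion_at_s) >= idle_threshold_s
    )

# No schedule hit, no notification, but 700 s of inactivity (>= 600 s) -> analyze.
print(is_analysis_timing(1000.0, 2000.0, False, 300.0, 600.0))  # True
```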
  • when the process is executed, the server 20 reads, from the operation log information database 2022, the operation log stored immediately before the execution.
  • the server 20 reflects the read operation log in the three-dimensional model in chronological order and analyzes the patient's movement. This allows the server 20 to analyze the patient's movement during or immediately after the rehabilitation. As a result, the patient can continue the rehabilitation while confirming its effect, and the effect of rehabilitation can also be confirmed during the interval between rehabilitation sessions.
  • in step S802, the server 20 stores the analysis result in, for example, the medical information database 2023.
  • the analysis result stored in the medical information database 2023 is presented to the requesting user in response to a request from a user such as a doctor or a patient. This allows the patient or the doctor to confirm the effect of the patient's rehabilitation.
  • FIG. 9 is a diagram showing an example of a screen that presents to the user the result of comparing, in the three-dimensional model of the person to be measured, the current movement with the movement at a predetermined time in the past.
  • the result display screen 901 shows a screen on which the user's terminal device presents the result of the server 20's analysis of the movement of the three-dimensional model.
  • the user's terminal device also shows a screen that displays the result of operating the three-dimensional model based on the operation information of the person to be measured photographed by the photographing device 40 and the sensing information detected by the sensor 50, acquired by the server 20 at a predetermined time in the past.
  • the current result screen 903 shows a screen on which the user's terminal device displays the result of operating the three-dimensional model based on the motion information of the person to be measured photographed by the photographing device 40 and the sensing information detected by the sensor 50, acquired by the server 20 on the latest date on which rehabilitation was performed.
  • the analysis result display screen 904 shows a screen on which the user's terminal device displays the result of comparing the server 20's analysis of the movement of the 3D model at a predetermined time in the past with the analysis of the current movement of the 3D model.
  • the user's terminal device may display on the screen information such as how the body movement of the subject has improved or deteriorated compared with the analysis result at a predetermined time in the past, as well as information such as the future treatment policy.
  • the user's terminal device may display information referring to the medical information database 2023 stored in the server 20 on the screen.
  • the user's terminal device may display information such as medical treatment details and treatment policies input by a doctor or the like on the screen.
  • the person to be measured, a doctor, or the like can compare the analysis result of the current rehabilitation not only with the analysis result of the most recent rehabilitation but also with the analysis result at the time the rehabilitation was started. Therefore, the person to be measured, a doctor, or the like can identify the periods in which the rehabilitation effect was large or small, and can reflect this in the examination of the treatment policy and the like.
  • FIG. 10 is a diagram showing an example of a screen that presents to the user the result of comparing the movement of the 3D model of the subject with the movement of the 3D model of another patient who suffered from the same disease as the subject and experienced similar treatment.
  • the result display screen 1001, like the result display screen 901 in FIG. 9, shows a screen on which the user's terminal device presents the result of the server 20's analysis of the movement of the three-dimensional model.
  • the user's terminal device shows a screen that displays the result of operating the 3D model based on the operation information of the person to be measured photographed by the photographing device 40 and the sensing information detected by the sensor 50, acquired by the server 20 at a predetermined time point.
  • the other patient result display screen 1003 shows a screen displaying the progress of another patient who suffered from the same disease as the subject and experienced the same treatment, based on information captured by the imaging device 40 and acquired by the server 20 at a predetermined time point.
  • the comparative analysis result display screen 1004 shows a screen on which the user's terminal device displays the result of comparing the movement of the 3D model of the subject with the movement of the 3D model of the patient who, among other patients suffering from the same disease as the subject and having experienced the same treatment, had a good course.
  • by displaying on the screen the result of the comparison with the analysis results of other patients who suffered from the same disease and experienced the same treatment, the user's terminal device may present to the subject information such as changes in the range of motion, the muscle mass of the target site, and the degree of coordinated exercise, as well as recommended treatment methods.
  • the subject can refer to the results of the patients who obtained the greatest rehabilitation effect among the patients suffering from the same disease, which is expected to improve motivation for rehabilitation.
  • FIG. 11 is a diagram showing an example of a screen that presents to the user the result of comparing the movement of the 3D model of the subject with the movement of the 3D model of a patient who suffered from the same disease as the subject, experienced a different treatment, and had a good course.
  • the result display screen 1101, like the result display screen 1001 in FIG. 10, shows a screen on which the user's terminal device presents the result of analyzing the movement of the three-dimensional model.
  • the measured person result display screen 1102 shows a screen on which the user's terminal device displays the result of operating the three-dimensional model based on the operation information of the person to be measured photographed by the photographing device 40 and the sensing information detected by the sensor 50, acquired at a predetermined time point.
  • the result display screen 1103 for another treatment shows a screen displaying the progress of another patient who suffered from the same disease as the subject and experienced a different treatment, based on information captured by the imaging device 40 and acquired by the server 20 at a predetermined time point.
  • the comparative analysis result display screen 1104 shows a screen on which the user's terminal device displays the result of comparing the server 20's analysis of the three-dimensional model of the subject with its analysis of the movement of the three-dimensional model of the patient who, among other patients suffering from the same disease as the subject and having experienced different treatments, had a good course.
  • by displaying on the screen the result of the comparison with the results of other patients who suffered from the same disease and experienced different treatments, the user's terminal device may present to the subject information such as changes in the range of motion, the muscle mass of the target site, and the degree of coordinated movement, as well as recommended treatment methods.
  • a program for causing a computer including the processor 29 and the memory 25 to execute, by the processor 29, a step (S601) of acquiring the personal information of the person to be measured and a step (S602) of creating a three-dimensional model based on the acquired personal information.
  • in the step (S701) of acquiring motion information, the motion information is acquired using a trained model that has been trained to take a captured moving image of the body as input and to output information about the motion of the body.
  • (Appendix 5) The program according to Appendix 4, wherein, in the analysis step (S801), the effect of rehabilitation is evaluated by comparing the movements stored at a plurality of time points in different periods.
  • (Appendix 8) The program according to Appendix 6, wherein, in the analysis step (S801), the other person to be measured, who is different from the person to be measured, is a person who has been diagnosed with the same disease as the person to be measured and has undergone a different treatment.
  • (Appendix 11) A method executed by a computer including the processor 29 and the memory 25, wherein the processor 29 executes a step (S601) of acquiring the personal information of the person to be measured, a step (S602) of creating a three-dimensional model based on the acquired personal information, a step (S701) of acquiring the motion information of the person to be measured obtained by analyzing a photographed image of the person to be measured having the motion sensor 50 attached to a part of the body, and a step of acquiring the sensing information of the motion sensor 50.
  • an information processing device in which the control unit 203 executes a step (S601) of acquiring the personal information of the person to be measured, a step (S602) of creating a three-dimensional model based on the acquired personal information, a step (S701) of acquiring the motion information of the person to be measured obtained by analyzing a photographed image of the person to be measured having the motion sensor 50 attached to a part of the body, a step (S701) of acquiring the sensing information of the motion sensor 50, a step (S702) of moving the three-dimensional model based on the motion information and the sensing information, and a step (S801) of analyzing the motion of the three-dimensional model.

Abstract

[Problem] To use an image capturing device and a sensor, for example, to analyze the motions of a person with a reduced processing load. [Solution] This program, to be executed by a computer provided with a processor and a memory, causes the processor to execute: a step of acquiring personal information relating to a person being measured; a step of creating a three-dimensional model on the basis of the acquired personal information; a step of acquiring motion information relating to the person being measured, obtained by analyzing a captured image of the person being measured, on a part of the body of whom a motion sensor is being worn; a step of acquiring sensing information from the motion sensor; a step of moving the three-dimensional model on the basis of the motion information and the sensing information; and a step of analyzing the movement of the three-dimensional model.

Description

Program, method, information processing device, and system
 The present disclosure relates to a program, a method, an information processing device, and a system.
 A technique is known in which the movement of a person is captured with a camera, a sensor, and the like, and the captured motion is analyzed. Patent Document 1 describes a motion evaluation device that displays a composite image and motion data in association with each other: the composite image is generated from a plurality of continuously captured images of a moving subject so that changes in the subject's movement can be identified in time series, and the motion data, output from a motion sensor attached to the subject, is displayed in a time series corresponding to the subject's movement.
Japanese Unexamined Patent Publication No. 2018-008015
 However, with the technique of Patent Document 1, analyzing the movements of a plurality of people requires storing moving images, sensing data, and the like for all of them in the device, so large-capacity storage must be prepared. Further, when analyzing a different movement of the same person, the same processing must be executed again, which increases the processing load.
 Therefore, an object of the present disclosure is to analyze a person's movement with a reduced processing load by using a photographing device, a sensor, and the like.
 According to one embodiment, there is provided a program to be executed by a computer including a processor and a memory, the program causing the processor to execute: a step of acquiring the personal information of the person to be measured; a step of creating a three-dimensional model based on the acquired personal information; a step of acquiring the motion information of the person to be measured, obtained by analyzing a photographed image of the person to be measured with a motion sensor attached to a part of the body; a step of acquiring the sensing information of the motion sensor; a step of moving the three-dimensional model based on the motion information and the sensing information; a step of storing the movement of the three-dimensional model in a storage means; and a step of analyzing the stored movement.
 According to the present disclosure, a person's movement can be analyzed with a reduced processing load by using a photographing device, a sensor, and the like.
FIG. 1 is a diagram showing the overall configuration of the system 1.
FIG. 2 is a diagram showing an example of sensing the body of the person to be measured in the system 1.
FIG. 3 is a diagram showing the functional configuration of the server 20.
FIG. 4 is a diagram showing the data structures of the patient information database 2021, the operation log information database 2022, and the medical information database 2023 stored by the server 20.
FIG. 5 is a diagram showing an overview of the devices constituting the system 1.
FIG. 6 is a flowchart showing a process in which the server 20 creates a three-dimensional model based on the acquired personal information of the person to be measured.
FIG. 7 is a flowchart showing a process in which the server 20 acquires the motion information of the person to be measured and the sensing information of the motion sensor, and moves the three-dimensional model based on the acquired information.
FIG. 8 is a flowchart showing a process in which the server 20 stores the movement of the three-dimensional model in the storage means, analyzes the stored movement, and transmits the analysis result to the user's terminal device.
FIG. 9 is a diagram showing an example of a screen that presents to the user the result of comparing the current movement of the three-dimensional model of the person to be measured with the movement at a predetermined time in the past.
FIG. 10 is a diagram showing an example of a screen that presents to the user the result of comparing the movement of the 3D model of the person to be measured with the movement of the 3D model of another patient who suffered from the same disease and experienced the same treatment.
FIG. 11 is a diagram showing an example of a screen that presents to the user the result of comparing the movement of the 3D model of the person to be measured with the movement of the 3D model of a patient who, among other patients suffering from the same disease and having experienced different treatments, had a good course.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, the same parts are designated by the same reference numerals. Their names and functions are also the same. Therefore, detailed description of them will not be repeated.
 <Overview>
 In the following embodiments, a system 1 for analyzing a person's movement with a reduced processing load by using a photographing device, a sensor, and the like will be described.
 The system 1 creates a three-dimensional model based on the personal information of the person to be measured. In the present embodiment, the personal information includes, for example, information on the body of the person to be measured, such as height, weight, and physique. The system 1 moves the created three-dimensional model based on the sensing information sensed by a sensor attached to a part of the body of the person to be measured and the motion information obtained by photographing the person with a photographing device. The system 1 analyzes the operation log produced when the three-dimensional model is moved. As a result, the system 1 analyzes the movement of the person to be measured with a small processing load and a small communication load.
 The system 1 can be installed, for example, in a medical facility such as a hospital. For example, a doctor may introduce the system 1 in order to efficiently manage a patient's rehabilitation effect, future treatment policy, and the like. Further, in a sports facility such as a fitness gym, the system 1 may be used to provide users with services such as measuring training effects and coaching correct body movements.
 <1 Configuration diagram of the entire system>
 FIG. 1 is a diagram showing the overall configuration of the system 1.
 As shown in FIG. 1, the system 1 includes a server 20, an edge server 30, a photographing device 40, and a sensor 50. The server 20 and the edge server 30 communicate with each other via a network 80. The edge server 30 is connected to the photographing device 40 and the sensor 50. For example, the photographing device 40 and the sensor 50 are transmission/reception devices based on a communication standard used in short-range communication systems between information devices. Specifically, the photographing device 40 and the sensor 50 use, for example, the 2.4 GHz band of a Bluetooth (registered trademark) module to receive beacon signals from other information devices equipped with a Bluetooth (registered trademark) module. The edge server 30 acquires the information transmitted from the photographing device 40 and the sensor 50 based on the beacon signals using this short-range communication. In this way, the photographing device 40 and the sensor 50 transmit the acquired information on the movement of the person to be measured to the edge server 30 by short-range communication without going through the network 80. The edge server 30 may also communicate with the photographing device 40 and the sensor 50 via the network 80.
 The server 20 manages the information of each patient. The patient information includes, for example, the patient's personal information, information on the operation log of the three-dimensional model, medical information, and the like.
 The server 20 creates a three-dimensional model based on the personal information of the person to be measured acquired from the user. The server 20 moves the created three-dimensional model based on the information acquired from the edge server 30 and analyzes the movement. The server 20 shown in FIG. 1 has a communication IF 22, an input/output IF 23, a memory 25, a storage 26, and a processor 29.
 The communication IF 22 is an interface for inputting and outputting signals so that the server 20 can communicate with external devices. The input/output IF 23 functions as an interface with an input device for receiving input operations from the user and an output device for presenting information to the user. The memory 25 temporarily stores programs and the data processed by them, and is, for example, a volatile memory such as a DRAM (Dynamic Random Access Memory). The storage 26 is a storage device for saving data, for example, a flash memory or an HDD (Hard Disc Drive). The processor 29 is hardware for executing the instruction set described in a program, and is composed of an arithmetic unit, registers, peripheral circuits, and the like.
 In the present embodiment, each device (terminal device, server, etc.) can also be regarded as an information processing device. That is, an aggregate of devices can be regarded as one "information processing device", and the system 1 may be formed as an aggregate of a plurality of devices. How to allocate, to one or more pieces of hardware, the plurality of functions required to realize the system 1 according to the present embodiment can be determined as appropriate in view of the processing capacity of each piece of hardware and/or the specifications required of the system 1.
 The edge server 30 receives the information transmitted from the photographing device 40 and the sensor 50, and transmits the received information to the server 20. The edge server 30 also transmits information acquired from the server 20 to the photographing device 40 and the sensor 50. The information acquired from the server 20 includes, for example, information for updating the settings of the photographing device 40 or the sensor 50.
 The photographing device 40 is a device that receives light with a light-receiving element and outputs it as a captured image. The photographing device 40 is assumed to be, for example, any of the following devices.
 ・Visible light camera
 ・Infrared camera
 ・Ultraviolet camera
 ・Ultrasonic sensor
 ・RGB-D camera
 ・LiDAR (Light Detection and Ranging)
 In one aspect, the photographing device 40 may photograph the body movements of the person to be measured and extract motion information from the captured moving image by analyzing it. For example, the photographing device 40 uses a trained model to extract motion information from a moving image that constantly captures the person to be measured, and transmits the motion information to the edge server 30.
 The trained model may be created, for example, as follows. A moving image of the movements of the patient, who is the person to be measured, is captured by the photographing device 40, and a doctor or the like then analyzes the acquired moving image to identify the motion information it contains. For example, the doctor creates the learning data by associating information such as a vector representing the motion with the movement, in the captured moving image, of the part on which the sensor 50 is worn. The trained model is then created by having a machine learning model perform machine learning on this learning data according to a model learning program.
 Note that the photographing device 40 may, for example, photograph the person to be measured so that the image includes the sensor 50 worn on the person's body, and acquire the motion information of the person by using the sensor 50 in the image as a marker. The photographing device 40 may also transmit the moving image of the person to be measured to the edge server 30.
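As a rough illustration of the learning-data creation described above, a short clip of the sensor-equipped body part might be paired with the motion vector a doctor assigned to it. The record layout, field names, and example values below are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical sketch: pair a clip of the part wearing the sensor 50 with
# the motion vector a doctor associated with it (assumed record layout).
def make_training_pair(clip_frames, motion_vector_deg_s):
    """Associate a clip (a list of frame indices) with its labeled motion vector."""
    return {"frames": list(clip_frames), "label": tuple(motion_vector_deg_s)}

# Example: two labeled clips (assumed flexion and extension labels).
training_data = [
    make_training_pair(range(0, 30), (0.0, 12.5, 0.0)),
    make_training_pair(range(30, 60), (0.0, -12.5, 0.0)),
]
```

Records of this shape could then be fed to whatever model learning program is used.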
 In one aspect, the image analysis may be performed by the edge server 30. In this case, the trained model is stored in, for example, the edge server 30. The edge server 30 acquires the moving image captured by the photographing device 40, performs image analysis on it using the trained model to acquire the motion information, and transmits the acquired motion information to the server 20.
 The sensor 50 is a motion sensor that is attached to a part of the body of the person to be measured and detects information about the movement of that part of the body. For example, the sensor 50 detects information such as the inclination of the body of the person to be measured, the speed of a movement, and the direction of a movement. The sensor 50 is realized by, for example, a gyro sensor, an acceleration sensor, or the like, and transmits the various kinds of sensing information it detects to the edge server 30.
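One hypothetical shape for the sensing information the sensor 50 sends to the edge server 30 is sketched below; the field names, units, and JSON encoding are assumptions based only on the quantities named in the text.

```python
# Assumed serialization of one sensing sample (gyro + accelerometer).
import json

def sensing_message(sensor_id, angular_velocity_deg_s, acceleration_mm_s2, timestamp):
    """Serialize one sensing sample of the sensor 50 for transmission."""
    return json.dumps({
        "sensor_id": sensor_id,
        "angular_velocity_deg_s": angular_velocity_deg_s,  # gyro sensor
        "acceleration_mm_s2": acceleration_mm_s2,          # acceleration sensor
        "timestamp": timestamp,
    })

# Example sample from a knee-mounted sensor (made-up values).
msg = sensing_message("S-knee-01", [0.0, 5.2, 0.0], [0.0, 0.0, -9806.65],
                      "20/06/01 10:00:00")
```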
 FIG. 2 is a schematic diagram showing an installation example of the photographing device 40 and the sensor 50.
 In FIG. 2, the sensors 50 are attached to a part of the body of the person to be measured. In the example shown in FIG. 2, a plurality of sensors 50 are attached around the knee. The attachment site is, for example, any one of the medial thigh, anterior thigh, lateral thigh, anterior knee, medial lower leg, anterior lower leg, lateral lower leg, and the like. The photographing device 40 is installed at a position in a predetermined facility from which the person to be measured can be photographed. For example, the photographing device 40 is desirably installed in a high place where its view of the person to be measured is unlikely to be blocked.
 <1.1 Configuration of server 20>
 FIG. 3 is a diagram showing the functional configuration of the server 20. As shown in FIG. 3, the server 20 functions as a communication unit 201, a storage unit 202, and a control unit 203.
 The communication unit 201 performs processing for the server 20 to communicate with external devices such as the edge server 30.
 The storage unit 202 stores the data and programs used by the server 20. The storage unit 202 stores a patient information database 2021, an operation log information database 2022, a medical care information database 2023, and the like.
 The patient information database 2021 holds information on each patient sensed by the system 1. Details will be described later.
 The operation log information database 2022 holds information on the operation logs produced when the server 20 moves the three-dimensional model based on the information acquired from the sensor 50. Details will be described later.
 The medical care information database 2023 holds information on the medical care results of each patient. Details will be described later.
 The control unit 203 exerts the functions shown as various modules when the processor of the server 20 performs processing according to a program.
 The reception control module 2031 controls the process by which the server 20 receives signals from external devices according to a communication protocol.
 The transmission control module 2032 controls the process by which the server 20 transmits signals to external devices according to a communication protocol.
 The model creation module 2033 creates a three-dimensional model based on the personal information of the person to be measured that is input by the user. The three-dimensional model is provided such that, for example, each of its parts can be moved freely. In one aspect, the model creation module 2033 may create the three-dimensional model based on images of the person to be measured captured by the photographing device 40. That is, the model creation module 2033 may analyze the moving image of the person to be measured captured by the photographing device 40, identify the height, build, and the like of the person, and create the three-dimensional model based on the identified information.
 The first acquisition module 2034 acquires the motion information on the body movements of the person to be measured captured by the photographing device 40.
 The second acquisition module 2035 acquires the sensing information sensed by the sensor 50.
 The model operation module 2036 operates the created three-dimensional model based on the acquired motion information and sensing information.
 Specifically, for example, the model operation module 2036 obtains the large movements of the part on which the sensor 50 is worn from the motion information acquired from the photographing device 40, and obtains the fine movements of that part from the sensing information acquired by the sensor 50. The model operation module 2036 reflects both the large movements and the fine movements in the three-dimensional model and thereby operates the movement of the model. This allows the model operation module 2036 to reflect fine movements, in addition to large movements, of the part on which the sensor 50 is worn in the three-dimensional model, and thus to imitate complex movements of the human body, such as how the body is moved and how the muscles are used, with the three-dimensional model.
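A minimal sketch of this coarse-plus-fine combination, under the assumption that the camera supplies a low-rate position for the sensor-equipped part and the sensor supplies small high-rate corrections; the additive blend and all names are illustrative, not the disclosed implementation.

```python
# Assumed blend: start from the camera-derived (coarse) position of the
# part wearing the sensor 50, then apply the fine sensor-derived offsets.
def blend_motion(coarse_position, fine_offsets):
    """Return the trajectory of one model part: the coarse position from
    image analysis, refined step by step by fine sensor offsets."""
    positions = []
    x, y, z = coarse_position
    for dx, dy, dz in fine_offsets:
        x, y, z = x + dx, y + dy, z + dz
        positions.append((x, y, z))
    return positions

# Example: a knee position from the photographing device 40, refined by
# three consecutive samples from the sensor 50 (values are made up).
trajectory = blend_motion((0.0, 0.5, 0.0),
                          [(0.01, 0.0, 0.0), (0.01, -0.02, 0.0), (0.0, -0.02, 0.0)])
```

Each entry of `trajectory` would then drive the corresponding part of the three-dimensional model.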
 The analysis module 2037 analyzes the movements of the person to be measured based on the information in the operation log of the three-dimensional model. Specifically, for example, based on the movements grasped from the operation log of the three-dimensional model, such as displacement, load, velocity, and acceleration, the analysis module 2037 evaluates the range of motion of the target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance movement, and the like.
 In one aspect, the analysis module 2037 may compare the current movements of the person to be measured with the movements at a predetermined time in the past and evaluate, for example, the effect of rehabilitation. This allows the analysis module 2037 to evaluate the effect of rehabilitation in an easily understandable way, with the past movements of the person to be measured as the reference, which can be expected to improve the person's motivation for rehabilitation.
 In one aspect, the analysis module 2037 may evaluate the effect of rehabilitation or the like based on the movements of the three-dimensional models of other patients suffering from the same disease. Specifically, for example, the analysis module 2037 compares the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of a patient, among those suffering from the same disease, for whom the treatment was highly effective. As another example, the analysis module 2037 compares the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of a patient who suffered from the same disease but received a different treatment.
 This allows patients, doctors, and the like to examine the degree of the rehabilitation effect, effective forms of rehabilitation, how to proceed with treatment, and so on by comparing results among different patients suffering from the same disease. In addition, patients and doctors can grasp the difference in effect between different treatment methods among different patients with the same disease, the suitability of each treatment method for each patient, and the like. Therefore, patients and doctors can consider a treatment policy while taking into account differences in therapeutic effect based on, for example, differences in the patients' constitutions.
 In one aspect, the analysis module 2037 may present the analysis result to the patient. In this case, for example, a terminal owned by the patient (not shown) displays the analysis result on its display. This enables the patient to understand the effect of his or her rehabilitation. Further, when a comparison result with other patients is displayed, the patient can recognize the differences from the other patients and can therefore practice rehabilitation more effectively. The analysis module 2037 may also paste images of the person to be measured captured by the photographing device 40 onto the surface of the created three-dimensional model so that each person to be measured can be recognized from the appearance of the model.
 <2 Data structure>
 FIG. 4 is a diagram showing the data structures of the patient information database 2021, the operation log information database 2022, and the medical care information database 2023 stored in the server 20.
 As shown in FIG. 4, the patient information database 2021 includes an item "patient ID", an item "name", an item "age", an item "gender", an item "address", an item "height", an item "weight", and the like.
 The item "patient ID" is information identifying each patient who is a person to be measured.
 The item "name" is information indicating the name of each patient. For example, the name of the patient with patient ID "U001" is "A".
 The item "age" is information indicating the age of each patient. For example, the age of the patient with patient ID "U001" is "20".
 The item "gender" is information indicating the gender of each patient. For example, the gender of the patient with patient ID "U001" is "male".
 The item "address" is information indicating the address of each patient. For example, the address of the patient with patient ID "U001" is "B city, A prefecture".
 The item "height" is information indicating the height of each patient. For example, the height of the patient with patient ID "U001" is "166 cm".
 The item "weight" is information indicating the weight of each patient. For example, the weight of the patient with patient ID "U001" is "70 kg".
 As shown in FIG. 4, the operation log information database 2022 includes an item "log ID", an item "patient ID", an item "acquisition date and time", an item "position information", and the like.
 The item "log ID" is information identifying each piece of operation log information. Specifically, it identifies each operation log of the person to be measured that the server 20 has acquired from the photographing device 40 and the sensor 50.
 The item "patient ID" indicates the patient associated with the operation log. For example, the patient associated with log ID "L001" is "U001".
 The item "acquisition date and time" indicates the date and time when the server 20 acquired the operation log. For example, the acquisition date and time of log ID "L001" is "20/06/01 10:00:00".
 The item "position information" indicates information on the positions of the three-dimensional model operated by the model operation module 2036. Specifically, the position information is the operation log produced when the model operation module 2036 operates the three-dimensional model based on the motion information acquired by the photographing device 40 and the sensing information acquired by the sensor 50, and is acquired in association with the corresponding positions on the three-dimensional model. For example, the position information of the three-dimensional model for log ID "L001" is "position information 1". Any existing method may be used to define the positions of the three-dimensional model. The position information may also contain, as internal information, the sensing information acquired by the sensor 50, for example, information on the angular velocity, acceleration, and the like of the body movement.
 Note that the operation log information database 2022 shown in FIG. 4 displays only the data for the patient with patient ID "U001", but in practice, data for a plurality of patients are accumulated.
 In FIG. 4, the medical care information database 2023 includes an item "patient ID", an item "medical treatment ID", an item "medical treatment date and time", an item "doctor in charge", an item "disease", an item "medical treatment content", an item "prescription drug", an item "progress information", and the like.
 The item "patient ID" is information identifying each patient.
 The item "medical treatment ID" is information, linked to the patient ID, identifying each piece of medical care information. For example, patient ID "U001" is linked to medical treatment ID "T001".
 The item "medical treatment date and time" indicates the date and time when each patient received medical treatment from a doctor.
 The item "doctor in charge" is information identifying the doctor who treated the patient. For example, for medical treatment ID "T001" and medical treatment date and time "20/06/01", the doctor in charge is "A".
 The item "disease" indicates information on the disease diagnosed by the doctor at the date and time when the patient received medical treatment. For example, for medical treatment ID "T001", the patient was diagnosed with the disease "knee joint pain" by the doctor in charge "A" on the medical treatment date and time "20/06/01". The server 20 may store different disease information for each medical treatment date and time.
 The item "medical treatment content" indicates information on the content of the medical treatment given by the doctor at each medical treatment date and time. For example, for medical treatment ID "T001" and medical treatment date and time "20/05/01", the medical treatment content is "surgery". The server 20 may store different medical treatment content for each medical treatment date and time.
 The item "prescription drug" indicates information on the drug prescribed by the doctor at each medical treatment date and time. The prescribed drug is not limited to prescription pharmaceuticals and may be an over-the-counter drug. For example, for medical treatment ID "T001" and medical treatment date and time "20/05/01", the prescription drug is an "analgesic".
 The item "progress information" indicates information on the progress of treatment at each medical treatment date and time. For example, for medical treatment ID "T001" and medical treatment date and time "20/05/01", the progress information is "next rehabilitation".
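The three databases described above can be pictured as relational tables. The sketch below uses SQLite with assumed table and column names that merely mirror the items in FIG. 4; the concrete schema is not part of the disclosure.

```python
# Assumed relational layout of the patient information database 2021, the
# operation log information database 2022, and the medical care information
# database 2023. Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient_info (          -- patient information database 2021
    patient_id TEXT PRIMARY KEY,
    name TEXT, age INTEGER, gender TEXT,
    address TEXT, height_cm REAL, weight_kg REAL
);
CREATE TABLE operation_log (         -- operation log information database 2022
    log_id TEXT PRIMARY KEY,
    patient_id TEXT REFERENCES patient_info(patient_id),
    acquired_at TEXT,
    position_info TEXT
);
CREATE TABLE medical_info (          -- medical care information database 2023
    medical_id TEXT PRIMARY KEY,
    patient_id TEXT REFERENCES patient_info(patient_id),
    treated_at TEXT, doctor TEXT, disease TEXT,
    treatment TEXT, prescription TEXT, progress TEXT
);
""")

# Example rows mirroring the values cited above.
conn.execute("INSERT INTO patient_info VALUES ('U001','A',20,'M','B city, A pref.',166,70)")
conn.execute("INSERT INTO operation_log VALUES "
             "('L001','U001','20/06/01 10:00:00','position information 1')")
row = conn.execute("SELECT patient_id FROM operation_log WHERE log_id='L001'").fetchone()
```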
 <3 Summary>
 FIG. 5 is a diagram showing an outline of the system 1. In the example shown in FIG. 5, a sensor 50 (for example, a motion sensor) is attached to a part of the body of the patient who is the person to be measured.
 The photographing device 40 photographs the movements of the person to be measured wearing the sensor 50. The photographing device 40 stores the trained model and uses it to acquire the motion information of the person to be measured from the captured images. The photographing device 40 transmits the acquired motion information to the edge server 30.
 The sensor 50 transmits the sensing information produced when the person to be measured moves to the edge server 30.
 The edge server 30 transmits the received motion information and sensing information of the person to be measured to the server 20.
 The server 20 creates a three-dimensional model of the person to be measured based on the personal information acquired from the user, operates the three-dimensional model based on the information acquired from the edge server 30, and analyzes the movements of the person to be measured based on the operation log of the three-dimensional model.
 Because the server 20 analyzes the movements of the person to be measured using the three-dimensional model, the amount of data required to obtain an analysis result can be kept small. Since it suffices to transmit data capable of operating the three-dimensional model, communication with a reduced data volume becomes possible. Furthermore, since only a small amount of data needs to be transmitted, the movements of a person to be measured at a remote location can be analyzed using the three-dimensional model. That is, even when a doctor is away from the patient, the doctor can recognize the effect of rehabilitation and give guidance on a future rehabilitation policy and the like.
 <4 Operation>
 Hereinafter, a series of processes performed when the server 20 senses the body movements of the person to be measured will be described.
 <4.1 Creation of 3D model>
 FIG. 6 is a flowchart showing the process in which the server 20 creates a three-dimensional model based on the acquired personal information of the person to be measured.
 In step S601, the server 20 acquires the personal information of the person to be measured. Specifically, the server 20 acquires, for example, the personal information of the person to be measured transmitted from the user's terminal device. The personal information includes, for example, physical information such as the height, weight, and build of the person to be measured, as well as the person's age and gender. The personal information may also include information on muscle strength acquired by, for example, muscle strength measurement. The server 20 may acquire a moving image captured by the photographing device 40 as personal information of the person to be measured. For example, the server 20 acquires, based on the moving image captured by the photographing device 40, data on the range of motion of the body of the person to be measured or on external features such as the lengths of the arms and legs. Alternatively, the server 20 may acquire information on the body of the person to be measured as point cloud data based on the moving image captured by the photographing device 40.
 In step S602, the server 20 creates a three-dimensional model of the person to be measured based on the personal information acquired from the user. For example, the server 20 creates a three-dimensional model imitating the person to be measured by adjusting a default three-dimensional model based on the acquired personal information. Specifically, the server 20 creates the three-dimensional model by adjusting the parameters of the default three-dimensional model based on information acquired as personal information, such as the gender, age, height, weight, and range of motion of the person to be measured. When the personal information is acquired as point cloud data, the server 20 may create the three-dimensional model based on the acquired point cloud data.
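One way to picture the adjustment of a default model in step S602 is to scale assumed default segment lengths by the person's height and attach the reported range of motion. The default figure, the segment table, and the scaling rule below are all assumptions for illustration.

```python
# Assumed default skeleton for a 170 cm figure; segment lengths in cm.
DEFAULT_HEIGHT_CM = 170.0
DEFAULT_SEGMENTS_CM = {"thigh": 42.0, "lower_leg": 40.0, "upper_arm": 30.0}

def build_model(height_cm, range_of_motion_deg=None):
    """Scale the default segments to the measured person's height and
    attach per-joint range-of-motion limits (lo, hi) when available."""
    scale = height_cm / DEFAULT_HEIGHT_CM
    model = {part: length * scale for part, length in DEFAULT_SEGMENTS_CM.items()}
    model["range_of_motion_deg"] = dict(range_of_motion_deg or {})
    return model

# Example: the 166 cm patient "U001" with an assumed knee range of motion.
model_u001 = build_model(166.0, {"knee": (0.0, 130.0)})
```

A point-cloud-based variant would derive the segment lengths from the measured data instead of scaling defaults.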
 As time passes after the three-dimensional model is created, discrepancies between the model and the patient may arise. The server 20 may therefore, for example, present a proposal to revise the three-dimensional model to a user such as a doctor or the patient each time a preset period elapses. When the patient's personal information is input in response to the proposal, the server 20 revises the three-dimensional model based on the input personal information. Alternatively, the server 20 may newly create a three-dimensional model based on the input personal information.
 <4.2 Operation of 3D model>
 FIG. 7 is a flowchart showing the process in which the server 20 acquires the motion information of the person to be measured captured by the photographing device 40 and the sensing information detected by the sensor 50, and operates the three-dimensional model based on the acquired information.
 In step S701, the server 20 acquires the motion information obtained from the images captured by the photographing device 40 and the sensing information detected by the sensor 50. Specifically, for example, the photographing device 40 photographs the person to be measured so that the images include the sensor 50 worn on the person's body. The photographing device 40 acquires the motion information of the person to be measured by analyzing the captured images using the trained model. The server 20 receives the motion information acquired by the photographing device 40 via the edge server 30.
 The server 20 also receives, via the edge server 30, the sensing information on the angular velocity (deg/s), acceleration (mm/s²), and the like detected by the sensor 50.
 In one aspect, the server 20 may acquire the moving image captured by the photographing device 40. For example, the server 20 performs image analysis using the trained model on the moving image captured by the photographing device 40, and acquires the motion information through this image analysis.
 In step S702, the server 20 moves the three-dimensional model based on the acquired motion information and sensing information. Specifically, for example, the server 20 moves the region of the created three-dimensional model corresponding to the part on which the sensor 50 is worn, based on the information acquired in step S701. For example, the server 20 moves the corresponding region of the three-dimensional model based on the motion information, and moves the details of the corresponding region based on the sensing information, such as the angular velocity and acceleration, acquired by the sensor 50. At this time, the server 20 moves the three-dimensional model while reflecting parameters such as the range of motion of the person to be measured.
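A sketch of how the range-of-motion parameter might constrain the model update in step S702; the additive combination of the camera and sensor contributions and the clamping rule are assumptions, not the disclosed method.

```python
# Assumed update of one joint angle of the three-dimensional model: a
# coarse contribution from the motion information (camera) plus a fine
# contribution from the sensing information (sensor), clamped to the
# person's range of motion.
def apply_joint_update(current_deg, coarse_delta_deg, fine_delta_deg, rom_deg):
    """Return the new joint angle, limited to the (lo, hi) range of motion."""
    lo, hi = rom_deg
    updated = current_deg + coarse_delta_deg + fine_delta_deg
    return max(lo, min(hi, updated))

# Example: knee at 120 deg; the camera sees +15 deg, the sensor adds +2 deg,
# but the knee's range of motion is 0-130 deg, so the model stops at 130 deg.
knee_deg = apply_joint_update(120.0, 15.0, 2.0, (0.0, 130.0))
```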
 In step S703, the server 20 stores the movement of the three-dimensional model in storage means. For example, the server 20 stores the acquired position information of each part of the three-dimensional model in the operation log information database 2022 as an operation log.
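A sketch of what such an operation-log entry might look like; the publication only states that position information of each part is stored per log entry, so the schema below is an assumption:

```python
import json

# Stand-in for the operation log information database 2022.
operation_log = []

def store_motion(timestamp, part_positions):
    """Append one operation-log entry (timestamp plus part -> position)."""
    operation_log.append({"t": timestamp, "parts": part_positions})

store_motion(0.0, {"right_wrist": [0.0, 1.0, 0.0]})
store_motion(0.1, {"right_wrist": [0.01, 1.0, 0.0]})
serialized = json.dumps(operation_log)  # e.g. for persistence or transfer
```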
 By using the photographing device 40 and the sensor 50 in combination, the server 20 can reflect in the three-dimensional model fine bodily movements that cannot be determined from the moving images captured by the photographing device 40 alone.
 <4.3 Analysis of the operation log>
 FIG. 8 is a flowchart showing the process in which the server 20 analyzes the movement of the three-dimensional model stored in the storage means and transmits the analysis result to the user's terminal device.
 In step S801, the server 20 analyzes the movement of the three-dimensional model based on the operation log information database 2022 stored in the storage unit 202. Specifically, for example, a user such as a doctor operates an information processing terminal (not shown) such as a PC, specifies the point in time at which the patient's movement is to be analyzed, and inputs an instruction to perform the analysis.
 Upon receiving the analysis instruction, the server 20 reads, from the operation log information database 2022, the operation log for a predetermined period including the specified point in time. The server 20 applies the read operation log to the three-dimensional model in chronological order. Based on the movement grasped from the operation log of the three-dimensional model, for example displacement, load, velocity, and acceleration, the server 20 evaluates the patient's range of motion at the target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance, and the like.
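As an illustrative sketch of this evaluation (the publication does not specify the evaluation criteria), observed range of motion and peak velocity can be derived from a chronological log of joint angles:

```python
# Derive simple metrics from one log window: samples is a chronological
# list of (time_s, angle_deg) pairs for one joint. The metric choice is
# an assumption for illustration.
def evaluate_window(samples):
    angles = [a for _, a in samples]
    range_of_motion = max(angles) - min(angles)
    velocities = [
        (a1 - a0) / (t1 - t0)
        for (t0, a0), (t1, a1) in zip(samples, samples[1:])
    ]
    peak_velocity = max(abs(v) for v in velocities)
    return {"range_of_motion_deg": range_of_motion,
            "peak_velocity_deg_s": peak_velocity}

metrics = evaluate_window([(0.0, 10.0), (0.5, 40.0), (1.0, 30.0)])
```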
 In one aspect, in the analysis the server 20 compares the patient's current movement with the movement at a predetermined point in the past, and evaluates the effect of rehabilitation and the like. Specifically, the server 20 reads, for example from the operation log information database 2022, the operation log for a predetermined period including the specified point in time, and the operation log for a predetermined period including a point in time a preset interval before the specified point. The server 20 applies the two read operation logs, in chronological order, to the three-dimensional model of the patient. As the past point in time, the server 20 may set the date closest to the specified point, or may set an arbitrary date designated by a user such as a doctor.
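A minimal sketch of selecting the two log windows being compared: one around the specified time and one around a point a preset interval earlier. Window and offset values are arbitrary examples, not values from the publication:

```python
# log: chronological list of (time_s, entry) pairs.
def select_windows(log, specified_t, window_s, offset_s):
    """Return (current_window, past_window) around the specified time."""
    def window(center):
        lo, hi = center - window_s / 2, center + window_s / 2
        return [e for e in log if lo <= e[0] <= hi]
    return window(specified_t), window(specified_t - offset_s)

log = [(t, f"entry{t}") for t in (0, 1, 2, 10, 11, 12)]
current, past = select_windows(log, specified_t=11, window_s=2, offset_s=10)
```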
 The server 20 compares the movements grasped from the operation logs of the respective three-dimensional models, and evaluates changes in the patient's range of motion at the target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance, and the like.
 In one aspect, the server 20 evaluates the effect of rehabilitation and the like based on the movement of the three-dimensional model of another patient suffering from the same disease. Specifically, the server 20 extracts from the medical information database 2023, for example, a patient whose course was favorable among the other patients who suffered from the same disease as the target patient and underwent the same treatment. The server 20 then reads, for example from the operation log information database 2022, the operation log for a predetermined period including the specified point in time and the operation log for a predetermined period of the extracted patient. For the other patient, the server 20 reads the operation log for a predetermined period including a point in time corresponding to the point specified by the user. The corresponding point in time is, for example, a point at which the postoperative elapsed period is about the same as at the point specified by the user.
 The server 20 applies the two read operation logs, in chronological order, to two different three-dimensional models: one created for the target patient, and the other created for the other patient. The server 20 compares the movements grasped from the operation logs of the respective three-dimensional models, and evaluates differences in the patient's range of motion at the target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance, and the like.
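One way such a "corresponding point in time" could be computed, sketched here under the assumption that it is matched on postoperative elapsed days (the publication only says the elapsed periods are about the same; the field names are hypothetical):

```python
import datetime

def corresponding_date(surgery_date, specified_date, other_surgery_date):
    """Map the user-specified date onto the other patient's timeline so that
    the postoperative elapsed period is the same for both patients."""
    elapsed = specified_date - surgery_date      # target patient's postoperative period
    return other_surgery_date + elapsed

d = corresponding_date(
    datetime.date(2021, 1, 1),   # target patient's surgery
    datetime.date(2021, 3, 1),   # date specified by the user
    datetime.date(2020, 6, 1),   # other patient's surgery
)
```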
 The server 20 may also extract from the medical information database 2023, for example, a patient whose course was favorable among the other patients who suffered from the same disease as the target patient but underwent a different treatment. The server 20 reads, for example from the operation log information database 2022, the operation log for a predetermined period including the specified point in time and the operation log for a predetermined period of the extracted patient, and applies the two read operation logs, in chronological order, to two different three-dimensional models.
 The server 20 compares the movements grasped from the operation logs of the respective three-dimensional models, and evaluates differences in the patient's range of motion at the target site, the muscle mass of the target site, the degree of coordinated movement, the degree of balance, and the like.
 Note that the analysis process performed by the server 20 is not limited to one triggered by an analysis instruction input by the user. The server 20 may perform the analysis process at a predetermined timing. The predetermined timing follows the progress of rehabilitation, and is, for example:
 ・when a rehabilitation session has ended
 ・when the rehabilitation includes multiple exercises, when at least one exercise has ended
 The server 20 determines whether the predetermined timing has been reached, for example, by:
 ・a preset time having been reached
 ・a notification indicating completion having been received
 ・no movement having occurred for a preset period since the last movement
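The timing conditions above can be sketched as a simple predicate; the thresholds and parameter names are illustrative assumptions, not values from the publication:

```python
# Decide whether the predetermined timing for analysis has been reached.
def should_analyze(now_s, preset_time_s, got_end_notification,
                   last_motion_s, idle_threshold_s):
    if now_s >= preset_time_s:
        return True                      # preset time reached
    if got_end_notification:
        return True                      # completion notification received
    if now_s - last_motion_s >= idle_threshold_s:
        return True                      # no movement for the preset period
    return False

r1 = should_analyze(100, 90, False, 95, 30)   # preset time has passed
r2 = should_analyze(50, 90, False, 10, 30)    # idle for 40 s >= 30 s
r3 = should_analyze(50, 90, False, 40, 30)    # no condition holds
```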
 When executing the process, the server 20 reads from the operation log information database 2022 the operation log stored immediately before the process is executed. The server 20 applies the read operation log to the three-dimensional model in chronological order and analyzes the patient's movement. This allows the server 20 to analyze the patient's movement during rehabilitation or immediately after it has been performed. The patient can thus continue rehabilitation while checking its effect, and the effect can be confirmed before the next rehabilitation session.
 In step S802, the server 20 stores the analysis result in, for example, the medical information database 2023. The analysis result stored in the medical information database 2023 is presented to the requesting user, such as a doctor or the patient, in response to a request. This allows the patient or the doctor to confirm the effect of the patient's rehabilitation.
 <5 Screen examples>
 FIG. 9 is a diagram showing an example of a screen that presents to the user the result of comparing, on the three-dimensional model of the person to be measured, the current movement with the movement at a predetermined point in the past.
 In FIG. 9, the result display screen 901 is the screen on which the user's terminal device presents the result of the server 20 analyzing the movement of the three-dimensional model.
 The past result screen 902 is the screen on which the user's terminal device displays the result of the server 20 having acquired, at a predetermined point in the past, the motion information of the person to be measured captured by the photographing device 40 and the sensing information of the person to be measured detected by the sensor 50, and having operated the three-dimensional model based on the acquired information.
 The current result screen 903 is the screen on which the user's terminal device displays the result of the server 20 having acquired, at the present time, for example on the latest date on which rehabilitation was performed, the motion information of the person to be measured captured by the photographing device 40 and the sensing information detected by the sensor 50, and having operated the three-dimensional model based on the acquired information.
 The analysis result display screen 904 is the screen on which the user's terminal device displays the result of comparing the server 20's analysis of the movement of the three-dimensional model of the person to be measured at a predetermined point in the past with its analysis of the current movement. For example, the user's terminal device may display on this screen information such as how the body movement of the person to be measured has changed, including improvement or deterioration, compared with the analysis result at the past point in time, together with information such as the future treatment policy. In one aspect, the user's terminal device may display on this screen information obtained by referring to the medical information database 2023 stored on the server 20. The user's terminal device may also display on this screen information such as treatment details and treatment policies input by a doctor or the like.
 This allows the person to be measured, a doctor, or the like to compare the current rehabilitation analysis result not only with the most recent one but also with, for example, the analysis result from when rehabilitation was started. The person to be measured or the doctor can therefore identify the periods in which rehabilitation was more or less effective, and reflect this when considering the treatment policy and the like.
 FIG. 10 is a diagram showing an example of a screen that presents to the user the result of comparing the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of another patient who suffered from the same disease and underwent the same treatment.
 In FIG. 10, the result display screen 1001, like the result display screen 901 in FIG. 9, is the screen on which the user's terminal device presents the result of the server 20 analyzing the movement of the three-dimensional model.
 The measured person result display screen 1002 is the screen on which the user's terminal device displays the result of the server 20 having acquired, at a predetermined point in time, the motion information of the person to be measured captured by the photographing device 40 and the sensing information detected by the sensor 50, and having operated the three-dimensional model based on the acquired information.
 The other measured person result display screen 1003 is the screen on which the user's terminal device displays the result of the server 20 having acquired, at a predetermined point in time, the motion information, captured by the photographing device 40, of a patient whose course was favorable among the other patients who suffered from the same disease as the person to be measured and underwent the same treatment, together with that patient's sensing information detected by the sensor 50, and having operated the three-dimensional model based on the acquired information.
 The comparative analysis result display screen 1004 is the screen on which the user's terminal device displays the result of the server 20 comparing the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of the patient whose course was favorable among the other patients who suffered from the same disease and underwent the same treatment. For example, by displaying on this screen the result of comparison with the analysis results of such other patients, the user's terminal device may present to the person to be measured information such as changes in range of motion, muscle mass of the target site, and degree of coordinated movement, as well as recommended treatments.
 This allows the person to be measured to refer to the results of the patient for whom rehabilitation was most effective among patients suffering from the same disease, which can be expected to improve motivation for rehabilitation.
 FIG. 11 is a diagram showing an example of a screen that presents to the user the result of comparing the movement of the three-dimensional model of the person to be measured with the movement of the three-dimensional model of a patient whose course was favorable among other patients who suffered from the same disease but underwent a different treatment.
 In FIG. 11, the result display screen 1101, like the result display screen 1001 in FIG. 10, is the screen on which the user's terminal device presents the result of the server 20 analyzing the movement of the three-dimensional model.
 The measured person result display screen 1102 is the screen on which the user's terminal device displays the result of the server 20 having acquired, at a predetermined point in time, the motion information of the person to be measured captured by the photographing device 40 and the sensing information detected by the sensor 50, and having operated the three-dimensional model based on the acquired information.
 The different-treatment result display screen 1103 is the screen on which the user's terminal device displays the result of the server 20 having acquired, at a predetermined point in time, the motion information, captured by the photographing device 40, of a patient whose course was favorable among the other patients who suffered from the same disease as the person to be measured but underwent a different treatment, together with that patient's sensing information detected by the sensor 50, and having operated the three-dimensional model based on the acquired information.
 The comparative analysis result display screen 1104 is the screen on which the user's terminal device displays the result of comparing the server 20's analysis of the three-dimensional model of the person to be measured with its analysis of the movement of the three-dimensional model of the patient whose course was favorable among the other patients who suffered from the same disease but underwent a different treatment. For example, by displaying on this screen the result of comparison with the results of such other patients, the user's terminal device may present to the person to be measured information such as changes in range of motion, muscle mass of the target site, and degree of coordinated movement, as well as recommended treatments.
 This allows the patient, doctor, or the like to weigh the effects not only of the treatment currently being received but also of other treatments, and thus to consider a treatment policy for carrying out rehabilitation more effectively.
 <6 Modifications>
 In the embodiment above, a case was described in which the three-dimensional model is displayed on the screen presented to the user. However, the three-dimensional model need not be displayed on that screen. The server 20 may instead present information about the movement in graph form.
 <Additional notes>
 The matters described in each of the above embodiments are noted below.
  (Appendix 1)
 A program for execution by a computer comprising a processor 29 and a memory 25, the program causing the processor 29 to execute: a step (S601) of acquiring personal information of a person to be measured; a step (S602) of creating a three-dimensional model based on the acquired personal information; a step (S701) of acquiring motion information of the person to be measured, obtained by analyzing captured images of the person to be measured wearing a motion sensor 50 on a part of the body; a step (S701) of acquiring sensing information of the motion sensor 50; a step (S702) of moving the three-dimensional model based on the motion information and the sensing information; and a step (S801) of analyzing the movement of the three-dimensional model.
  (Appendix 2)
 The program according to Appendix 1, wherein in the step (S701) of acquiring the motion information, the motion information is acquired using a trained model that has been trained to take as input a moving image in which a body appears and to output information about the movement of the body shown.
  (Appendix 3)
 The program according to Appendix 1 or 2, wherein one or more motion sensors 50 are worn on the region of the body of the person to be measured that is to be measured.
  (Appendix 4)
 The program according to any one of Appendices 1 to 3, wherein in the analyzing step (S801), movements of the three-dimensional model stored at a plurality of points in time separated by an interval are compared.
  (Appendix 5)
 The program according to Appendix 4, wherein in the analyzing step (S801), the effect of rehabilitation is evaluated by comparing the movements stored at the plurality of points in time.
  (Appendix 6)
 The program according to any one of Appendices 1 to 3, wherein in the analyzing step (S801), the movement stored for the three-dimensional model of the person to be measured is compared with the movement stored for the three-dimensional model of another person to be measured.
  (Appendix 7)
 The program according to Appendix 6, wherein in the analyzing step (S801), the other person to be measured is a person who was diagnosed with the same disease as the person to be measured, underwent the same treatment, and whose course of treatment was favorable compared with that of the person to be measured.
  (Appendix 8)
 The program according to Appendix 6, wherein in the analyzing step (S801), the other person to be measured is a person who was diagnosed with the same disease as the person to be measured and underwent a different treatment.
  (Appendix 9)
 The program according to any one of Appendices 1 to 8, wherein in the analyzing step (S801), a designation of the point in time to be analyzed and an instruction to perform the analysis are received, and the movement of the three-dimensional model corresponding to the designated point in time is analyzed in response to the instruction.
  (Appendix 10)
 The program according to any one of Appendices 1 to 8, wherein in the analyzing step (S801), the movement of the three-dimensional model stored during rehabilitation is analyzed at a timing according to the progress of the rehabilitation.
  (Appendix 11)
 A method executed by a computer comprising a processor 29 and a memory 25, the processor 29 executing: a step (S601) of acquiring personal information of a person to be measured; a step (S602) of creating a three-dimensional model based on the acquired personal information; a step (S701) of acquiring motion information of the person to be measured, obtained by analyzing captured images of the person to be measured wearing a motion sensor 50 on a part of the body; a step (S701) of acquiring sensing information of the motion sensor 50; a step (S702) of moving the three-dimensional model based on the motion information and the sensing information; and a step (S801) of analyzing the movement of the three-dimensional model.
  (Appendix 12)
 An information processing device 20 comprising a control unit 203, the control unit 203 executing: a step (S601) of acquiring personal information of a person to be measured; a step (S602) of creating a three-dimensional model based on the acquired personal information; a step (S701) of acquiring motion information of the person to be measured, obtained by analyzing captured images of the person to be measured wearing a motion sensor 50 on a part of the body; a step (S701) of acquiring sensing information of the motion sensor 50; a step (S702) of moving the three-dimensional model based on the motion information and the sensing information; and a step (S801) of analyzing the movement of the three-dimensional model.
  (Appendix 13)
 A system comprising: means (S601) for acquiring personal information of a person to be measured; means (S602) for creating a three-dimensional model based on the acquired personal information; means (S701) for acquiring motion information of the person to be measured, obtained by analyzing captured images of the person to be measured wearing a motion sensor 50 on a part of the body; means (S701) for acquiring sensing information of the motion sensor 50; means (S702) for moving the three-dimensional model based on the motion information and the sensing information; and means (S801) for analyzing the movement of the three-dimensional model.
 20 server, 22 communication IF, 23 input/output IF, 25 memory, 26 storage, 29 processor, 30 edge server, 40 photographing device, 50 sensor, 80 network, 201 communication unit, 202 control unit, 203 communication unit, 2021 patient information database, 2022 operation log information database, 2023 medical information database.

Claims (13)

  1.  A program for execution by a computer comprising a processor and a memory, the program causing the processor to execute:
     a step of acquiring personal information of a person to be measured;
     a step of creating a three-dimensional model based on the acquired personal information;
     a step of acquiring motion information of the person to be measured, obtained by analyzing captured images of the person to be measured wearing a motion sensor on a part of the body;
     a step of acquiring sensing information of the motion sensor;
     a step of moving the three-dimensional model based on the motion information and the sensing information; and
     a step of analyzing the movement of the three-dimensional model.
  2.  The program according to claim 1, wherein in the step of acquiring the motion information, the motion information is acquired using a trained model that has been trained to take as input a moving image in which a body appears and to output information about the movement of the body shown.
  3.  The program according to claim 1 or 2, wherein one or more motion sensors are worn on the part of the body of the person to be measured that is to be measured.
  4.  The program according to any one of claims 1 to 3, wherein in the analyzing step, movements of the three-dimensional model stored at a plurality of points in time separated by an interval are compared.
  5.  The program according to claim 4, wherein in the analyzing step, the effect of rehabilitation is evaluated by comparing the movements stored at the plurality of points in time.
  6.  The program according to any one of claims 1 to 3, wherein in the analyzing step, the movement stored for the three-dimensional model of the person to be measured is compared with the movement stored for the three-dimensional model of another person to be measured.
  7.  The program according to claim 6, wherein in the analyzing step, the other person to be measured is a person who was diagnosed with the same disease as the person to be measured, underwent the same treatment, and whose course of treatment was favorable compared with that of the person to be measured.
  8.  The program according to claim 6, wherein in the analyzing step, the other person to be measured is a person who was diagnosed with the same disease as the person to be measured and underwent a different treatment.
  9.  前記解析するステップにおいて、解析対象とする時点に関する指定、および解析を実施する旨の指示を受け付け、前記指示に応じ、前記指定された時点に対応する前記3次元モデルの動きを解析する、請求項1から8のいずれかに記載のプログラム。 A claim that, in the step of analysis, a designation regarding a time point to be analyzed and an instruction to perform analysis are received, and the movement of the three-dimensional model corresponding to the designated time point is analyzed in response to the instruction. The program according to any one of 1 to 8.
  10.  前記解析するステップにおいて、リハビリの経過に応じたタイミングで、前記リハビリにおいて記憶された前記3次元モデルの動きを解析する、請求項1から8のいずれかに記載のプログラム。 The program according to any one of claims 1 to 8, which analyzes the movement of the three-dimensional model stored in the rehabilitation at the timing corresponding to the progress of the rehabilitation in the analysis step.
  11.  A method executed by a computer comprising a processor and a memory, the method comprising the steps, executed by the processor, of:
     acquiring personal information of a person to be measured;
     creating a three-dimensional model based on the acquired personal information;
     acquiring motion information of the person to be measured, obtained by analyzing a captured image of the person to be measured with a motion sensor attached to a part of the body;
     acquiring sensing information of the motion sensor;
     moving the three-dimensional model based on the motion information and the sensing information; and
     analyzing the movement of the three-dimensional model.
  12.  An information processing device comprising a control unit, the control unit executing the steps of:
     acquiring personal information of a person to be measured;
     creating a three-dimensional model based on the acquired personal information;
     acquiring motion information of the person to be measured, obtained by analyzing a captured image of the person to be measured with a motion sensor attached to a part of the body;
     acquiring sensing information of the motion sensor;
     moving the three-dimensional model based on the motion information and the sensing information; and
     analyzing the movement of the three-dimensional model.
  13.  A system comprising:
     means for acquiring personal information of a person to be measured;
     means for creating a three-dimensional model based on the acquired personal information;
     means for acquiring motion information of the person to be measured, obtained by analyzing a captured image of the person to be measured with a motion sensor attached to a part of the body;
     means for acquiring sensing information of the motion sensor;
     means for moving the three-dimensional model based on the motion information and the sensing information; and
     means for analyzing the movement of the three-dimensional model.
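
The pipeline recited in claims 11 to 13 (create a 3D model from personal information, drive it with camera-derived motion information fused with wearable-sensor readings, then analyze the movement) can be illustrated with a minimal, non-authoritative sketch. All names, data shapes, and the simple weighted-average fusion below are illustrative assumptions, not part of the specification:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical sketch of the claimed pipeline: build a 3D model from
# personal information, move it using motion information (from camera
# images) fused with sensing information (from a worn motion sensor),
# and analyze the resulting movement.

Joint = str
Position = Tuple[float, float, float]  # (x, y, z) in meters

@dataclass
class Model3D:
    height_cm: float
    poses: List[Dict[Joint, Position]] = field(default_factory=list)

def create_model(personal_info: dict) -> Model3D:
    # Steps 1-2: acquire personal information, create the 3D model.
    return Model3D(height_cm=personal_info["height_cm"])

def fuse(camera_pose: Dict[Joint, Position],
         sensor_pose: Dict[Joint, Position],
         weight: float = 0.5) -> Dict[Joint, Position]:
    # Step 5: move the model based on both the motion information and
    # the sensing information; here, a per-joint weighted average
    # (an assumed fusion rule, chosen only for illustration).
    fused = dict(camera_pose)
    for j in camera_pose.keys() & sensor_pose.keys():
        fused[j] = tuple(weight * c + (1 - weight) * s
                         for c, s in zip(camera_pose[j], sensor_pose[j]))
    return fused

def vertical_range(model: Model3D, joint: Joint) -> float:
    # Step 6: one simple analysis of the model's movement - the
    # vertical travel of a joint over the recorded poses.
    ys = [pose[joint][1] for pose in model.poses if joint in pose]
    return max(ys) - min(ys)

model = create_model({"height_cm": 170.0})
camera_frames = [{"wrist": (0.0, 0.8, 0.0)}, {"wrist": (0.0, 1.4, 0.0)}]
sensor_frames = [{"wrist": (0.0, 0.9, 0.0)}, {"wrist": (0.0, 1.5, 0.0)}]
for cam, sen in zip(camera_frames, sensor_frames):
    model.poses.append(fuse(cam, sen))
print(vertical_range(model, "wrist"))
```

A real system would replace the fabricated frame data with pose-estimation output (the trained model of claim 2) and inertial-sensor streams, and the analysis step could compare such ranges across sessions, as in claims 4 and 5.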

PCT/JP2021/015226 2020-11-18 2021-04-12 Program, method, information processing device, and system WO2022107355A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-191366 2020-11-18
JP2020191366A JP6869417B1 (en) 2020-11-18 2020-11-18 Programs, methods, information processing equipment, systems

Publications (1)

Publication Number Publication Date
WO2022107355A1 true WO2022107355A1 (en) 2022-05-27

Family

ID=75801884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015226 WO2022107355A1 (en) 2020-11-18 2021-04-12 Program, method, information processing device, and system

Country Status (2)

Country Link
JP (2) JP6869417B1 (en)
WO (1) WO2022107355A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012083955A (en) * 2010-10-12 2012-04-26 Nippon Telegr & Teleph Corp <Ntt> Motion model learning device, three-dimensional attitude estimation device, motion model learning method, three-dimensional attitude estimation method and program
JP2015061577A (en) * 2013-01-18 2015-04-02 株式会社東芝 Movement-information processing device
JP2016112108A (en) * 2014-12-12 2016-06-23 カシオ計算機株式会社 Exercise information display system, exercise information display method, and exercise information display program
WO2020208922A1 (en) * 2019-04-09 2020-10-15 パナソニックIpマネジメント株式会社 Lower limb muscle strength estimation system, lower limb muscle strength estimation method, and program


Also Published As

Publication number Publication date
JP6869417B1 (en) 2021-05-12
JP2022080362A (en) 2022-05-30
JP2022080824A (en) 2022-05-30

Similar Documents

Publication Publication Date Title
US20200401224A1 (en) Wearable joint tracking device with muscle activity and methods thereof
CN103889520B (en) A kind of non-intrusion type motion tracking for patient&#39;s implementation status with strengthen patient implementation physical treatment system
EP2726164B1 (en) Augmented-reality range-of-motion therapy system and method of operation thereof
US8029411B2 (en) Systems and methods of monitoring exercises and ranges of motion
US20150327794A1 (en) System and method for detecting and visualizing live kinetic and kinematic data for the musculoskeletal system
JP7057589B2 (en) Medical information processing system, gait state quantification method and program
CN104412269A (en) Method and apparatus for neuromotor rehabilitation using interactive setting systems
Kontadakis et al. Gamified platform for rehabilitation after total knee replacement surgery employing low cost and portable inertial measurement sensor node
TW201909058A (en) Activity support method, program, activity support system
Zhang et al. Architecture and design of a wearable robotic system for body posture monitoring, correction, and rehabilitation assist
Ianculescu et al. A smart assistance solution for remotely monitoring the orthopaedic rehabilitation process using wearable technology: Re. flex system
Rahman Multimedia environment toward analyzing and visualizing live kinematic data for children with hemiplegia
US11179065B2 (en) Systems, devices, and methods for determining an overall motion and flexibility envelope
US20230240594A1 (en) Posture assessment program, posture assessment apparatus, posture assessment method, and posture assessment system
Domb Wearable devices and their Implementation in various domains
JP6884306B1 (en) System, method, information processing device
JP2017529929A (en) Trajectory mapping of the anatomical part of the human or animal body
US20190117129A1 (en) Systems, devices, and methods for determining an overall strength envelope
WO2022107355A1 (en) Program, method, information processing device, and system
Ongvisatepaiboon et al. Smartphone-based audio-biofeedback system for shoulder joint tele-rehabilitation
Senington et al. Validity and reliability of innovative field measurements of tibial accelerations and spinal kinematics during cricket fast bowling
KR20190112988A (en) Apparatus and method for measuring physical exercise ability of rehabilitation patient using motion recognition band
JP6741892B1 (en) Measuring system, method, program
Huang et al. Evaluating power rehabilitation actions using a fuzzy inference method
RO133122A0 (en) System for monitoring movement in real time for exercises of kinetotherapy and monitoring method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21894232

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21894232

Country of ref document: EP

Kind code of ref document: A1