WO2020208944A1 - Behavior support system and behavior support method - Google Patents

Behavior support system and behavior support method

Info

Publication number
WO2020208944A1
WO2020208944A1, PCT/JP2020/006339, JP2020006339W
Authority
WO
WIPO (PCT)
Prior art keywords
user
unit
support system
walking
content
Prior art date
Application number
PCT/JP2020/006339
Other languages
English (en)
Japanese (ja)
Inventor
貴拓 相原
嵩 内田
健一 入江
太一 濱塚
松村 吉浩
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Priority to CN202080016860.8A (CN113473901B)
Priority to JP2021513507A (JP7182319B2)
Publication of WO2020208944A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance

Definitions

  • the present invention relates to a behavior support system and a behavior support method.
  • Patent Document 1 discloses a life management system that analyzes a user's mental and physical condition so that a user's goal is achieved.
  • an object of the present invention is to provide a behavior support system and a behavior support method that can encourage the user to take effective actions for promoting health.
  • The behavior support system according to one aspect of the present invention includes a measurement unit that measures a walking feature amount of the user, a storage unit that stores time-series data of the feature amount measured by the measurement unit, an estimation unit that estimates a change in the physical condition of the user based on the time-series data, an analysis unit that determines the behavior content to be recommended to the user based on the estimation result by the estimation unit and the behavioral habit of the user, and a presentation unit that presents the behavior content determined by the analysis unit to the user.
  • the behavior support method includes a step of measuring a user's walking feature amount and a step of generating time-series data of the feature amount by storing the measured feature amount in a storage unit.
  • one aspect of the present invention can be realized as a program for causing a computer to execute the above-mentioned action support method.
  • It can also be realized as a computer-readable non-transitory recording medium in which the program is stored.
  • FIG. 1 is a diagram showing an outline of the behavior support system according to the embodiment.
  • FIG. 2 is a block diagram showing a functional configuration of the behavior support system according to the embodiment.
  • FIG. 3 is a flowchart showing the operation of the behavior support system according to the embodiment.
  • FIG. 4 is a diagram showing an example of an input screen, displayed by the behavior support system according to the embodiment, for prompting the user to input a behavioral habit.
  • FIG. 5 is a diagram showing a physical condition estimation table stored in the behavior support system according to the embodiment.
  • FIG. 6 is a diagram showing an exercise recommendation table generated by the behavior support system according to the embodiment.
  • FIG. 7 is a diagram showing a meal recommendation table generated by the behavior support system according to the embodiment.
  • FIG. 8 is a diagram showing an example of exercise content recommended to the user, displayed by the behavior support system according to the embodiment.
  • FIG. 9 is a diagram showing an example of meal content and advertisement information recommended to the user, displayed by the behavior support system according to the embodiment.
  • FIG. 10 is a diagram showing an example of a search result for the behavior content recommended to the user, displayed by the behavior support system according to the embodiment.
  • Each figure is a schematic view and is not necessarily drawn exactly. Therefore, for example, the scales do not always match between figures. Further, in each figure, substantially the same components are designated by the same reference numerals, and duplicate descriptions are omitted or simplified.
  • FIG. 1 is a diagram showing an outline of the behavior support system 1 according to the present embodiment.
  • the behavior support system 1 shown in FIG. 1 measures the walking features of the user 10 and analyzes the time-series data of the measured features to estimate the change in the physical condition of the user 10.
  • The feature amount of walking is, for example, at least one of walking speed, step length, stride length, walking cycle, cadence, the difference between left and right steps, sway of the trunk during walking, and the amount of change in joint angle.
  • The physical condition is at least one of balance, muscle strength, endurance, agility, and cognitive function.
  • the action support system 1 determines the action content recommended to the user 10 as the recommended content based on the estimation result and the behavior habit of the user 10, and presents the determined recommended content to the user 10.
  • Behavioral habits are habits related to at least one of exercise and diet.
  • the recommended content includes exercise content and meal content recommended for the user 10.
  • vital data of the user 10 is measured in addition to the feature amount of walking.
  • Vital data is at least one of user 10's body weight, blood pressure, pulse, urine pH and urine sugar.
  • the measurement of the feature amount of walking and the vital data is performed while the user 10 leads a normal life by using, for example, the camera 12 and the toilet 13 provided in the residence 11 of the user 10. That is, the walking feature amount and vital data of the user 10 are measured without the user 10 paying special attention.
  • the user 10 may consciously measure the feature amount and vital data by using a weight scale, a sphygmomanometer, or the like.
  • Information indicating a change in the physical condition of the user 10 estimated based on the measured feature amount and vital data is transmitted to the server device 30 via a network such as a LAN (Local Area Network) or the Internet.
  • the server device 30 determines the action content recommended to the user 10 as the recommended content by analyzing the change in the physical condition of the user 10.
  • the determined recommended content is transmitted to the terminal device 40 or the like carried by the user 10 via the network.
  • the terminal device 40 presents the recommended content to the user 10 by image or voice. Thereby, it is possible to propose to the user 10 an appropriate exercise content and an appropriate dietary content for the user 10 in order to strengthen or maintain muscle strength. In this way, the behavior support system 1 according to the present embodiment can encourage the user 10 to take effective actions for promoting health.
  • FIG. 2 is a block diagram showing a functional configuration of the action support system 1 according to the present embodiment.
  • The behavior support system 1 includes a measuring device 20, a server device 30, and a terminal device 40.
  • the measuring device 20, the server device 30, and the terminal device 40 are connected to each other so as to be able to communicate with each other via the Internet or the like.
  • the measuring device 20 is a device that estimates changes in the physical condition of the user 10.
  • the measuring device 20 is realized by, for example, a microcomputer and various sensor devices, and is provided in the residence 11 of the user 10. As shown in FIG. 2, the measuring device 20 includes a communication unit 21, a storage unit 22, a measuring unit 23, and an estimating unit 28.
  • the communication unit 21 is realized by one or more communication interfaces that perform wireless communication or wired communication.
  • Wireless communication is, for example, communication based on communication standards such as Wi-Fi (registered trademark), Bluetooth (registered trademark), or ZigBee (registered trademark), but is not limited thereto.
  • the communication unit 21 transmits and receives information and signals by communicating with each of the server device 30 and the terminal device 40. For example, the communication unit 21 transmits the estimation result of the change in the physical condition of the user 10 estimated by the estimation unit 28 to the server device 30. Further, the communication unit 21 transmits the time-series data of vital data and the exercise history to the server device 30.
  • the storage unit 22 is a semiconductor memory such as a flash memory or a non-volatile storage device such as an HDD (Hard Disk Drive).
  • the storage unit 22 stores the time-series data of the feature amount measured by the measurement unit 23. Further, the storage unit 22 stores the time series data of the vital data measured by the measurement unit 23.
  • the storage unit 22 further stores the exercise history of the user 10 generated based on the amount of activity measured by the measurement unit 23.
  • the storage unit 22 may store the recommended contents transmitted from the server device 30.
  • The measurement unit 23 measures the walking feature amount of the user 10. In addition, the measurement unit 23 measures the vital data and the amount of activity of the user 10. As shown in FIG. 2, the measurement unit 23 includes a detection unit 24, a vital measurement unit 25, a determination unit 26, and an activity amount measurement unit 27. The measurement unit 23 need not include one or both of the vital measurement unit 25 and the activity amount measurement unit 27.
  • The detection unit 24 detects the motion of the user 10. Specifically, the detection unit 24 includes an imaging unit 24a and an acceleration sensor 24b. The detection unit 24 need not include one or both of the imaging unit 24a and the acceleration sensor 24b.
  • the imaging unit 24a is realized by, for example, the camera 12 shown in FIG.
  • the imaging unit 24a generates a moving image by photographing the user 10.
  • the imaging unit 24a photographs the walking motion of the user 10.
  • the camera 12 is installed at the entrance of the residence 11 and photographs the user 10 entering and exiting the residence 11.
  • the camera 12 may be installed in a corridor, stairs, a room, or the like in the residence 11.
  • the moving image captured by the camera 12 (imaging unit 24a) includes the walking motion of the user 10.
  • the acceleration sensor 24b is carried by the user 10 and detects the acceleration according to the movement of the user 10.
  • the acceleration sensor 24b is attached to a body part such as the arm, waist, leg, neck or head of the user 10.
  • the acceleration sensor 24b is fixed to a fixture attached to a body portion of the user 10.
  • the fixture is, for example, a wristband or belt.
  • The acceleration sensor 24b is fixed to the body part of the user 10 by attaching the fixture to that body part.
  • the number of the acceleration sensors 24b attached to the user 10 is not limited to one, and may be a plurality.
  • the acceleration sensor 24b detects the movement of the attached portion and generates three-dimensional acceleration data.
  • the three-dimensional acceleration data shows, for example, the accelerations of the user 10 in the front-rear direction, the up-down direction, and the left-right direction.
  • the vital measurement unit 25 measures the vital data of the user 10.
  • The vital measurement unit 25 is realized by at least one of, for example, a weight scale, a body fat scale, a blood pressure monitor, a pulse monitor, and a urinalysis meter.
  • the weight scale and the body fat scale are, for example, embedded in the floor in front of the wash basin of the dwelling 11, and measure the weight and body fat of the user 10 who rides on the scale.
  • the urinalysis meter is attached to the toilet 13 shown in FIG. 1, for example, and measures at least one of the urine pH and urine sugar of the user 10.
  • the sphygmomanometer and the pulse meter are, for example, wristband type measuring instruments that can be worn on the arm of the user 10 and measure the blood pressure and the pulse of the user 10.
  • The determination unit 26 determines the walking feature amount of the user 10 based on the motion detected by the detection unit 24. For example, the determination unit 26 determines the feature amount of walking by analyzing the moving image obtained by the imaging unit 24a. Specifically, the walking feature amount is at least one of walking speed, step length, stride length, walking cycle, cadence, the difference between left and right steps, sway of the trunk during walking, and the amount of change in joint angle.
  • Alternatively, the determination unit 26 estimates the feature amount using the acceleration detected by the acceleration sensor 24b.
  • For example, when the acceleration sensor 24b is attached to the waist of the user 10, the movement of the waist of the user 10 during walking is detected.
  • the movement of the waist and the walking features such as walking balance and walking speed have a predetermined correlation. Therefore, the determination unit 26 determines the walking feature amount by referring to the correspondence information indicating the correspondence relationship between the waist movement and the walking feature amount.
  • Correspondence information indicating the correspondence relationship between the movement of the waist and the feature amount of walking is stored in advance in, for example, the storage unit 22.
  • the correspondence information may be stored in the database unit 32 of the server device 30.
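  • As a non-limiting illustration only: one common, simplified way to obtain a walking feature amount such as cadence from waist acceleration is to count peaks in the acceleration magnitude, as in the Python sketch below. The disclosure itself refers only to stored correspondence information between waist movement and walking feature amounts, so the peak-counting method, the threshold, and all names here are assumptions.
```python
import math

def cadence_from_waist_acceleration(accel_magnitude: list[float], fs_hz: float,
                                    threshold: float = 1.2) -> float:
    """Estimate cadence (steps per minute) from waist-acceleration magnitude by
    counting threshold crossings. The threshold of 1.2 g is a hypothetical value
    chosen only for this sketch.

    accel_magnitude: |a| samples in g, sampled at fs_hz.
    """
    steps, above = 0, False
    for a in accel_magnitude:
        if a > threshold and not above:   # rising edge of a peak -> one step
            steps += 1
            above = True
        elif a <= threshold:
            above = False
    duration_min = len(accel_magnitude) / fs_hz / 60.0
    return steps / duration_min if duration_min > 0 else 0.0

# Example: 10 s of synthetic 1.8 Hz "steps" sampled at 50 Hz -> about 108 steps/min.
fs = 50.0
samples = [1.0 + 0.4 * math.sin(2 * math.pi * 1.8 * t / fs) for t in range(int(10 * fs))]
print(round(cadence_from_waist_acceleration(samples, fs)))  # 108
```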
  • The determination unit 26 is realized by, for example, a microcomputer. Specifically, the determination unit 26 is realized by a non-volatile memory in which a program is stored, a volatile memory that is a temporary storage area for executing the program, an input/output port, a processor that executes the program, and the like.
  • the activity amount measuring unit 27 measures the activity amount of the user 10. Although the activity amount measuring unit 27 is an activity amount meter, it may be a pedometer for measuring the number of steps of the user 10. The activity meter or pedometer is always carried by the user 10 to measure the daily steps or activity of the user 10. The measured number of steps or activity amount is accumulated in the storage unit 22 as an exercise history. The exercise history is obtained, for example, by continuously measuring the number of steps or the amount of activity for each day over a plurality of days.
  • the estimation unit 28 estimates the change in the physical condition of the user 10 based on the time-series data measured by the measurement unit 23 and stored in the storage unit 22. Specifically, the estimation unit 28 estimates changes in physical condition, which is at least one of balance, muscular strength, endurance, agility, and cognitive function, based on time-series data of gait features.
  • the feature amount of walking and the physical condition have a predetermined correlation. Therefore, the estimation unit 28 determines the physical state of the user 10 from the walking feature amount by referring to the correspondence information indicating the correspondence relationship between the walking feature amount and the physical state. Correspondence information indicating the correspondence relationship between the feature amount of walking and the physical state is stored in advance in, for example, the storage unit 22.
  • the estimation unit 28 estimates the change in the physical condition by referring to the corresponding information using the time-series data of the walking feature amount.
  • The change is, for example, the change over a predetermined period such as several days, one week, one month, or one year.
  • For example, when the rate of change is a negative number, the estimation unit 28 presumes that walking is temporarily hindered by a factor such as an injury to the user 10. Further, when the rate of change is a positive number, the estimation unit 28 estimates that the physical condition is gradually improving. The estimation result by the estimation unit 28 is transmitted to the server device 30 via the communication unit 21.
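  • Purely as an illustration of this kind of estimation (the function name, period, and threshold below are assumptions, not part of the disclosure), a rate of change over the predetermined period could be computed from the time-series data as follows:
```python
from datetime import date, timedelta

def estimate_change(samples: list[tuple[date, float]], period_days: int = 30,
                    drop_threshold: float = -0.2) -> str:
    """Estimate the change in a physical condition from time-series data of one
    walking feature amount (e.g. walking speed).

    samples: (measurement date, feature value) pairs, oldest first.
    period_days: the predetermined period over which the change is evaluated.
    drop_threshold: hypothetical relative drop treated as a temporary hindrance
                    (e.g. the user being injured).
    """
    cutoff = samples[-1][0] - timedelta(days=period_days)
    window = [v for d, v in samples if d >= cutoff]
    if len(window) < 2 or window[0] == 0:
        return "insufficient data"
    rate_of_change = (window[-1] - window[0]) / window[0]
    if rate_of_change <= drop_threshold:
        return "temporarily hindered (e.g. injury suspected)"
    if rate_of_change > 0:
        return "gradually improving"
    return "gradually declining"

# Example: walking speed [m/s] measured at the entrance camera over one month.
history = [(date(2020, 1, 1) + timedelta(days=7 * i), v)
           for i, v in enumerate([1.10, 1.12, 1.15, 1.18, 1.21])]
print(estimate_change(history))  # -> "gradually improving"
```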
  • The estimation unit 28 is realized by, for example, a microcomputer. Specifically, the estimation unit 28 is realized by a non-volatile memory in which a program is stored, a volatile memory that is a temporary storage area for executing the program, an input/output port, a processor that executes the program, and the like.
  • the server device 30 determines the action content recommended to the user 10 as the recommended content based on the change in the physical condition estimated by the measuring device 20 and the behavioral habit of the user 10.
  • the server device 30 is, for example, a computer device. As shown in FIG. 2, the server device 30 includes a communication unit 31, a database unit 32, and an analysis unit 33.
  • the communication unit 31 is realized by one or more communication interfaces that perform wireless communication or wired communication.
  • the communication unit 31 transmits / receives information and signals by communicating with each of the measuring device 20 and the terminal device 40.
  • the communication unit 31 receives the estimation result of the physical condition of the user 10 estimated by the measuring device 20 and transmitted via the communication unit 21.
  • the communication unit 31 transmits the recommended content determined by the analysis unit 33 to the measuring device 20 or the terminal device 40.
  • the communication unit 31 receives information such as the behavioral habits of the user 10 acquired by the input unit 42 of the terminal device 40.
  • the communication unit 31 may receive time-series data of walking features and time-series data of vital data from the measuring device 20.
  • The database unit 32 is a storage unit that stores a database used by the behavior support system 1 to determine the behavior content recommended to the user 10 as the recommended content. A specific example of the database will be described later.
  • the database unit 32 may store the estimation result of the change in the physical state of the user 10, and the time-series data of the walking feature amount and the time-series data of the vital data. Further, the database unit 32 may store the estimation result of the change in the physical state of each of the plurality of users 10, and the time-series data of the walking feature amount and the time-series data of the vital data. For example, by using the data of another person having characteristics similar to the user 10, the appropriateness of the recommended content can be enhanced.
  • the database unit 32 is realized by a non-volatile storage device such as a semiconductor memory or an HDD.
  • the analysis unit 33 determines the behavior content recommended to the user 10 as the recommended content based on the result of the estimation of the physical condition by the estimation unit 28 and the behavior habit of the user 10.
  • the behavioral habit of the user 10 is acquired by, for example, the input unit 42 of the terminal device 40, and is received via the communication units 41 and 31. A specific example of the method of determining the recommended content by the analysis unit 33 will be described later.
  • The analysis unit 33 is realized by, for example, a microcomputer. Specifically, the analysis unit 33 is realized by a non-volatile memory in which a program is stored, a volatile memory that is a temporary storage area for executing the program, an input/output port, a processor that executes the program, and the like.
  • The terminal device 40 is a device that presents information to the user 10. Further, the terminal device 40 accepts the input of information from the user 10.
  • the terminal device 40 is a mobile terminal that can be carried by the user 10, such as a smartphone.
  • the terminal device 40 may be a display device such as a television installed in the residence 11 of the user 10 or an audio output device such as a smart speaker.
  • the terminal device 40 includes a communication unit 41, an input unit 42, a presentation unit 43, and a storage unit 44.
  • the communication unit 41 is realized by one or more communication interfaces that perform wireless communication or wired communication.
  • the communication unit 41 transmits and receives information and signals by communicating with each of the measuring device 20 and the server device 30. For example, the communication unit 41 receives the recommended content determined by the analysis unit 33 of the server device 30. Further, the communication unit 41 transmits information such as the behavioral habit of the user 10 acquired by the input unit 42 to the server device 30.
  • the input unit 42 accepts input from the user 10.
  • the input unit 42 is, for example, a touch sensor or a physical button.
  • the input unit 42 may be a voice input device such as a microphone.
  • the input unit 42 accepts the input of the personal information of the user 10.
  • the personal information includes, for example, the behavioral habits, tastes and lifestyles of the user 10, and attribute information such as the age and gender of the user 10.
  • the personal information input by the input unit 42 is transmitted to the measuring device 20 or the server device 30 via the communication unit 41.
  • the presentation unit 43 presents the recommended content determined by the analysis unit 33 to the user 10. In addition, the presentation unit 43 presents the advertisement information of the business operator related to the recommended content.
  • the presentation unit 43 has a display unit 43a.
  • the display unit 43a is realized by, for example, a liquid crystal display panel or an organic EL (Electroluminescence) display panel.
  • the display unit 43a displays an image representing each of the recommended content and the advertisement information.
  • the display unit 43a generates and displays a GUI (Graphical User Interface) object such as a selection button for receiving a selection and an operation from the user 10, for example.
  • the presentation unit 43 may have a speaker that outputs sound in place of the display unit 43a or in addition to the display unit 43a.
  • the presentation unit 43 may output a voice representing the content to be presented to the user 10.
  • the storage unit 44 is a non-volatile storage device such as a flash memory.
  • the storage unit 44 stores the recommended contents transmitted from the server device 30.
  • the storage unit 44 stores information such as the behavioral habits of the user 10 received by the input unit 42.
  • the storage unit 44 stores, for example, an image to be displayed on the display unit 43a.
  • the configuration of the behavior support system 1 is not limited to the example shown in FIG.
  • two of the measuring device 20, the server device 30, and the terminal device 40 may not be able to communicate directly.
  • communication between the server device 30 and the terminal device 40 may not be possible.
  • the server device 30 and the terminal device 40 may transmit and receive information and signals via the measuring device 20.
  • FIG. 3 is a flowchart showing the operation of the behavior support system 1 according to the present embodiment.
  • the input unit 42 acquires personal information (S10). Specifically, as shown in FIG. 4, the display unit 43a displays an input screen for causing the user 10 to input a behavioral habit.
  • FIG. 4 is a diagram showing an example of an input screen for causing the user 10 to input a behavior habit, which is displayed by the behavior support system 1 according to the present embodiment.
  • the display unit 43a displays the questionnaire instruction 50 prompting the user 10 to answer.
  • the display unit 43a displays a plurality of question items 51 and a plurality of answer candidates 52 for the question item 51.
  • Question 51 is a question generated by the analysis unit 33 in order to grasp the behavioral habits, tastes, and lifestyles of the user 10.
  • a radio button 53 is displayed for each answer candidate 52 as an example of a GUI object for accepting a selection from the user 10.
  • the user 10 can input an answer to the question 51 by selecting the radio button 53 corresponding to the answer candidate 52.
  • the means for inputting the behavioral habit is not limited to this, and for example, the user 10 may be made to input the answer in characters by displaying the question item 51 and the text box.
  • Next, the measurement unit 23 performs the measurement process (S20). Specifically, the detection unit 24 measures the walking feature amount of the user 10 (S21). More specifically, the imaging unit 24a acquires an image of the user 10 by photographing the user 10 (S22). Next, the determination unit 26 determines the walking feature amount of the user 10 based on the image obtained by the imaging unit 24a (S23). For example, the determination unit 26 acquires walking feature amounts such as the walking speed of the user 10 by image processing. The determination unit 26 may determine the walking feature amount of the user 10 based on the acceleration data obtained by the acceleration sensor 24b instead of, or in addition to, the image captured by the imaging unit 24a.
  • the vital measurement unit 25 of the measurement unit 23 measures the vital data of the user 10 (S24). Further, the activity amount measuring unit 27 of the measuring unit 23 measures the activity amount of the user 10 (S25).
  • the gait feature amount measurement (S21), vital data measurement (S24), and activity amount measurement (S25) may be performed at the same time, or any one of them may be performed first. The execution order of these processes is not particularly limited.
  • the measurement of the feature amount of walking (S21), the measurement of vital data (S24), and the measurement of the amount of activity (S25) may be performed a plurality of times, respectively.
  • the measured values obtained by the plurality of measurements are stored in at least one of the storage unit 22 and the database unit 32 as the feature amount time-series data, the vital data time-series data, and the exercise history, respectively. Further, the acquisition of personal information (S10) may be performed after the measurement process (S20), or may be performed during the measurement process (S20).
  • the estimation unit 28 estimates the change in the physical condition of the user 10 based on the measured time-series data of the walking features (S30).
  • the analysis unit 33 determines the action content recommended to the user 10 as the recommended content based on the estimation result by the estimation unit 28 and the personal information acquired by the input unit 42 (S40). Specifically, the analysis unit 33 determines the exercise content and the meal content recommended for the user 10.
  • the presentation unit 43 presents the determined recommended content to the user 10 (S50). Specifically, the display unit 43a displays the determined recommended content. A specific example of the display will be described later.
  • FIG. 5 is a diagram showing a physical condition estimation table stored in the behavior support system 1 according to the present embodiment.
  • the physical condition estimation table is stored in the storage unit 22 of the measuring device 20. Alternatively, it may be stored in the database unit 32 of the server device 30.
  • The physical condition estimation table is prepared for each demographic attribute, such as "female in her 50s" and "male in his 60s".
  • Demographic attributes are, for example, age and gender.
  • the physical condition estimation table associates one or more reference values with each physical condition.
  • For each physical condition, reference values are associated with five stages, "A" to "E".
  • The reference value at each stage may be represented by a single value or by a certain range.
  • "A" to "E" correspond to the evaluation result of the physical condition.
  • "A" indicates that the physical condition is the best and "E" indicates that it is the worst, in the order of "A" to "E", but the opposite may be true.
  • "C" indicates that the physical condition is at the average value.
  • "A" and "B" indicate that the physical condition is superior to the average value.
  • "D" and "E" indicate that the physical condition is inferior to the average value.
  • The analysis unit 33 evaluates the physical condition of the user 10 by referring to the physical condition estimation table based on the physical condition of the user 10 estimated by the estimation unit 28. Specifically, the analysis unit 33 compares each reference value of the physical condition estimation table corresponding to the age and gender to which the user 10 belongs with the physical condition of the user 10, and determines which of "A" to "E" each physical condition of the user 10 corresponds to. For example, when the balance of the user 10 estimated by the estimation unit 28 is a value of P2 or more and less than P1, the analysis unit 33 determines the evaluation of the balance of the user 10 as "B". The analysis unit 33 evaluates each parameter of the physical condition in this way. As a result, for example, as shown in FIG. 6, evaluation results such as "B" and "A" are obtained for each parameter of the physical condition of the user 10.
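  • As an illustrative sketch of this table lookup, the comparison of an estimated value against per-stage reference values can be written as below; the threshold values are hypothetical placeholders, not the actual contents of FIG. 5.
```python
def grade_physical_condition(value: float, thresholds: dict[str, float]) -> str:
    """Return an evaluation "A"-"E" by comparing an estimated physical-condition
    value with reference values from a physical condition estimation table.

    thresholds: lower bound of each stage for one demographic attribute,
    e.g. {"A": P1, "B": P2, ...}; anything below the "D" bound is "E".
    """
    for stage in ("A", "B", "C", "D"):
        if value >= thresholds[stage]:
            return stage
    return "E"

# Hypothetical reference values for "balance" of females in their 50s.
balance_thresholds = {"A": 0.90, "B": 0.75, "C": 0.60, "D": 0.45}
print(grade_physical_condition(0.80, balance_thresholds))  # "B" (>= P2, < P1)
```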
  • FIG. 6 is a diagram showing an exercise recommendation table generated by the behavior support system 1 according to the present embodiment. Specifically, FIG. 6 shows the relationship between the evaluation result of the physical condition by the analysis unit 33 and the exercise content recommended for the user 10.
  • the difference from the standard value is a value obtained by subtracting the standard value from the value of the physical state for each physical state estimated by the estimation unit 28.
  • the standard value is, for example, the standard value of “C”, which corresponds to the average value of the same sex and the same age group. For example, if the difference between the muscle strength and the standard value is a positive value, it means that the muscle strength is higher than the average value of the features of the same sex and the same age.
  • If the difference between the muscle strength and the standard value is a negative value, it means that the muscle strength is lower than the average value for the same sex and the same age group.
  • In the example shown in FIG. 6, balance, flexibility, agility, and endurance are above the averages for the same sex and age group.
  • The difference from the target value is a value obtained by subtracting the target value input by the user 10 from the estimated physical condition value. For example, if the difference between the muscle strength and the target value is a positive value, it means that the muscle strength exceeds the target value. If the difference between the muscle strength and the target value is a negative value, it means that the muscle strength is below the target value. In the example shown in FIG. 6, each of balance and muscle strength is below the target value.
  • the priority order indicates the order of physical conditions that are emphasized when determining the exercise content recommended for the user 10.
  • The priority order is determined based on, for example, the sum of the difference from the standard value and the difference from the target value. Specifically, the priority order is determined in ascending order of the sum of these two differences. For example, in the example shown in FIG. 6, since the sum of the two differences for muscle strength is the smallest, "-1", its priority is the highest, "1". Since the sum of the two differences for balance is "0", which is the second smallest, its priority is "2".
  • The analysis unit 33 may determine the priority order so as to raise the rank of a physical condition determined to be inferior.
  • the priority order may be determined only based on the difference from the standard value. For example, in the example shown in FIG. 6, the evaluation value of muscle strength is “C”, which is the lowest, so that the priority order is the highest “1”. In this case, when there are a plurality of physical states having the same evaluation result, the analysis unit 33 may raise the rank of the physical states having a small difference from the target value.
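  • The priority computation described above can be sketched as follows; the numeric values are hypothetical and are chosen only so that the sums match the "-1" and "0" mentioned in the text, not the actual entries of FIG. 6.
```python
def priority_order(diffs: dict[str, tuple[float, float]]) -> list[str]:
    """Order physical conditions by ascending sum of (difference from the
    standard value, difference from the target value), so that conditions that
    fall furthest short are addressed first.

    diffs maps a physical condition name to (diff_from_standard, diff_from_target).
    """
    return sorted(diffs, key=lambda name: diffs[name][0] + diffs[name][1])

# Hypothetical values: muscle strength sums to -1 and balance to 0, as in the text.
diffs = {
    "muscle strength": (-1.0, 0.0),
    "balance": (1.0, -1.0),
    "endurance": (2.0, 1.0),
    "agility": (1.5, 0.5),
    "flexibility": (2.0, 0.5),
}
print(priority_order(diffs)[:2])  # ['muscle strength', 'balance']
```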
  • the analysis unit 33 determines the exercise content based on the priority order and the difference between the standard value or the target value of each of the plurality of physical states.
  • the database unit 32 stores an exercise database in which appropriate exercise contents are associated with each physical condition in order to improve or maintain the corresponding physical condition.
  • the amount of increase / decrease in the physical condition may be associated with the exercise intensity and the amount of exercise of the exercise content.
  • The analysis unit 33 determines exercise content appropriate for improving a physical condition having a high priority as the exercise content recommended to the user 10.
  • the analysis unit 33 determines the exercise content recommended for the user 10 based on the exercise habit of the user 10.
  • the exercise habit is, for example, whether or not the user 10 exercises on a daily basis, and at least one of the frequency and amount of exercise.
  • the analysis unit 33 determines the exercise intensity and the amount of exercise of the recommended exercise content based on the exercise habit. For example, when the user 10 exercises on a daily basis, the analysis unit 33 determines the exercise content having a strong exercise intensity or a large amount of exercise. For example, when the user 10 rarely exercises on a daily basis, the analysis unit 33 determines the exercise content having a weak exercise intensity or a low amount of exercise.
  • the analysis unit 33 determines the exercise content suitable for the exercise habit of the user 10 by referring to the database.
  • the exercise habit may include the exercise preference of the user 10. For example, when there are a plurality of candidates for the exercise content recommended to the user 10, the analysis unit 33 may determine the exercise content that suits the preference of the user 10.
  • the analysis unit 33 determines the upper limit value of the exercise intensity based on the vital data, and determines the exercise content whose exercise intensity is lower than the determined upper limit value.
  • The upper limit of exercise intensity is an upper limit within a range in which there is no risk of causing a health problem due to an excessive load on the body. In other words, as long as the user 10 exercises at an intensity below the upper limit, the risk of health problems is sufficiently low.
  • the upper limit is determined based on, for example, vital data. For example, when blood pressure or heart rate is high, the upper limit is low. If the blood pressure or heart rate is low, the upper limit is high. In the present embodiment, the analysis unit 33 determines the upper limit value of the exercise intensity of the user 10 based on the vital data.
  • the analysis unit 33 may determine the lower limit value of the exercise intensity based on the physical ability.
  • The lower limit of exercise intensity corresponds to the minimum amount of exercise required to improve or maintain muscle strength. That is, exercise at an intensity below the lower limit does not contribute to the strengthening or maintenance of muscle strength.
  • the lower limit depends on physical fitness. The higher the physical ability, the higher the lower limit. The lower the physical ability, the lower the lower limit.
  • For example, when it is estimated that the user 10 is injured, the analysis unit 33 may determine exercise content whose exercise intensity is sufficiently low.
  • Further, the analysis unit 33 may correct the upper limit value and the lower limit value based on the personal characteristics of the user 10. Specifically, the analysis unit 33 may correct at least one of the upper limit value and the lower limit value based on at least one of the time-series data of the feature amount and the exercise history. For example, the analysis unit 33 may determine whether or not the user 10 has a constitution that builds muscle easily, based on the time-series data of the feature amount and the exercise history, and may change the lower limit value based on the determination result.
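  • A minimal sketch of how such upper and lower limits could gate the recommended exercise content is shown below; the blood-pressure rule, the MET values, and the function names are assumptions for illustration only.
```python
def exercise_intensity_range(systolic_bp: float, physical_ability: float,
                             injured: bool = False) -> tuple[float, float]:
    """Return a hypothetical (lower, upper) exercise-intensity range in METs.

    The upper limit falls as blood pressure rises (risk of overload); the lower
    limit rises with physical ability (minimum load needed to build muscle).
    When an injury is suspected, only very light exercise is allowed.
    """
    upper = 8.0 if systolic_bp < 130 else 6.0 if systolic_bp < 150 else 4.0
    lower = 2.0 + 2.0 * physical_ability        # physical_ability in [0, 1]
    if injured:
        upper = min(upper, 2.5)                 # keep intensity sufficiently low
    return min(lower, upper), upper

def pick_exercise(candidates: dict[str, float], limits: tuple[float, float]) -> list[str]:
    """Keep only candidate exercises whose MET intensity lies within the limits."""
    lo, hi = limits
    return [name for name, mets in candidates.items() if lo <= mets <= hi]

candidates = {"walking": 3.5, "jogging": 7.0, "stretching": 2.3, "squats": 5.0}
print(pick_exercise(candidates, exercise_intensity_range(systolic_bp=145,
                                                         physical_ability=0.4)))
# -> ['walking', 'squats']   (upper limit 6.0, lower limit 2.8)
```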
  • FIG. 7 is a diagram showing a meal recommendation table generated by the behavior support system 1 according to the present embodiment. Specifically, FIG. 7 shows the relationship between the evaluation result of the physical condition by the analysis unit 33 and the meal content recommended to the user 10.
  • The meal recommendation table shown in FIG. 7 includes the intake amount and the reference value for dietary calories and for each of a plurality of nutrients.
  • The intake amount is determined based on, for example, the dietary habits of the user 10. Specifically, the intake amount is a declared value obtained by having the user 10 input, via the input unit 42, the content of the meals that the user 10 eats as a dietary habit, and is determined based on the input result.
  • the reference value is, for example, a value determined for each demographic attribute.
  • the analysis unit 33 determines the recommended values for each of the calories and the plurality of nutrients based on the difference between the intake amount and the reference value.
  • the recommended value corresponds to the difference between the declared value of the intake of the user 10 and the intake to be ingested by the user 10. That is, the recommended value indicates the amount to be increased or decreased from the amount currently ingested by the user 10 for each of the calories and the plurality of nutrients.
  • In the meal recommendation table, an additional criterion is associated with each combination of a physical condition and dietary calories or one of the plurality of nutrients. The additional criteria are determined based on the evaluation results of the physical condition. The additional criteria are used to correct the intake amount of the corresponding calories or nutrient that the user 10 should ingest. Specifically, if the additional criterion is "+", the analysis unit 33 increases the intake amount that the user 10 should ingest. If the additional criterion is "-", the analysis unit 33 reduces the intake amount that the user 10 should ingest. If the additional criterion is "0", the analysis unit 33 does not correct the intake amount that the user 10 should ingest.
  • In the example shown in FIG. 7, for protein, the intake of the user 10 is "40" and the reference value is "50", so the difference is "-10". That is, since the protein intake falls short by "10", the recommended value becomes "+10".
  • When the additional criteria for the corresponding physical conditions are "+", the analysis unit 33 increases the recommended value from "+10" and corrects it to, for example, "+20".
  • The correction amount at this time is determined based on, for example, the number of "+" marks. This indicates that the user 10 should consume "20" more than the current protein intake in order to further enhance balance, muscle strength, and endurance.
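  • A sketch of the recommended-value calculation (reference value minus intake, then corrected by the additional criterion) follows; treating each "+" as a fixed step of 10 is an assumption made only for this example.
```python
def recommended_value(intake: float, reference: float, additional_criteria: str,
                      step: float = 10.0) -> float:
    """Recommended increase/decrease of one nutrient (or of calories).

    Base value = reference - intake (e.g. reference 50, intake 40 -> +10).
    additional_criteria: a string such as "++", "-" or "0" from the meal
    recommendation table; each "+" adds `step`, each "-" subtracts it.
    The size of `step` is an assumption for illustration.
    """
    base = reference - intake
    correction = step * (additional_criteria.count("+") - additional_criteria.count("-"))
    return base + correction

print(recommended_value(intake=40, reference=50, additional_criteria="0"))   # 10.0
print(recommended_value(intake=40, reference=50, additional_criteria="+"))   # 20.0
```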
  • the analysis unit 33 determines the meal content recommended for the user 10 based on the recommended values of calories and the plurality of nutrients.
  • the database unit 32 stores a meal database in which calories and a plurality of nutrients are associated with each meal menu or food. By referring to the meal database, the analysis unit 33 determines an appropriate meal menu or food according to the recommended value as the meal content recommended to the user 10.
  • the analysis unit 33 determines foods that should not be recommended to the user 10 as excluded foods based on vital data, and determines the meal content using foods other than the excluded foods. For example, when the blood glucose level, which is an example of vital data, is equal to or higher than the threshold value, the analysis unit 33 determines a food containing a large amount of sugar as an excluded food.
  • the database unit 32 stores a food database in which a threshold value of vital data is associated with excluded foods that should not be ingested when the threshold value is exceeded or falls below the threshold value. The analysis unit 33 determines the excluded foods by referring to the food database based on the vital data of the user 10, and determines the meal contents using the foods other than the excluded foods.
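  • The excluded-food filtering and menu selection could look like the following sketch; the food database contents, the blood-glucose threshold, and the scoring rule are all hypothetical.
```python
# Hypothetical food database: food -> (nutrient profile, tags).
FOOD_DB = {
    "grilled chicken": ({"protein": 25, "sugar": 0}, set()),
    "sweet bun":       ({"protein": 5, "sugar": 30}, {"high_sugar"}),
    "tofu salad":      ({"protein": 12, "sugar": 2}, set()),
}

def excluded_foods(vital_data: dict[str, float]) -> set[str]:
    """Foods that should not be recommended given the user's vital data.
    The 126 mg/dL threshold is an assumption for this sketch only."""
    excluded = set()
    if vital_data.get("blood_glucose", 0) >= 126:
        excluded |= {name for name, (_, tags) in FOOD_DB.items() if "high_sugar" in tags}
    return excluded

def recommend_meals(recommended: dict[str, float], vital_data: dict[str, float]) -> list[str]:
    """Rank the remaining foods by how much of the recommended nutrient
    increase they supply (a simple illustrative score)."""
    banned = excluded_foods(vital_data)
    candidates = [name for name in FOOD_DB if name not in banned]

    def score(name: str) -> float:
        return sum(FOOD_DB[name][0].get(n, 0) * amount
                   for n, amount in recommended.items() if amount > 0)

    return sorted(candidates, key=score, reverse=True)

print(recommend_meals({"protein": 20}, {"blood_glucose": 140}))
# -> ['grilled chicken', 'tofu salad']  ("sweet bun" is excluded)
```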
  • the dietary habits may include the preferences of the user 10. For example, when there are a plurality of candidates for the meal content recommended to the user 10, the analysis unit 33 may preferentially determine the meal content that suits the taste of the user 10.
  • FIG. 8 is a diagram showing an example of the exercise content 60 recommended to the user 10, displayed by the behavior support system 1 according to the present embodiment.
  • the name of the exercise "walking" is displayed as the exercise content 60 recommended for the user 10.
  • an exercise intensity 61 indicating a guideline for walking speed and an exercise amount 62 indicating a walking time are displayed in text respectively.
  • the display unit 43a may display an image or video showing a specific operation example of the exercise content 60.
  • the display unit 43a may display the URL (Uniform Resource Locator) of the website that streams and distributes the video representing a specific operation example, in addition to the text representing the exercise content 60.
  • the display unit 43a may display the meal content 70 recommended to the user 10.
  • FIG. 9 is a diagram showing an example of the meal content 70 and the advertisement information 71 recommended to the user 10 displayed by the behavior support system 1 according to the present embodiment.
  • the advertisement information 71 is information for advertising a product or service provided by a business operator related to the meal content 70.
  • the advertisement information 71 includes the name of the business operator, the URL, and the meal menu provided by the business operator.
  • the advertisement information 71 includes a plurality of meal menus, but may include only one meal menu.
  • the display unit 43a further displays the order button 72.
  • the order button 72 is displayed for each meal menu included in the advertisement information 71.
  • the order button 72 is an example of a GUI object for accepting an order for a product or service from the user 10.
  • When the order button 72 is operated via the input unit 42, a meal is ordered from the corresponding business operator. For example, by registering the residence 11 of the user 10 in advance, the ordered meal can be delivered to the residence 11 and provided to the user 10.
  • the meal content 70 may indicate the amount of calories and nutrients to be ingested instead of the meal menu. Further, the meal content 70 may include, in addition to the meal menu, a cooking recipe for preparing the meal indicated by the meal menu.
  • the advertisement information 71 may be advertisement information of a business operator that delivers the ingredients included in the cooking recipe.
  • the display unit 43a may display the search result 80 on the Internet as the action content recommended to the user 10.
  • FIG. 10 is a diagram showing an example of a search result 80 for the behavior content recommended to the user 10, displayed by the behavior support system 1 according to the present embodiment.
  • the analysis unit 33 generates a search word used for the search by the search engine based on the estimation result of the change in the physical condition and the behavioral habit, and gives the search word to the search engine via the communication unit 31.
  • For example, the analysis unit 33 generates a search word that includes the item name of a physical condition having a high priority in the exercise recommendation table, a word that leads to health promotion such as "improvement" or "strengthening", and the name of the action recommended to the user 10, such as "exercise" or "meal".
  • For example, since exercise for the purpose of improving "muscle strength" is recommended to the user 10, the analysis unit 33 generates the search word "muscle strength strengthening exercise".
  • the search word may include words indicating the amount of exercise and the intensity of exercise.
  • the search results obtained by the search engine are displayed on the display unit 43a as shown in FIG.
  • Similarly, the analysis unit 33 generates a search word including the names of nutrients that the user 10 should ingest in larger amounts, their amounts, and words related to meals such as "food", "supplement", and "recipe". Instead of the amounts of the nutrients, the search word may include the item name of a physical condition and a word that leads to health promotion such as "improvement".
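  • An illustrative way to assemble such search words is sketched below; the vocabulary and formatting are assumptions, and only the example "muscle strength strengthening exercise" comes from the text.
```python
def exercise_search_word(priority_condition: str) -> str:
    """Search word for exercise content: the physical-condition name, a
    health-promotion word, and the recommended action name."""
    return f"{priority_condition} strengthening exercise"

def meal_search_word(nutrients: dict[str, float]) -> str:
    """Search word for meal content: nutrient names and amounts plus a
    meal-related word such as "recipe"."""
    parts = [f"{name} {amount:g}g" for name, amount in nutrients.items() if amount > 0]
    return " ".join(parts + ["recipe"])

print(exercise_search_word("muscle strength"))   # "muscle strength strengthening exercise"
print(meal_search_word({"protein": 20}))         # "protein 20g recipe"
```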
  • As described above, the behavior support system 1 includes the measurement unit 23 that measures the walking feature amount of the user 10, the storage unit 22 that stores the time-series data of the feature amount measured by the measurement unit 23, the estimation unit 28 that estimates the change in the physical condition of the user 10 based on the time-series data, the analysis unit 33 that determines the behavior content to be recommended to the user 10 based on the estimation result by the estimation unit 28 and the behavioral habit of the user 10, and the presentation unit 43 that presents the behavior content determined by the analysis unit 33 to the user 10.
  • Thereby, behavior content suitable for promoting the health of the user 10 is presented to the user 10. Since the user 10 can promote health by acting according to the presented behavior content, the possibility of contracting a condition such as sarcopenia or frailty can be reduced. As described above, according to the present embodiment, the user 10 can be encouraged to take effective actions for promoting health.
  • behavioral habits are habits related to at least one of exercise and diet.
  • the user 10 can determine the appropriate behavior content to improve at least one of exercise and diet.
  • Further, the feature amount of walking is at least one of walking speed, step length, stride length, walking cycle, cadence, the difference between left and right steps, sway of the trunk during walking, and the amount of change in joint angle.
  • the feature amount of walking can be expressed quantitatively and concretely, so that it is possible to recommend more appropriate action contents for the user 10. Therefore, it is possible to encourage the user 10 to take effective actions for promoting the health of the user 10.
  • physical condition is at least one of balance, muscle strength, endurance, agility and cognitive function.
  • the physical condition can be expressed quantitatively and concretely, so that the user 10 can recommend more appropriate action contents. Therefore, it is possible to encourage the user 10 to take effective actions for promoting the health of the user 10.
  • the measurement unit 23 includes a vital measurement unit 25 that measures the vital data of the user 10.
  • the storage unit 22 further stores time-series data of vital data.
  • the analysis unit 33 further determines the action content based on the time series data of the vital data.
  • the vital data is at least one of the user 10's body weight, blood pressure, pulse, urine pH and urine sugar.
  • the vital data can be represented by a specific value, so that it is possible to recommend more appropriate action contents for the user 10. Therefore, it is possible to encourage the user 10 to take effective actions for promoting the health of the user 10.
  • the measuring unit 23 includes an imaging unit 24a (camera 12) for photographing the user 10, and measures the feature amount using the moving image obtained by the camera 12.
  • Further, the behavior support method according to the present embodiment includes a step of measuring the walking feature amount of the user 10, a step of generating time-series data of the feature amount by storing the measured feature amount in the storage unit 22, a step of estimating the change in the physical condition of the user 10 based on the time-series data, a step of determining the behavior content to be recommended to the user 10 based on the estimation result and the behavioral habit of the user 10, and a step of presenting the determined behavior content to the user 10.
  • the user 10 can be encouraged to take effective actions for promoting health.
  • For example, the recommended content for the user 10 need not include one or both of the meal content and the exercise content.
  • Further, the measurement unit 23 need not measure one or both of the vital data and the activity amount of the user 10.
  • the communication method between the devices described in the above embodiment is not particularly limited.
  • the wireless communication method is, for example, short-range wireless communication such as ZigBee (registered trademark), Bluetooth (registered trademark), or wireless LAN (Local Area Network).
  • the wireless communication method may be communication via a wide area communication network such as the Internet.
  • wired communication may be performed between the devices instead of wireless communication.
  • the wired communication is a power line communication (PLC: Power Line Communication) or a communication using a wired LAN.
  • Another processing unit may execute the processing executed by a specific processing unit. Further, the order of the plurality of processes may be changed, or the plurality of processes may be executed in parallel. Further, the distribution of the components of the behavior support system 1 to a plurality of devices is merely an example.
  • another device may include the components of one device.
  • the terminal device 40 or the server device 30 may include the components included in the measuring device 20.
  • the terminal device 40 may include a measuring unit 23 or an estimating unit 28.
  • the terminal device 40 is a part of the components included in the measurement unit 23, specifically, at least one of the imaging unit 24a, the acceleration sensor 24b, the vital measurement unit 25, the determination unit 26, and the activity measurement unit 27. May be provided.
  • the server device 30 may include a determination unit 26 or an estimation unit 28.
  • the measuring device 20 or the terminal device 40 may include at least one of the database unit 32 and the analysis unit 33 included in the server device 30.
  • the measuring device 20 may include at least one of the input unit 42 and the presentation unit 43 included in the terminal device 40.
  • Further, the behavior support system 1 may be realized as a single device.
  • The processing described in the above embodiment may be realized by centralized processing using a single device (system), or may be realized by distributed processing using a plurality of devices. Further, the number of processors that execute the above program may be singular or plural. That is, centralized processing may be performed, or distributed processing may be performed.
  • All or a part of the components such as the control unit may be configured by dedicated hardware, or may be realized by executing a software program suitable for each component. Each component may be realized by a program execution unit, such as a CPU (Central Processing Unit) or a processor, reading and executing a software program recorded on a recording medium such as an HDD (Hard Disk Drive) or a semiconductor memory.
  • a component such as a control unit may be composed of one or a plurality of electronic circuits.
  • the one or more electronic circuits may be general-purpose circuits or dedicated circuits, respectively.
  • One or more electronic circuits may include, for example, a semiconductor device, an IC (Integrated Circuit), an LSI (Large Scale Integration), or the like.
  • the IC or LSI may be integrated on one chip or may be integrated on a plurality of chips. Here, it is called IC or LSI, but the name changes depending on the degree of integration, and it may be called system LSI, VLSI (Very Large Scale Integration), or ULSI (Ultra Large Scale Integration).
  • An FPGA (Field Programmable Gate Array) programmed after the LSI is manufactured can also be used for the same purpose.
  • general or specific aspects of the present invention may be realized by a system, an apparatus, a method, an integrated circuit or a computer program.
  • They may also be realized by a computer-readable non-transitory recording medium, such as an optical disc, an HDD, or a semiconductor memory, in which the computer program is stored.
  • Further, they may be realized by any combination of a system, an apparatus, a method, an integrated circuit, a computer program, and a recording medium.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A behavior support system (1) comprising: a measurement unit (23) that measures a feature amount of the walking of a user (10); a storage unit (22) that stores time-series data of the feature amount measured by the measurement unit (23); an estimation unit (28) that estimates changes in the physical condition of the user (10) on the basis of the time-series data; an analysis unit (33) that determines behavior content to be recommended to the user (10) on the basis of the result of the estimation by the estimation unit (28) and the behavioral habits of the user (10); and a presentation unit (43) that presents the behavior content determined by the analysis unit (33) to the user (10).
PCT/JP2020/006339 2019-04-09 2020-02-18 Behavior support system and behavior support method WO2020208944A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080016860.8A CN113473901B (zh) 2019-04-09 2020-02-18 行动支持系统及行动支持方法
JP2021513507A JP7182319B2 (ja) 2019-04-09 2020-02-18 行動支援システム及び行動支援方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-074255 2019-04-09
JP2019074255 2019-04-09

Publications (1)

Publication Number Publication Date
WO2020208944A1 true WO2020208944A1 (fr) 2020-10-15

Family

ID=72751802

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/006339 WO2020208944A1 (fr) 2019-04-09 2020-02-18 Système support de comportement et procédé de support de comportement

Country Status (3)

Country Link
JP (1) JP7182319B2 (fr)
CN (1) CN113473901B (fr)
WO (1) WO2020208944A1 (fr)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002297775A (ja) * 2001-03-29 2002-10-11 Mizuno Corp 健康・消費情報管理装置,方法およびプログラム
JP3965299B2 (ja) * 2001-12-26 2007-08-29 三菱化学株式会社 心身状態の調整装置及びコンピュータ読み取り可能な記録媒体
AU2010286471B2 (en) * 2009-08-28 2015-05-07 Allen Joseph Selner Characterizing a physical capability by motion analysis
JP2012226564A (ja) * 2011-04-20 2012-11-15 Sony Corp 情報処理装置、情報処理方法、およびプログラム
JP5696222B2 (ja) * 2011-09-14 2015-04-08 株式会社Nttドコモ ダイエット支援システムおよびダイエット支援方法
JP2015181708A (ja) * 2014-03-24 2015-10-22 セイコーエプソン株式会社 運動提示装置、運動提示方法及び運動提示プログラム
KR20170109962A (ko) * 2016-03-22 2017-10-10 배재대학교 산학협력단 웹 기반 체질량 변화 모니터링 시스템 및 방법
JP2018007979A (ja) * 2016-07-15 2018-01-18 カシオ計算機株式会社 運動支援装置及び運動支援方法、運動支援プログラム
CN107731275A (zh) * 2017-09-25 2018-02-23 上海斐讯数据通信技术有限公司 一种动态调整健康锻炼计划的方法及系统

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016043081A1 (fr) * 2014-09-18 2016-03-24 一博 椎名 Dispositif d'enregistrement, terminal mobile, dispositif d'analyse, programme et support de stockage
JP2017097401A (ja) * 2015-11-18 2017-06-01 セイコーエプソン株式会社 行動変容解析システム、行動変容解析方法および行動変容解析プログラム
WO2018012071A1 (fr) * 2016-07-14 2018-01-18 ソニー株式会社 Système de traitement d'informations, support d'enregistrement, et procédé de traitement d'informations
WO2019187099A1 (fr) * 2018-03-30 2019-10-03 株式会社日立製作所 Dispositif d'aide à l'autonomie des fonctions corporelles et procédé associé

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FUKUNAGA, MASATAKA: "Extraction of walking motion and verification of feature stability in walking video", PROCEEDINGS OF THE 66TH NATIONAL CONFERENCE (2) ARTIFICIAL INTELLIGENCE AND COGNITIVE SCIENCE, 9 March 2004 (2004-03-09), pages 423, 424 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024034889A1 (fr) * 2022-08-12 2024-02-15 삼성전자주식회사 Procédé de détermination d'état de démarche, et procédé de réalisation de dispositif
CN115721269A (zh) * 2022-11-07 2023-03-03 四川大学华西医院 一种基于六分钟步行试验的重症康复评估训练系统

Also Published As

Publication number Publication date
JP7182319B2 (ja) 2022-12-02
CN113473901B (zh) 2024-09-13
JPWO2020208944A1 (ja) 2021-12-02
CN113473901A (zh) 2021-10-01

Similar Documents

Publication Publication Date Title
WO2020208945A1 (fr) Système et procédé de gestion de force musculaire
JP7127086B2 (ja) ヘルストラッキングデバイス
US11887496B2 (en) Methods and apparatus for coaching based on workout history and readiness/recovery information
US8712108B2 (en) Information processing apparatus, information outputting method and computer program storage device
CN107249435B (zh) 基于实时生理参数提供用户洞察力的系统和方法
CN109817302B (zh) 一种健身用专家系统
KR102080534B1 (ko) 맞춤형 건강관리 서비스 시스템
WO2020208944A1 (fr) Système support de comportement et procédé de support de comportement
CN107077711A (zh) 特性评价装置、特性评价系统、特性评价方法和特性评价程序
US20220323189A1 (en) Jaw movement analysis system
JP2019067392A (ja) 情報出力システム、情報出力方法及び情報出力プログラム
CN110415786A (zh) 信息处理装置及其工作方法
KR20160034199A (ko) 건강 관리 방법 및 장치
JP7256907B1 (ja) 情報処理プログラム、情報処理装置及び情報処理方法
KR102426924B1 (ko) 생체 임피던스 측정장치를 이용한 건강 관리시스템 및 건강 관리 장치
JP7203473B1 (ja) プログラム、情報処理方法及び情報処理装置
WO2023277156A1 (fr) Système d'amélioration des habitudes de vie, terminal portatif et procédé de commande
JP2023004124A (ja) 身体情報評価装置、身体情報評価方法、プログラム、および、記録媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20787608

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021513507

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20787608

Country of ref document: EP

Kind code of ref document: A1