CN113473901A - Action support system and action support method - Google Patents
- Publication number
- CN113473901A (application number CN202080016860.8A)
- Authority
- CN
- China
- Prior art keywords
- user
- unit
- action
- walking
- support system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A61B5/00: Measuring for diagnostic purposes; Identification of persons
- A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
Abstract
The action support system (1) includes: a measurement unit (23) that measures a feature quantity of walking of a user (10); a storage unit (22) that stores time-series data of the feature quantity measured by the measurement unit (23); an estimation unit (28) that estimates a change in the physical state of the user (10) based on the time-series data; an analysis unit (33) that determines action content to be recommended to the user (10) based on the result of the estimation by the estimation unit (28) and the action habits of the user (10); and a presentation unit (43) that presents the action content determined by the analysis unit (33) to the user (10).
Description
Technical Field
The present invention relates to an action support system and an action support method.
Background
For example, Patent Document 1 discloses a life management system that analyzes the mental state of a user in order to help the user achieve a goal.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2005-74107
Disclosure of Invention
Problems to be solved by the invention
There is a demand for prompting users to take actions that are effective for health promotion.
Therefore, an object of the present invention is to provide an action support system and an action support method that can prompt a user to take actions effective for health promotion.
Means for solving the problems
To achieve the above object, an action support system according to an aspect of the present invention includes: a measurement unit that measures a characteristic amount of walking of a user; a storage unit for storing time-series data of the feature amount measured by the measurement unit; an estimation unit configured to estimate a change in the physical state of the user based on the time-series data; an analysis unit configured to determine action content recommended to the user based on a result of the estimation by the estimation unit and an action habit of the user; and a presentation unit that presents the action content determined by the analysis unit to the user.
In addition, an action support method according to an aspect of the present invention includes: measuring a characteristic amount of walking of the user; a step of generating time-series data of the feature amount by storing the measured feature amount in a storage unit; a step of estimating a change in the physical state of the user based on the time-series data; determining action content recommended to the user based on the result of the estimation and the action habit of the user; and a step of presenting the determined action content to the user.
Further, an aspect of the present invention can be implemented as a program that causes a computer to execute the above-described action support method. Alternatively, this aspect can be implemented as a computer-readable non-transitory recording medium storing such a program.
Effects of the invention
According to the present invention, it is possible to urge the user to perform an effective action for health promotion.
Drawings
Fig. 1 is a diagram showing an outline of an action support system according to an embodiment.
Fig. 2 is a block diagram showing a functional configuration of an action support system according to the embodiment.
Fig. 3 is a flowchart showing an operation of the action support system according to the embodiment.
Fig. 4 is a diagram showing an example of an input screen for a user to input an action habit, which is displayed in the action support system according to the embodiment.
Fig. 5 is a diagram showing a body state estimation table stored in the action support system according to the embodiment.
Fig. 6 is a diagram showing an exercise advice table generated by the action support system according to the embodiment.
Fig. 7 is a diagram showing a diet advice table generated by the action support system according to the embodiment.
Fig. 8 is a diagram showing an example of the exercise content recommended to the user and displayed in the action support system according to the embodiment.
Fig. 9 is a diagram showing an example of the diet content and advertisement information recommended to the user and displayed in the action support system according to the embodiment.
Fig. 10 is a diagram showing an example of a search result of action content recommended to a user displayed in the action support system according to the embodiment.
Detailed Description
Hereinafter, an action support system and an action support method according to embodiments of the present invention will be described in detail with reference to the drawings. The embodiments described below each show a specific example of the present invention. Therefore, the numerical values, shapes, materials, constituent elements, the arrangement and connection of the constituent elements, the steps, the order of the steps, and the like shown in the following embodiments are examples and do not limit the present invention. Accordingly, among the constituent elements in the following embodiments, constituent elements not recited in the independent claims, which represent the broadest concept of the present invention, are described as optional constituent elements.
The drawings are schematic and are not necessarily drawn to scale. Therefore, for example, the scales do not always match between drawings. In the drawings, substantially identical components are denoted by the same reference numerals, and redundant description is omitted or simplified.
(Embodiment)
[ Overview ]
First, an outline of the action support system according to the embodiment will be described with reference to fig. 1. Fig. 1 is a diagram showing an outline of an action support system 1 according to the present embodiment.
The action support system 1 shown in fig. 1 measures a feature quantity of walking of the user 10 and analyzes time-series data of the measured feature quantity, thereby estimating a change in the physical state of the user 10. The walking feature quantity is, for example, at least one of walking speed, stride length, stride width, walking cycle, walking frequency, left-right step difference, trunk movement during walking, and the amount of change in joint angle. The physical state is at least one of balance, muscle strength, endurance, agility, and cognitive function. The action support system 1 then determines action content to be recommended to the user 10 as recommended content based on the result of the estimation and the action habits of the user 10, and presents the determined recommended content to the user 10. An action habit is a habit relating to at least one of exercise and diet. The recommended content includes exercise content and diet content recommended to the user 10.
In addition, in the action support system 1 according to the present embodiment, vital sign data of the user 10 is measured in addition to the walking feature quantity. The vital sign data is at least one of the weight, blood pressure, pulse, urine pH, and urine glucose of the user 10.
The walking feature quantity and the vital sign data are measured while the user 10 goes about daily life, for example, by using the camera 12, the toilet 13, and the like provided in the house 11 of the user 10. That is, the walking feature quantity and the vital sign data of the user 10 are measured without the user 10 being particularly aware of it. The user 10 may also intentionally measure the feature quantity and the vital sign data using a scale, a sphygmomanometer, or the like.
Information indicating the change in the physical state of the user 10 estimated from the measured feature quantity and vital sign data is transmitted to the server device 30 via a network such as a local area network (LAN) or the Internet. The server device 30 analyzes the change in the physical state of the user 10 and determines the action content to be recommended to the user 10 as recommended content.
The determined recommended content is transmitted via the network to the terminal device 40 or the like carried by the user 10. The terminal device 40 presents the recommended content to the user 10 as images or sound. In this way, exercise content and diet content appropriate for the user 10 can be proposed to the user 10 in order to strengthen or maintain muscle strength, for example. The action support system 1 according to the present embodiment can thus prompt the user 10 to take actions effective for health promotion.
[ Structure ]
Hereinafter, a specific configuration of the action support system 1 according to the present embodiment will be described with reference to fig. 2. Fig. 2 is a block diagram showing a functional configuration of the action support system 1 according to the present embodiment.
As shown in fig. 2, the action support system 1 includes a measurement device 20, a server device 30, and a terminal device 40. The measurement device 20, the server device 30, and the terminal device 40 are connected to each other so as to be communicable via the internet or the like.
[ measurement device ]
The measurement device 20 is a device that estimates a change in the physical state of the user 10. The measurement device 20 is implemented by, for example, a microcomputer and various sensor devices, and is installed in the house 11 of the user 10. As shown in fig. 2, the measurement device 20 includes a communication unit 21, a storage unit 22, a measurement unit 23, and an estimation unit 28.
The communication unit 21 is implemented by 1 or more communication interfaces that perform wireless communication or wired communication. The wireless communication is, for example, communication based on communication standards such as Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), and the like, but is not limited thereto.
The communication unit 21 performs transmission and reception of information and signals through communication with the server device 30 and the terminal device 40, respectively. For example, the communication unit 21 transmits the estimation result of the change in the physical state of the user 10 estimated by the estimation unit 28 to the server device 30. The communication unit 21 transmits the time-series data of the vital sign data and the exercise history to the server device 30.
The storage unit 22 is a nonvolatile storage device such as a semiconductor memory such as a flash memory or an HDD (Hard Disk Drive). The storage unit 22 stores time-series data of the feature amount measured by the measurement unit 23. The storage unit 22 stores time-series data of the vital sign data measured by the measurement unit 23. The storage unit 22 also stores the exercise history of the user 10 generated based on the activity measured by the measurement unit 23. The storage unit 22 may store the recommended content transmitted from the server device 30.
The measurement unit 23 measures the walking feature quantity of the user 10. The measurement unit 23 also measures the vital sign data and the activity amount of the user 10. As shown in fig. 2, the measurement unit 23 includes a detection unit 24, a vital sign measurement unit 25, a determination unit 26, and an activity amount measurement unit 27. The measurement unit 23 need not include one or both of the vital sign measurement unit 25 and the activity amount measurement unit 27.
The detection unit 24 detects the movement of the user 10. Specifically, the detection unit 24 includes an imaging unit 24a and an acceleration sensor 24b. The detection unit 24 need not include both the imaging unit 24a and the acceleration sensor 24b; one of them may be omitted.
The imaging unit 24a is realized by the camera 12 shown in fig. 1, for example. The imaging unit 24a generates a moving image by imaging the user 10. Specifically, the imaging unit 24a photographs the walking motion of the user 10. For example, as shown in fig. 1, a camera 12 is installed at an entrance of a house 11 and photographs a user 10 who enters and exits the house 11. Alternatively, the camera 12 may be installed in a corridor, a step, or a room in the house 11. The moving image captured by the camera 12 (imaging unit 24a) includes a walking motion of the user 10.
The acceleration sensor 24b is carried by the user 10 and detects acceleration corresponding to the movement of the user 10. For example, the acceleration sensor 24b is attached to a part of the body of the user 10 such as an arm, the waist, a foot, the neck, or the head. For example, the acceleration sensor 24b is fixed to a mounting member attached to that body part, such as a wristband or a belt, and is thereby fixed to the body of the user 10. The number of acceleration sensors 24b attached to the user 10 is not limited to one and may be more than one. The acceleration sensor 24b detects the movement of the part to which it is attached and generates three-dimensional acceleration data. The three-dimensional acceleration data indicates, for example, the acceleration of the user 10 in each of the front-back, up-down, and left-right directions.
The vital sign measurement unit 25 measures the vital sign data of the user 10. For example, the vital sign measurement unit 25 is realized by at least one of a scale, a body fat meter, a blood pressure meter, a pulse meter, a urine test device, and the like. The scale and body fat meter are, for example, embedded in the floor in front of the toilet in the house 11 and measure the weight, body fat, and the like of the user 10 standing on them. Thus, the weight and the like of the user 10 can be measured each time the user 10 uses the toilet, without the user 10 being aware of it. The urine test device is attached to, for example, the toilet 13 shown in fig. 1 and measures at least one of the urine pH and urine glucose of the user 10. The blood pressure meter and the pulse meter are, for example, wrist-worn devices that can be worn on the arm of the user 10 and measure the blood pressure and pulse of the user 10.
The determination unit 26 determines the walking feature quantity of the user 10 based on the movement detected by the detection unit 24. For example, the determination unit 26 determines the walking feature quantity by analyzing the moving image obtained by the imaging unit 24a. Specifically, the walking feature quantity is at least one of walking speed, stride length, stride width, walking cycle, walking frequency, left-right step difference, trunk movement during walking, and the amount of change in joint angle.
Alternatively, the determination unit 26 may estimate the feature quantity using the acceleration detected by the acceleration sensor 24b. When the acceleration sensor 24b is attached to the waist of the user 10, the movement of the waist of the user 10 during walking is detected. The movement of the waist has a predetermined correlation with walking feature quantities such as walking balance and walking speed. Therefore, the determination unit 26 determines the walking feature quantity by referring to correspondence information indicating the correspondence between the waist movement and the walking feature quantity. This correspondence information is stored in advance in the storage unit 22, for example. The correspondence information may instead be stored in the database unit 32 of the server device 30.
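As a rough illustration of how such correspondence information could be used, the following Python sketch maps the amplitude of vertical waist acceleration to an estimated walking speed through a lookup table; the table ranges, speed values, and function name are hypothetical and are not taken from the patent.

```python
# Illustrative sketch only: maps the amplitude of vertical waist acceleration to an
# estimated walking speed via hypothetical correspondence information.
from typing import List, Tuple

# Hypothetical correspondence table: (min_amplitude, max_amplitude, walking_speed_m_per_s)
CORRESPONDENCE_TABLE: List[Tuple[float, float, float]] = [
    (0.0, 1.0, 0.6),   # small waist movement -> slow walking
    (1.0, 2.5, 1.0),   # moderate waist movement -> average walking speed
    (2.5, 9.9, 1.4),   # large waist movement -> fast walking
]

def estimate_walking_speed(vertical_acc: List[float]) -> float:
    """Estimate walking speed [m/s] from vertical waist-acceleration samples [m/s^2]."""
    amplitude = max(vertical_acc) - min(vertical_acc)
    for lo, hi, speed in CORRESPONDENCE_TABLE:
        if lo <= amplitude < hi:
            return speed
    return CORRESPONDENCE_TABLE[-1][2]

print(estimate_walking_speed([-0.8, 0.3, 0.9, -0.5, 0.7]))  # amplitude 1.7 -> 1.0 m/s
```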
The determination unit 26 is implemented by a microcomputer, for example. Specifically, the determination unit 26 is implemented by a nonvolatile memory in which a program is stored, a volatile memory which is a temporary storage area for executing the program, an input/output port, a processor for executing the program, and the like.
The activity amount measurement unit 27 measures the activity amount of the user 10. The activity amount measurement unit 27 is, for example, an activity meter, but may be a pedometer that measures the number of steps of the user 10. The activity meter or pedometer is carried by the user 10 throughout the day and measures the daily number of steps or activity amount of the user 10. The measured number of steps or activity amount is stored in the storage unit 22 as an exercise history. The exercise history is obtained, for example, by continuously measuring the daily number of steps or activity amount over a plurality of days.
The estimation unit 28 estimates a change in the physical state of the user 10 based on the time-series data measured by the measurement unit 23 and stored in the storage unit 22. Specifically, the estimation unit 28 estimates a change in the physical state, which is at least one of balance, muscle strength, endurance, agility, and cognitive function, based on the time-series data of the characteristic amount of walking.
The characteristic amount of walking and the physical state have a predetermined correlation. Therefore, the estimation unit 28 determines the physical state of the user 10 from the walking feature amount by referring to the correspondence information indicating the correspondence relationship between the walking feature amount and the physical state. Correspondence information indicating the correspondence between the characteristic amount of walking and the physical state is stored in the storage unit 22 in advance, for example.
The estimation unit 28 estimates the change in the physical state by referring to the correspondence information using the time-series data of the walking feature quantity. The change is, for example, the change over a predetermined period such as several days, one week, one month, or one year. The estimation unit 28 estimates the change in the physical state of the user 10 based on the rate of change of the walking feature quantity (i.e., the amount of change divided by the period required for the change). For example, when the walking speed gradually decreases over a period of about one month, that is, when the rate of change is negative and its absolute value is smaller than a predetermined threshold value, the estimation unit 28 estimates that the muscle mass of the lower limbs has decreased due to frailty of the user 10. When the walking speed decreases sharply within a single day, that is, when the rate of change is negative and its absolute value is larger than the predetermined threshold value, the estimation unit 28 estimates that walking of the user 10 is temporarily impaired by an injury or the like. When the rate of change is positive, the estimation unit 28 estimates that the physical state is gradually improving. The estimation result obtained by the estimation unit 28 is transmitted to the server device 30 via the communication unit 21.
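The rate-of-change logic described above can be sketched as follows; the threshold value, the use of walking speed as the single feature quantity, and the (day, value) representation of the time-series data are assumptions made for illustration.

```python
# Minimal sketch of the rate-of-change classification described above (threshold is hypothetical).
from typing import List, Tuple

SUDDEN_DECLINE_THRESHOLD = 0.1  # hypothetical |change per day| separating gradual from sudden

def estimate_state_change(series: List[Tuple[int, float]]) -> str:
    """series: (elapsed_day, walking_speed) pairs in chronological order."""
    (d0, v0), (d1, v1) = series[0], series[-1]
    rate = (v1 - v0) / max(d1 - d0, 1)  # amount of change / period required for the change
    if rate > 0:
        return "physical state gradually improving"
    if abs(rate) < SUDDEN_DECLINE_THRESHOLD:
        return "gradual decline: lower-limb muscle mass may be decreasing (frailty)"
    return "sudden decline: walking may be temporarily impaired by injury or the like"

print(estimate_state_change([(0, 1.2), (30, 1.0)]))  # slow decline over about one month
```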
The estimation unit 28 is realized by, for example, a microcomputer. Specifically, the estimation unit 28 is realized by a nonvolatile memory storing a program, a volatile memory serving as a temporary storage area for executing the program, input/output ports, a processor that executes the program, and the like.
[ Server device ]
The server device 30 determines the action content recommended to the user 10 as the recommended content based on the change in the physical state estimated by the measurement device 20 and the action habit of the user 10. The server device 30 is, for example, a computer apparatus. As shown in fig. 2, the server device 30 includes a communication unit 31, a database unit 32, and an analysis unit 33.
The communication unit 31 is implemented by 1 or more communication interfaces that perform wireless communication or wired communication. The communication unit 31 performs transmission and reception of information and signals through communication with the measurement device 20 and the terminal device 40, respectively. For example, the communication unit 31 receives the estimation result of the physical state of the user 10 estimated by the measurement device 20 and transmitted via the communication unit 21. The communication unit 31 transmits the recommended content determined by the analysis unit 33 to the measurement device 20 or the terminal device 40. The communication unit 31 receives information such as the behavior habit of the user 10 acquired by the input unit 42 of the terminal device 40. The communication unit 31 may receive time-series data of the characteristic amount of walking and time-series data of the vital sign data from the measurement device 20.
The database unit 32 is a storage unit that stores the databases used by the action support system 1 to determine the action content to be recommended to the user 10 as recommended content. Specific examples of the databases will be described later. The database unit 32 may also store the estimation result of the change in the physical state of the user 10, the time-series data of the walking feature quantity, and the time-series data of the vital sign data. Further, the database unit 32 may store these data for each of a plurality of users 10. For example, by using the data of another person whose characteristics are similar to those of the user 10, the accuracy of the recommended content can be improved. The database unit 32 is realized by a nonvolatile storage device such as a semiconductor memory or an HDD.
The analysis unit 33 determines the action content recommended to the user 10 as the recommended content based on the result of the estimation of the physical state by the estimation unit 28 and the action habit of the user 10. The behavior habit of the user 10 is acquired by the input unit 42 of the terminal device 40 and received via the communication units 41 and 31, for example. A specific example of the method for determining the recommended content by the analysis unit 33 will be described later.
The analysis unit 33 is realized by a microcomputer, for example. Specifically, the analysis unit 33 is realized by a nonvolatile memory in which a program is stored, a volatile memory which is a temporary storage area for executing the program, an input/output port, a processor for executing the program, and the like.
[ terminal device ]
The terminal device 40 is a device that presents information to the user 10. The terminal device 40 also receives input of information from the user 10. For example, the terminal device 40 is a portable terminal that can be carried by the user 10, such as a smartphone. Alternatively, the terminal device 40 may be a display device such as a television set or an audio output device such as a smart speaker installed in the house 11 of the user 10. As shown in fig. 2, the terminal device 40 includes a communication unit 41, an input unit 42, a presentation unit 43, and a storage unit 44.
The communication unit 41 is implemented by 1 or more communication interfaces that perform wireless communication or wired communication. The communication unit 41 performs transmission and reception of information and signals through communication with the measurement device 20 and the server device 30, respectively. For example, the communication unit 41 receives the recommended content determined by the analysis unit 33 of the server device 30. The communication unit 41 transmits information such as the behavior habits of the user 10 acquired by the input unit 42 to the server device 30.
The input unit 42 receives an input from the user 10. The input unit 42 is, for example, a touch sensor or a physical button. The input unit 42 may be an audio input device such as a microphone.
The input unit 42 receives personal information of the user 10. The personal information includes the action habits, way of thinking, and lifestyle of the user 10, as well as attribute information such as the age and sex of the user 10. The personal information input via the input unit 42 is transmitted to the measurement device 20 or the server device 30 via the communication unit 41.
The presentation unit 43 presents the recommended content determined by the analysis unit 33 to the user 10. The presentation unit 43 also presents advertisement information of a business operator associated with the recommended content.
As shown in fig. 2, the presentation unit 43 has a display unit 43a. The display unit 43a is realized by, for example, a liquid crystal display panel, an organic EL (electroluminescence) display panel, or the like. The display unit 43a displays images indicating the recommended content and the advertisement information. The display unit 43a also generates and displays GUI (Graphical User Interface) objects, such as selection buttons, for accepting selections and operations from the user 10.
The presentation unit 43 may have a speaker that outputs sound instead of or in addition to the display unit 43a. The presentation unit 43 may output sound indicating the content presented to the user 10.
The storage unit 44 is a nonvolatile storage device such as a flash memory. The storage unit 44 stores the recommended content transmitted from the server device 30. The storage unit 44 also stores information such as the action habits of the user 10 received by the input unit 42, and stores, for example, the images displayed on the display unit 43a.
Although an example of the configuration of the action support system 1 has been described above with reference to fig. 2, the configuration of the action support system 1 is not limited to the example shown in fig. 2. For example, two of the measurement device 20, the server device 30, and the terminal device 40 need not be able to communicate directly. For example, the server device 30 and the terminal device 40 may be unable to communicate with each other; in this case, the server device 30 and the terminal device 40 may transmit and receive information and signals via the measurement device 20.
[ Operation ]
Next, the operation of the action support system 1 according to the present embodiment will be described with reference to fig. 3. Fig. 3 is a flowchart showing the operation of the action support system 1 according to the present embodiment.
As shown in fig. 3, first, the input unit 42 acquires personal information (S10). Specifically, as shown in fig. 4, the display unit 43a displays an input screen for the user 10 to input the behavior habit.
Fig. 4 is a diagram showing an example of the input screen, displayed in the action support system 1 according to the present embodiment, on which the user 10 inputs action habits. For example, the display unit 43a displays a questionnaire instruction 50 prompting the user 10 to answer. The display unit 43a also displays a plurality of question items 51 and a plurality of answer candidates 52 for each question item 51. The question items 51 are questions generated by the analysis unit 33 in order to grasp the action habits, way of thinking, and lifestyle of the user 10.
In fig. 4, a radio button 53 is displayed for each answer candidate 52 as an example of a GUI object for receiving a selection from the user 10. The user 10 can input an answer to a question item 51 by selecting the radio button 53 corresponding to an answer candidate 52. The method for inputting action habits is not limited to this; for example, the question item 51 and a text box may be displayed so that the user 10 enters an answer as text.
Next, the measurement unit 23 performs the measurement process (S20). Specifically, the detection unit 24 measures the walking feature quantity of the user 10 (S21). More specifically, the imaging unit 24a images the user 10 and acquires an image of the user 10 (S22). Next, the determination unit 26 determines the walking feature quantity of the user 10 based on the image obtained by the imaging unit 24a (S23). For example, the determination unit 26 obtains walking feature quantities such as the walking speed of the user 10 by image processing. Alternatively, or in addition to using the image obtained by the imaging unit 24a, the determination unit 26 may determine the walking feature quantity of the user 10 based on the acceleration data obtained by the acceleration sensor 24b.
The vital sign measurement unit 25 of the measurement unit 23 measures the vital sign data of the user 10 (S24). Further, the activity amount measurement unit 27 of the measurement unit 23 measures the activity amount of the user 10 (S25). The measurement of the walking feature quantity (S21), the measurement of the vital sign data (S24), and the measurement of the activity amount (S25) may be performed simultaneously, or any one of them may be performed first; the order of execution of these processes is not particularly limited. Each of these measurements may also be performed a plurality of times. The measurement values obtained by the plurality of measurements are stored in at least one of the storage unit 22 and the database unit 32 as time-series data of the feature quantity, time-series data of the vital sign data, and an exercise history. The acquisition of the personal information (S10) may be performed after the measurement process (S20) or during the measurement process (S20).
Next, the estimation unit 28 estimates a change in the physical state of the user 10 based on the measured time-series data of the characteristic amount of walking (S30). Next, the analysis unit 33 determines the action content recommended to the user 10 as the recommended content based on the estimation result obtained by the estimation unit 28 and the personal information acquired by the input unit 42 (S40). Specifically, the analysis unit 33 determines the exercise content and the diet content recommended to the user 10.
Next, the presentation unit 43 presents the determined recommended content to the user 10 (S50). Specifically, the display unit 43a displays the determined recommended content. Specific examples of the display will be described later.
[ Determination of exercise content ]
Next, a specific method of determining recommended content will be described. First, an example of a method for determining motion contents will be described with reference to fig. 5 and 6.
Fig. 5 is a diagram showing a body state estimation table stored in the action support system 1 according to the present embodiment. The body state estimation table is stored in the storage unit 22 of the measurement device 20. Alternatively, it may be stored in the database unit 32 of the server device 30.
As shown in fig. 5, a body state estimation table is prepared for each demographic attribute, such as "50-year-old female" and "60-year-old male". Demographic attributes are, for example, age and sex.
The body state estimation table associates one or more reference values with each physical state. In the example shown in fig. 5, reference values are associated with the five stages "A" to "E". The reference value of each stage may be expressed as a single value or as a range.
"a" to "E" correspond to the evaluation results of the physical state. For example, in the order of "a" to "E," a "indicates that the physical state is the best, and" E "indicates that the physical state is the worst, but the opposite may be true. For example, in the example shown in fig. 5, "C" indicates that the physical state is an average value, "a" and "B" indicate that the physical state is better than average, and "D" and "E" indicate that the physical state is worse than average.
In the present embodiment, the analysis unit 33 evaluates the physical state of the user 10 by referring to the body state estimation table based on the physical state of the user 10 estimated by the estimation unit 28. Specifically, the analysis unit 33 compares the physical state of the user 10 with the reference values in the body state estimation table corresponding to the age and sex of the user 10, and determines which of "A" to "E" each parameter of the user 10 corresponds to. For example, when the balance of the user 10 estimated by the estimation unit 28 is equal to or greater than P2 and less than P1, the analysis unit 33 determines that the evaluation of the balance of the user 10 is "B". The analysis unit 33 performs this evaluation for each parameter of the physical state. Thus, as shown in fig. 6, an evaluation result such as "B" or "A" is obtained for each parameter of the physical state of the user 10.
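A minimal sketch of this grading step is shown below; the demographic key, the parameter names, and the boundary values standing in for P1 to P4 are hypothetical placeholders for the reference values in fig. 5.

```python
# Illustrative sketch: grade a physical-state parameter against per-demographic
# reference boundaries (the boundary values here are hypothetical stand-ins for P1..P4).
BODY_STATE_TABLE = {
    ("female", "50s"): {
        "balance": [70, 60, 50, 40],   # boundaries P1..P4 separating grades A..E
        "muscle":  [65, 55, 45, 35],
    },
}

def grade(attribute, parameter, value):
    """Return 'A'..'E' by comparing the value with the stage boundaries."""
    boundaries = BODY_STATE_TABLE[attribute][parameter]
    for boundary, label in zip(boundaries, "ABCD"):
        if value >= boundary:
            return label
    return "E"

# e.g. a balance value that is >= P2 (60) and < P1 (70) is graded "B", as in the example above
print(grade(("female", "50s"), "balance", 63))  # -> "B"
```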
Fig. 6 is a diagram showing an exercise advice table generated by the action support system 1 according to the present embodiment. Specifically, fig. 6 shows the relationship between the evaluation result of the physical state obtained by the analysis unit 33 and the exercise content recommended to the user 10.
The exercise advice table shown in fig. 6 includes the difference from the standard value, the difference from the target value, and the emphasis rank. The difference from the standard value is the value obtained by subtracting the standard value from the value of each physical state estimated by the estimation unit 28. The standard value is, for example, the reference value of "C" and corresponds to the average for the same sex and age group. For example, a positive difference between the muscle strength and the standard value means that the muscle strength is higher than the average for the same sex and age group, and a negative difference means that it is lower. In the example shown in fig. 6, balance, flexibility, agility, and endurance are each higher than the average for the same sex and age group.
The difference from the target value is the value obtained by subtracting the target value input by the user 10 from the value of the estimated physical state. For example, a positive difference between the muscle strength and the target value means that the muscle strength is above the target value, and a negative difference means that it is below the target value. In the example shown in fig. 6, balance and muscle strength are each lower than their target values.
The emphasis rank indicates the order in which the physical states should be emphasized when determining the exercise content to be recommended to the user 10. The emphasis rank is determined based on, for example, the sum of the difference from the standard value and the difference from the target value; the emphasis rank follows the ascending order of this sum. For example, in the example shown in fig. 6, the sum of the two differences for muscle strength is "-1", the smallest, so its emphasis rank is "1", the highest. The sum of the two differences for balance is "0", the second smallest, so its emphasis rank is "2".
The method of determining the emphasis rank is not limited to this. For example, the analysis unit 33 may raise the rank of a physical state whose evaluation result is poor. Specifically, the emphasis rank may be determined based only on the difference from the standard value. For example, in the example shown in fig. 6, the evaluation of the muscle strength is "C", the lowest, so its emphasis rank is "1", the highest. In this case, when a plurality of physical states have the same evaluation result, the analysis unit 33 may raise the rank of the physical state whose difference from the target value is smaller.
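The sum-based ranking can be sketched as follows; only the sums for muscle strength (-1) and balance (0) come from the example above, and the remaining differences are hypothetical.

```python
# Sketch of the emphasis-rank calculation: states whose (difference from the standard
# value + difference from the target value) is smallest are ranked first.
diffs = {
    # state: (difference from standard value, difference from target value)
    "muscle strength": (0, -1),
    "balance":         (1, -1),
    "flexibility":     (2,  1),   # hypothetical
    "agility":         (1,  2),   # hypothetical
    "endurance":       (3,  0),   # hypothetical
}

ranked = sorted(diffs, key=lambda s: sum(diffs[s]))  # ascending sum -> highest emphasis first
for rank, state in enumerate(ranked, start=1):
    print(rank, state, sum(diffs[state]))
# muscle strength (sum -1) gets rank 1, balance (sum 0) gets rank 2, ...
```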
In the present embodiment, the analysis unit 33 determines the exercise content based on the emphasis rank and the difference from the standard value or the target value of each of the plurality of physical states. For example, the database unit 32 stores an exercise database in which each physical state is associated with exercise content suitable for improving or maintaining that physical state. In the exercise database, the amount of increase or decrease in the physical state is associated with the exercise intensity and amount of exercise of the exercise content. The analysis unit 33 refers to the exercise database and determines exercise content suitable for improving a physical state with a high emphasis rank as the exercise content to be recommended to the user 10.
The analysis unit 33 also determines the exercise content to be recommended to the user 10 based on the exercise habits of the user 10. The exercise habits are, for example, at least one of whether the user 10 exercises on a daily basis and the frequency and amount of exercise performed by the user 10. The analysis unit 33 determines the exercise intensity and amount of exercise of the recommended exercise content based on the exercise habits. For example, when the user 10 exercises on a daily basis, the analysis unit 33 determines exercise content with a high exercise intensity or a large amount of exercise. When the user 10 hardly exercises, the analysis unit 33 determines exercise content with a low exercise intensity or a small amount of exercise. The database unit 32 stores, for example, a database in which exercise habits are associated with recommended exercise content, exercise intensity, and amount of exercise, and the analysis unit 33 determines exercise content suitable for the exercise habits of the user 10 by referring to this database. The exercise habits may also include the exercise preferences of the user 10. For example, when there are a plurality of candidates for the exercise content to be recommended to the user 10, the analysis unit 33 may determine the exercise content that matches the preferences of the user 10.
For example, the analysis unit 33 determines an upper limit value of the exercise intensity based on the vital sign data, and determines exercise content whose exercise intensity is lower than the determined upper limit value. The upper limit value of the exercise intensity is the upper limit of the range in which health problems do not occur due to an excessive load on the body. That is, if exercise is performed at an intensity not exceeding the upper limit value, the possibility of a health problem occurring is sufficiently low. The upper limit value is determined based on the vital sign data. For example, when the blood pressure or pulse is high, the upper limit value is lowered; when the blood pressure or pulse is low, the upper limit value is raised. In the present embodiment, the analysis unit 33 determines the upper limit value of the exercise intensity of the user 10 based on the vital sign data.
The analysis unit 33 may also determine a lower limit value of the exercise intensity based on physical ability. The lower limit value of the exercise intensity corresponds to the minimum amount of exercise required to strengthen or maintain muscle strength. That is, exercise at an intensity below the lower limit value does not contribute to strengthening or maintaining muscle strength. The lower limit value depends on physical ability: the higher the physical ability, the higher the lower limit value, and the lower the physical ability, the lower the lower limit value. Further, when the temporal amount of change in the physical state is larger than a threshold value, the analysis unit 33 estimates, based on the estimation result of the estimation unit 28, that the user 10 is injured or the like, and may therefore determine exercise content with a sufficiently low exercise intensity.
The analysis unit 33 may correct the upper limit value and the lower limit value based on the individual characteristics of the user 10. Specifically, the analysis unit 33 may correct at least one of the upper limit value and the lower limit value based on at least one of the time-series data of the feature quantity and the exercise history. For example, the analysis unit 33 may determine, based on the time-series data of the feature quantity and the exercise history, whether the user 10 has a constitution that gains muscle easily, and change the lower limit value based on the determination result.
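The following sketch illustrates selecting exercise content whose intensity lies between a lower limit derived from physical ability and an upper limit derived from vital sign data; the blood-pressure and pulse thresholds, the intensity scale, and the candidate exercises are all assumptions made for illustration.

```python
# Sketch of picking exercise content whose intensity falls between a lower limit set by
# physical ability and an upper limit set by vital signs (all numbers are hypothetical).
CANDIDATES = [
    ("stretching", 2.0),
    ("walking", 3.5),
    ("squats", 5.0),
    ("jogging", 7.0),
]

def upper_limit(systolic_bp: float, pulse: float) -> float:
    # Higher blood pressure or pulse -> lower permissible intensity.
    return 4.0 if systolic_bp >= 140 or pulse >= 90 else 6.0

def lower_limit(physical_ability: float) -> float:
    # Higher ability -> a higher minimum intensity is needed to strengthen or maintain muscle.
    return 3.0 if physical_ability >= 0.5 else 2.0

def recommend(systolic_bp, pulse, physical_ability):
    lo, hi = lower_limit(physical_ability), upper_limit(systolic_bp, pulse)
    return [name for name, intensity in CANDIDATES if lo <= intensity <= hi]

print(recommend(systolic_bp=145, pulse=80, physical_ability=0.7))  # -> ['walking']
```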
[ Determination of diet content ]
Next, an example of a method for determining the diet content will be described with reference to fig. 7.
Fig. 7 is a diagram showing a diet advice table generated by the action support system 1 according to the present embodiment. Specifically, fig. 7 shows the relationship between the evaluation result of the physical state by the analysis unit 33 and the dietary content recommended to the user 10.
The diet advice table shown in fig. 7 includes, for the calories of the diet and each of a plurality of nutrients, an intake amount and a reference value. The intake amount is determined based on, for example, the eating habits of the user 10. Specifically, the intake amount is a self-reported value based on what the user 10 inputs via the input unit 42 as the user's eating habits. The reference value is, for example, a value determined for each demographic attribute.
The analysis unit 33 determines a recommended value for the calories and each of the plurality of nutrients based on the difference between the intake amount and the reference value. The recommended value corresponds to the difference between the reported intake amount of the user 10 and the intake amount that the user 10 should take. That is, for the calories and each nutrient, the recommended value represents the amount by which the intake should be increased or decreased compared with the amount the user 10 currently takes.
In the diet advice table, an additional criterion is associated with each combination of a physical state and the calories or one of the nutrients of the diet. The additional criterion is determined based on the evaluation result of the physical state and is used to correct the intake amount of the corresponding calories or nutrient that the user 10 should take. Specifically, if the additional criterion is "+", the analysis unit 33 increases the intake amount that the user 10 should take; if it is "-", the analysis unit 33 decreases it; and if it is "0", the analysis unit 33 does not correct it.
For example, in the case of protein, the intake amount of the user 10 is "40" and the reference value is "50", so the difference is "-10". That is, the protein intake falls short by "10", and the recommended value is therefore "+10". Furthermore, the additional criterion for protein is "+" for each of balance, muscle strength, and endurance. The analysis unit 33 therefore increases the recommended value from "+10" to, for example, "+20". The amount of this correction is determined based on, for example, the number of "+" marks. Thus, the user 10 learns that, in order to further improve balance, muscle strength, and endurance, an amount "+20" greater than the current protein intake should be taken.
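The recommended-value calculation for a single nutrient can be sketched as follows; the correction table keyed by the number of "+" marks is a hypothetical choice made so that the protein example above (+10 corrected to +20) is reproduced.

```python
# Sketch of the recommended-value calculation for one nutrient, assuming a hypothetical
# correction table keyed by the net number of "+" (minus "-") additional criteria.
CORRECTION = {0: 0, 1: 5, 2: 8, 3: 10}  # hypothetical mapping: net "+" count -> extra amount

def recommended_value(intake, reference, criteria):
    base = reference - intake                        # shortfall (+) / excess (-) vs. reference
    net = criteria.count("+") - criteria.count("-")
    sign = 1 if net >= 0 else -1
    return base + sign * CORRECTION[min(abs(net), 3)]

# Protein example: intake 40, reference 50, "+" for balance, muscle strength, and endurance
print(recommended_value(40, 50, ["+", "+", "+"]))    # -> 20, i.e. "+20"
```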
The analysis unit 33 determines the dietary content recommended to the user 10 based on the recommended value of each of the calories and the plurality of nutrients. For example, the database unit 32 stores a diet database in which calories and a plurality of nutrients are associated with each diet menu or food. The analysis unit 33 refers to the diet database to determine an appropriate diet menu or food corresponding to the recommended value as the diet content recommended to the user 10.
For example, the analysis unit 33 determines, based on the vital sign data, a food that should not be recommended to the user 10 as an excluded food, and determines diet content from foods other than the excluded food. For example, when the blood glucose level, which is an example of vital sign data, is equal to or higher than a threshold value, the analysis unit 33 designates foods containing a large amount of sugar as excluded foods. For example, the database unit 32 stores a food database in which threshold values of vital sign data are associated with excluded foods that should not be taken when the measured value is above or below the threshold. The analysis unit 33 determines the excluded foods by referring to the food database based on the vital sign data of the user 10, and determines diet content from foods other than the excluded foods.
Here, the intake of calories and nutrients has been described as an example of eating habits, but the eating habits are not limited to this. The eating habits may also include the food preferences of the user 10. For example, when there are a plurality of candidates for the diet content to be recommended to the user 10, the analysis unit 33 may preferentially determine the diet content that matches the preferences of the user 10.
[ presentation example of recommended content ]
Next, a specific example of presentation of recommended content to the user 10 by the action support system 1 according to the present embodiment will be described.
Fig. 8 is a diagram showing an example of the exercise content 60 recommended to the user 10 and displayed in the action support system 1 according to the present embodiment. In the example shown in fig. 8, the name of the exercise, "walking", is displayed as the exercise content 60 recommended to the user 10. As specific details of the exercise content 60, an exercise intensity 61 indicating a target walking speed and an amount of exercise 62 indicating a walking time are displayed as text.
The display unit 43a may also display an image or video showing a specific example of how to perform the exercise content 60. Alternatively, in addition to the text indicating the exercise content 60, the display unit 43a may display the URL (Uniform Resource Locator) of a website that streams a video showing the specific example.
In addition to the exercise content 60, as shown in fig. 9, the display unit 43a may display the diet content 70 recommended to the user 10. Fig. 9 is a diagram showing an example of the diet content 70 and advertisement information 71 recommended to the user 10 and displayed in the action support system 1 according to the present embodiment.
In the example shown in fig. 9, a diet menu of "fish dishes" is displayed as the diet content 70 recommended to the user 10. The advertisement information 71 is information advertising a product or service provided by a business operator associated with the diet content 70. The advertisement information 71 includes the name of the business operator, a URL, and the diet menus provided by the business operator. In this example, the advertisement information 71 includes a plurality of diet menus, but it may include only one.
The display unit 43a also displays order buttons 72. An order button 72 is displayed for each diet menu included in the advertisement information 71. The order button 72 is an example of a GUI object for accepting an order for a product or service from the user 10. When the user 10 selects an order button 72, an order for the corresponding meal is placed with the business operator via the input unit 42. For example, by registering the house 11 of the user 10 in advance, the ordered meal can be delivered to the house 11 and provided to the user 10.
Instead of a diet menu, the diet content 70 may indicate the calories and amounts of nutrients to be taken. In addition to the diet menu, the diet content 70 may include a recipe for preparing the meal indicated in the diet menu. The advertisement information 71 may then be advertisement information of a business operator who delivers the ingredients included in the recipe.
As shown in fig. 10, the display unit 43a may display the search result 80 on the internet as the action content recommended to the user 10. Fig. 10 is a diagram showing an example of a search result 80 of action content recommended to the user 10 displayed in the action support system 1 according to the present embodiment.
Specifically, the analysis unit 33 generates a search phrase to be used by a search engine based on the estimation result of the change in the physical state and the action habits, and supplies the search phrase to the search engine via the communication unit 31. For example, the analysis unit 33 generates a search phrase that includes the item name of a physical state with a high emphasis rank in the exercise advice table, a word associated with health promotion such as "improve" or "increase", and a term such as "exercise" or "diet". In the example shown in fig. 6, exercise aimed at improving "muscle strength" is recommended to the user 10, so the analysis unit 33 generates, for example, the search phrase "muscle strength improvement exercise". The search phrase may also include words indicating the amount of exercise and the exercise intensity. The search results obtained by the search engine are displayed on the display unit 43a as shown in fig. 10.
Similarly, for diet, the analysis unit 33 generates a search term that combines the name and amount of the nutrient that the user 10 should ingest more of with a diet-related word such as "food", "supplementary food", or "recipe". A term related to health promotion, such as "improving" or "increasing", may be included instead of the amount of the nutrient.
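The search-term generation described above can be sketched as follows; the vocabulary, the structure of the exercise advice table, and the function names are assumptions made only for illustration.

```python
from typing import Optional

def exercise_search_term(state_item: str,
                         promotion_word: str = "increasing") -> str:
    """Combine the physical-state item with a high emphasis rank (e.g.
    "muscle strength") and a health-promotion word with the word "exercise",
    as in the "muscle strength increasing exercise" example."""
    return f"{state_item} {promotion_word} exercise"

def diet_search_term(nutrient: str,
                     amount_g: Optional[float] = None,
                     diet_word: str = "recipe") -> str:
    """Combine the nutrient to be ingested more (optionally with its amount)
    with a diet-related word such as "food", "supplementary food", or "recipe"."""
    if amount_g is not None:
        return f"{nutrient} {amount_g:g} g {diet_word}"
    return f"{nutrient} increasing {diet_word}"

if __name__ == "__main__":
    print(exercise_search_term("muscle strength"))   # muscle strength increasing exercise
    print(diet_search_term("protein", amount_g=20))  # protein 20 g recipe
```

The resulting string would then be handed to a search engine via the communication unit 31, and the returned result displayed as the search result 80.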
[Effects and the like]
As described above, the action support system 1 according to the present embodiment includes: a measurement unit 23 that measures a feature amount of the walking of the user 10; a storage unit 22 that stores time-series data of the feature amount measured by the measurement unit 23; an estimation unit 28 that estimates a change in the physical state of the user 10 based on the time-series data; an analysis unit 33 that determines the action content recommended to the user 10 based on the result of the estimation performed by the estimation unit 28 and on the action habit of the user 10; and a presentation unit 43 that presents the action content determined by the analysis unit 33 to the user 10.
Thus, action content suitable for promoting the health of the user 10 is presented to the user 10 based on the estimation result of the change in the physical state of the user 10 and on the action habit. Since the user 10 can promote his or her health by acting in accordance with the presented action content, the possibility of suffering from a condition such as sarcopenia or frailty can be reduced. As described above, according to the present embodiment, the user 10 can be urged to take actions that are effective for health promotion.
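A minimal end-to-end sketch of this flow, from measurement through storage, estimation, and analysis to presentation, is shown below; the trend estimator, the thresholds, and the mapping to recommendations are illustrative assumptions, not the concrete algorithm of the embodiment.

```python
from statistics import mean
from typing import List

def estimate_change(series: List[float]) -> float:
    """Crude trend estimate: mean of the newer half minus mean of the older
    half of the time-series data (negative = the feature amount is declining)."""
    half = len(series) // 2
    return mean(series[half:]) - mean(series[:half])

def decide_action(trend: float, action_habit: str) -> str:
    """Map the estimated change and the action habit to recommended content."""
    if trend < 0 and action_habit == "little exercise":
        return "exercise content: walking, 20 min/day at a brisk pace"
    if trend < 0:
        return "diet content: protein-rich diet menu"
    return "keep current habits"

if __name__ == "__main__":
    storage: List[float] = []                                 # storage unit 22
    measured_walking_speed = [1.31, 1.30, 1.28, 1.27, 1.24, 1.22]  # m/s
    storage.extend(measured_walking_speed)                    # measurement unit 23
    trend = estimate_change(storage)                          # estimation unit 28
    recommendation = decide_action(trend, "little exercise")  # analysis unit 33
    print(f"trend={trend:+.3f} m/s -> {recommendation}")      # presentation unit 43
```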
For example, the action habit is a habit regarding at least one of exercise and diet.
Thus, action content suitable for improving at least one of exercise and diet can be determined for the user 10 based on the action habit regarding at least one of exercise and diet.
For example, the feature amount of walking is at least one of walking speed, stride length, stride width, walking cycle, walking frequency, left-right step difference, trunk movement during walking, and amount of change in joint angle.
This makes it possible to represent the feature amount of walking quantitatively and specifically, and thus to recommend action content better suited to the user 10. As a result, the user 10 can be urged to take actions effective for promoting his or her health.
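As an illustration only, several of the listed feature amounts could be computed from detected step timestamps and foot positions as sketched below; how the steps are detected is outside the scope of this sketch, and the input values are made up.

```python
from typing import Dict, List, Tuple

def _mean(xs: List[float]) -> float:
    return sum(xs) / len(xs)

def walking_features(step_times: List[float],
                     step_positions: List[Tuple[float, float]]) -> Dict[str, float]:
    """step_times: seconds of successive heel strikes;
    step_positions: (x, y) ground positions of those heel strikes in metres."""
    intervals = [t2 - t1 for t1, t2 in zip(step_times, step_times[1:])]
    step_lengths = [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                    for (x1, y1), (x2, y2) in zip(step_positions, step_positions[1:])]
    cycle = _mean(intervals)              # walking cycle [s/step]
    cadence = 60.0 / cycle                # walking frequency [steps/min]
    step_length = _mean(step_lengths)     # average step length [m]
    speed = step_length / cycle           # walking speed [m/s]
    # left-right step difference: compare alternating step lengths
    lr_difference = abs(_mean(step_lengths[0::2]) - _mean(step_lengths[1::2]))
    return {"walking_speed": speed, "step_length": step_length,
            "walking_cycle": cycle, "walking_frequency": cadence,
            "left_right_step_difference": lr_difference}

if __name__ == "__main__":
    times = [0.0, 0.55, 1.09, 1.65, 2.18, 2.74]
    positions = [(0.00, 0.00), (0.68, 0.05), (1.33, 0.00),
                 (2.02, 0.05), (2.66, 0.00), (3.36, 0.05)]
    print(walking_features(times, positions))
```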
Further, for example, the physical state is at least one of balance, muscle strength, endurance, agility, and cognitive function.
This makes it possible to express the physical state as a quantitative and specific value, and thus to recommend action content better suited to the user 10. As a result, the user 10 can be urged to take actions effective for promoting his or her health.
The measurement unit 23 includes, for example, a vital sign measurement unit 25 that measures vital sign data of the user 10. The storage unit 22 also stores time-series data of the vital sign data. The analysis unit 33 determines the action content based also on the time-series data of the vital sign data.
This makes it possible to judge the health status of the user 10, such as the presence or absence of illness, based on the vital sign data of the user 10. Therefore, appropriate exercise content and diet content can be presented within a range that does not impair the health of the user 10.
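For illustration, the analysis unit 33 might temper the exercise recommendation with recent vital sign data in a way such as the following; the blood-pressure thresholds and the substitution rules are assumptions, not values taken from the embodiment.

```python
from typing import List

def cap_exercise_by_blood_pressure(recommended: str,
                                   systolic_series: List[int]) -> str:
    """Lower the intensity of the recommended exercise content when the
    recent systolic blood pressure time-series is high (assumed thresholds)."""
    recent = systolic_series[-7:]                  # roughly the last week
    average = sum(recent) / len(recent)
    if average >= 160:
        return "light walking only; consult a physician before harder exercise"
    if average >= 140:
        return "moderate walking instead of: " + recommended
    return recommended

if __name__ == "__main__":
    blood_pressure = [128, 131, 133, 150, 149, 152, 151, 148]  # mmHg, systolic
    print(cap_exercise_by_blood_pressure("brisk walking, 30 min/day", blood_pressure))
```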
Further, for example, the vital sign data is at least one of the body weight, blood pressure, pulse, urine pH, and urine glucose of the user 10.
This makes it possible to represent the vital sign data as specific values, and thus to recommend action content better suited to the user 10. As a result, the user 10 can be urged to take actions effective for promoting his or her health.
The measurement unit 23 includes, for example, an imaging unit 24a (camera 12) that photographs the user 10, and measures the feature amount using a moving image obtained by the camera 12.
This makes it possible to measure the feature amount of walking easily, without the user 10 having to wear a special measuring device.
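One way such a camera-based measurement could work is sketched below: track the horizontal image position of the user across frames (for instance, the centre of a person bounding box from any detector) and convert the pixel displacement to metres. The frame rate and the pixel-to-metre scale are assumed calibration values, and no specific detection library is implied.

```python
from typing import List

def walking_speed_from_track(x_pixels: List[float], fps: float,
                             metres_per_pixel: float) -> float:
    """Average walking speed [m/s] from per-frame horizontal positions of the
    tracked user in the moving image."""
    elapsed = (len(x_pixels) - 1) / fps                                # [s]
    displacement = abs(x_pixels[-1] - x_pixels[0]) * metres_per_pixel  # [m]
    return displacement / elapsed

if __name__ == "__main__":
    # 2 seconds of frames at 30 fps, moving 4 px per frame (illustrative track)
    track = [100.0 + 4.0 * i for i in range(61)]
    speed = walking_speed_from_track(track, fps=30.0, metres_per_pixel=0.005)
    print(f"estimated walking speed: {speed:.2f} m/s")                 # ~0.60 m/s
```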
For example, the action support method according to the present embodiment includes: a step of measuring a feature amount of the walking of the user 10; a step of generating time-series data of the feature amount by storing the measured feature amount in the storage unit 22; a step of estimating a change in the physical state of the user 10 based on the time-series data; a step of determining the action content recommended to the user 10 based on the result of the estimation and on the action habit of the user 10; and a step of presenting the determined action content to the user 10.
As a result, as with the action support system 1 described above, the user 10 can be urged to take actions effective for health promotion.
(others)
The action support system and the action support method according to the present invention have been described above based on the embodiment, but the present invention is not limited to the embodiment described above.
For example, the content recommended to the user 10 need not include at least one of the diet content and the exercise content.
For example, the measurement unit 23 need not measure at least one of the vital sign data and the amount of activity of the user 10.
Note that the method of communication between the devices described in the above embodiment is not particularly limited. When wireless communication is performed between the devices, the wireless communication method (communication standard) is, for example, short-range wireless communication such as ZigBee (registered trademark), Bluetooth (registered trademark), or wireless LAN (Local Area Network). Alternatively, the wireless communication method (communication standard) may be communication via a wide-area communication network such as the internet. Wired communication may be performed between the devices instead of wireless communication. Specifically, the wired communication is power line communication (PLC), communication using a wired LAN, or the like.
In the above embodiment, processing executed by a specific processing unit may be executed by another processing unit. The order of the plurality of processes may be changed, and the plurality of processes may be executed in parallel. In addition, the distribution of the components of the action support system among the plurality of devices is merely an example. For example, the components included in one device may be included in another device. For example, the terminal device 40 or the server device 30 may include the components included in the measurement device 20. Specifically, the terminal device 40 may include the measurement unit 23 or the estimation unit 28. Alternatively, the terminal device 40 may include some of the components included in the measurement unit 23, specifically at least one of the imaging unit 24a, the acceleration sensor 24b, the vital sign measurement unit 25, the determination unit 26, and the activity measurement unit 27. For example, the server device 30 may include the determination unit 26 or the estimation unit 28. The measurement device 20 or the terminal device 40 may include at least one of the database unit 32 and the analysis unit 33 included in the server device 30. For example, the measurement device 20 may include at least one of the input unit 42 and the presentation unit 43 included in the terminal device 40. Furthermore, the action support system may also be implemented as a single device.
For example, the processing described in the above embodiment may be realized by centralized processing using a single device (system), or by distributed processing using a plurality of devices. The program may be executed by a single processor or by a plurality of processors; that is, the processing may be centralized or distributed.
In the above embodiment, all or part of the components such as the control unit may be configured by dedicated hardware, or may be realized by executing a software program suited to each component. Each component may be realized by a program execution unit, such as a CPU (Central Processing Unit) or other processor, reading out and executing a software program recorded in a recording medium such as an HDD (Hard Disk Drive) or a semiconductor memory.
The components such as the control unit may be constituted by one or more electronic circuits. Each of the one or more electronic circuits may be a general-purpose circuit or a dedicated circuit.
The one or more electronic circuits may include, for example, a semiconductor device, an IC (Integrated Circuit), or an LSI (Large Scale Integration). The IC or LSI may be integrated on a single chip or on a plurality of chips. Although they are referred to here as IC or LSI, the name varies depending on the degree of integration, and they may also be called a system LSI, a VLSI (Very Large Scale Integration), or a ULSI (Ultra Large Scale Integration). An FPGA (Field Programmable Gate Array) programmed after the manufacture of the LSI can also be used for the same purpose.
The present invention in its entirety or a specific aspect may be implemented by a system, an apparatus, a method, an integrated circuit, or a computer program. Alternatively, the present invention can be realized by a computer-readable non-transitory recording medium such as an optical disk, an HDD, or a semiconductor memory, in which the computer program is stored. Further, the present invention can be realized by any combination of a system, an apparatus, a method, an integrated circuit, a computer program, and a recording medium.
In addition, the present invention includes embodiments obtained by applying various modifications conceivable by those skilled in the art to the above embodiment, and embodiments obtained by arbitrarily combining the components and functions of the embodiment without departing from the scope of the present invention.
Description of the reference symbols
1 action support system
10 user
22, 44 storage unit
23 measurement unit
24 detection unit
24a imaging unit
25 vital sign measurement unit
28 estimation unit
32 database unit
33 analysis unit
43 presentation unit
60 exercise content
70 diet content
Claims (8)
1. An action support system comprising:
a measurement unit that measures a feature amount of walking of a user;
a storage unit that stores time-series data of the feature amount measured by the measurement unit;
an estimation unit that estimates a change in a physical state of the user based on the time-series data;
an analysis unit that determines action content recommended to the user based on a result of the estimation performed by the estimation unit and an action habit of the user; and
a presentation unit that presents the action content determined by the analysis unit to the user.
2. The action support system according to claim 1,
the action habit is a habit relating to at least one of exercise and diet.
3. The action support system according to claim 1 or 2,
the feature amount of the walking is at least one of walking speed, stride length, stride width, walking cycle, walking frequency, left-right step difference, trunk movement during walking, and amount of change in joint angle.
4. The action support system according to any one of claims 1 to 3,
the physical state is at least one of balance, muscle strength, endurance, agility, and cognitive function.
5. The action support system according to any one of claims 1 to 4,
the measurement unit includes a vital sign measurement unit that measures vital sign data of the user;
the storage unit further stores time-series data of the vital sign data; and
the analysis unit further determines the action content based on the time-series data of the vital sign data.
6. The action support system according to claim 5,
the vital sign data is at least one of the body weight, blood pressure, pulse, urine pH, and urine glucose of the user.
7. The action support system according to any one of claims 1 to 6,
the measurement unit includes a camera that photographs the user, and measures the feature amount using a moving image obtained by the camera.
8. An action support method comprising:
a step of measuring a feature amount of walking of a user;
a step of generating time-series data of the feature amount by storing the measured feature amount in a storage unit;
a step of estimating a change in a physical state of the user based on the time-series data;
a step of determining action content recommended to the user based on a result of the estimation and an action habit of the user; and
a step of presenting the determined action content to the user.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019074255 | 2019-04-09 | ||
JP2019-074255 | 2019-04-09 | ||
PCT/JP2020/006339 WO2020208944A1 (en) | 2019-04-09 | 2020-02-18 | Behavior support system and behavior support method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113473901A true CN113473901A (en) | 2021-10-01 |
CN113473901B CN113473901B (en) | 2024-09-13 |
Family
ID=72751802
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080016860.8A Active CN113473901B (en) | 2019-04-09 | 2020-02-18 | Action support system and action support method |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7182319B2 (en) |
CN (1) | CN113473901B (en) |
WO (1) | WO2020208944A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024034889A1 (en) * | 2022-08-12 | 2024-02-15 | 삼성전자주식회사 | Method for determining gait state, and device performing method |
CN115721269A (en) * | 2022-11-07 | 2023-03-03 | 四川大学华西医院 | Severe rehabilitation assessment training system based on six-minute walking test |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003190124A (en) * | 2001-12-26 | 2003-07-08 | Mitsubishi Chemicals Corp | Evaluating method for mental and physical conditions, control method for the same, evaluating device for the same, control device for the same and computer readable recording medium with recorded program thereon |
CN102438519A (en) * | 2009-08-28 | 2012-05-02 | 艾伦·约瑟夫·泽尔纳 | Characterizing a physical capability by motion analysis |
CN103503016A (en) * | 2011-09-14 | 2014-01-08 | 株式会社Ntt都科摩 | Diet support system and diet support method |
CN103518204A (en) * | 2011-04-20 | 2014-01-15 | 索尼公司 | Information processing device, information processing method, and program |
CN106132300A (en) * | 2014-03-24 | 2016-11-16 | 精工爱普生株式会社 | Motion suggestion device, motion reminding method and motion attention program |
KR20170109962A (en) * | 2016-03-22 | 2017-10-10 | 배재대학교 산학협력단 | System and method for monitoring change of body mass index web based |
JP2018007979A (en) * | 2016-07-15 | 2018-01-18 | カシオ計算機株式会社 | Exercise support apparatus, exercise support method, and exercise support program |
WO2018012071A1 (en) * | 2016-07-14 | 2018-01-18 | ソニー株式会社 | Information processing system, recording medium, and information processing method |
WO2019056605A1 (en) * | 2017-09-25 | 2019-03-28 | 上海斐讯数据通信技术有限公司 | Method and system for dynamically adjusting healthy exercise plan |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002297775A (en) | 2001-03-29 | 2002-10-11 | Mizuno Corp | Health/consumption information managing device, method, and program |
EP3196658A1 (en) * | 2014-09-18 | 2017-07-26 | Kunihiro Shiina | Recording device, mobile terminal, analysis device, program, and storage medium |
JP2017097401A (en) * | 2015-11-18 | 2017-06-01 | セイコーエプソン株式会社 | Behavior modification analysis system, behavior modification analysis method and behavior modification analysis program |
WO2019187099A1 (en) * | 2018-03-30 | 2019-10-03 | 株式会社日立製作所 | Bodily function independence assistance device and method therefor |
- 2020-02-18 JP JP2021513507A patent/JP7182319B2/en active Active
- 2020-02-18 WO PCT/JP2020/006339 patent/WO2020208944A1/en active Application Filing
- 2020-02-18 CN CN202080016860.8A patent/CN113473901B/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP7182319B2 (en) | 2022-12-02 |
CN113473901B (en) | 2024-09-13 |
JPWO2020208944A1 (en) | 2021-12-02 |
WO2020208944A1 (en) | 2020-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10950335B2 (en) | Health tracking device | |
WO2020208945A1 (en) | Muscle strength management system and muscle strength management method | |
US8712108B2 (en) | Information processing apparatus, information outputting method and computer program storage device | |
US8706731B2 (en) | System and method for providing healthcare program service based on vital signals and condition information | |
KR101592021B1 (en) | Personalized pregnancy, birth, postnatal care-related information providing service method, apparatus and system | |
CN107249435B (en) | System and method for providing user insight based on real-time physiological parameters | |
US10049598B1 (en) | Passive tracking and prediction of food consumption | |
EP3238611A1 (en) | A method and device for estimating a condition of a person | |
CN113473901B (en) | Action support system and action support method | |
CN111433859B (en) | Information processing device, method, and program | |
JP2014164411A (en) | Health management support system and program | |
KR102426924B1 (en) | Health management system and Health management apparatus using bio-impedance measuring apparatus | |
US20160374610A1 (en) | Hunger management | |
JP7193011B2 (en) | Nutrient Intake Estimation System, Nutrient Intake Estimation Method, Nutrient Intake Estimation Device, and Nutrient Intake Output Method | |
KR102395631B1 (en) | Personal dietarian management system using smart trays | |
JP2023004124A (en) | Body information evaluation device, method for evaluating body information, program, and recording medium | |
JP2022015735A (en) | Information processing device, information processing method, and program | |
JP2023114831A (en) | System, mobile terminal, server, information processing device, program, or method | |
KR20240024424A (en) | Personal custom healthcare system that utilizes personal health and medical data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||