WO2023286469A1 - Information processing method, information processing device, and information processing program - Google Patents
- Publication number
- WO2023286469A1 (PCT/JP2022/021650)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- risk
- height
- degree
- walking
- Prior art date
Links
- 230000010365 information processing Effects 0.000 title claims description 49
- 238000003672 processing method Methods 0.000 title claims description 28
- 230000009471 action Effects 0.000 claims description 73
- 230000006399 behavior Effects 0.000 claims description 59
- 238000012549 training Methods 0.000 claims description 43
- 230000009194 climbing Effects 0.000 claims description 42
- 230000007613 environmental effect Effects 0.000 claims description 29
- 238000000034 method Methods 0.000 claims description 18
- 230000008569 process Effects 0.000 claims description 12
- 238000007634 remodeling Methods 0.000 claims description 7
- 230000000694 effects Effects 0.000 claims description 5
- 230000006870 function Effects 0.000 claims description 4
- 238000010586 diagram Methods 0.000 description 20
- 238000000605 extraction Methods 0.000 description 16
- 230000000630 rising effect Effects 0.000 description 15
- 230000004884 risky behavior Effects 0.000 description 15
- 238000001514 detection method Methods 0.000 description 14
- 238000004891 communication Methods 0.000 description 13
- 238000012545 processing Methods 0.000 description 12
- 238000004140 cleaning Methods 0.000 description 10
- 238000005070 sampling Methods 0.000 description 6
- 239000000470 constituent Substances 0.000 description 5
- 238000009418 renovation Methods 0.000 description 5
- 238000005516 engineering process Methods 0.000 description 4
- 230000001133 acceleration Effects 0.000 description 3
- 239000000284 extract Substances 0.000 description 3
- 230000009474 immediate action Effects 0.000 description 3
- 206010017577 Gait disturbance Diseases 0.000 description 2
- 238000012937 correction Methods 0.000 description 2
- 230000007423 decrease Effects 0.000 description 2
- 238000013461 design Methods 0.000 description 2
- 230000002040 relaxant effect Effects 0.000 description 2
- 210000003371 toe Anatomy 0.000 description 2
- 208000010392 Bone Fractures Diseases 0.000 description 1
- 208000006670 Multiple fractures Diseases 0.000 description 1
- 208000027418 Wounds and injury Diseases 0.000 description 1
- 230000032683 aging Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 238000003287 bathing Methods 0.000 description 1
- 235000021152 breakfast Nutrition 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 230000006378 damage Effects 0.000 description 1
- 238000007405 data analysis Methods 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 230000005021 gait Effects 0.000 description 1
- 230000005802 health problem Effects 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 208000014674 injury Diseases 0.000 description 1
- 235000012054 meals Nutrition 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 238000005192 partition Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000002123 temporal effect Effects 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
- 230000002618 waking effect Effects 0.000 description 1
- 238000005406 washing Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/1117—Fall detection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/112—Gait analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6828—Leg
Definitions
- the present disclosure relates to technology for assisting human walking.
- In the related art, a self-propelled traveling device records captured images captured by an imaging means and information on obstacles detected by an obstacle detection means, together with date-and-time information and position information, and records a flow line for each resident.
- However, Patent Literature 1 does not consider the relationship between a person's walking ability and the height of steps on the floor when detecting dangers associated with the person's walking, so further improvement is required.
- The present disclosure has been made to solve such problems, and its object is to provide a technique for appropriately assisting walking according to a person's walking ability and the height of steps on the floor.
- In an information processing method according to one aspect of the present disclosure, a processor of an information processing device acquires, based on sensing data of a person, leg-raising information indicating how the person raises their legs; acquires height information indicating the heights of steps on the floor of a space in which the person moves; determines a degree of risk associated with the person's walking based on the leg-raising information and the height information; generates walking assistance information according to the degree of risk; and outputs the walking assistance information.
- With this configuration, appropriate walking assistance can be provided according to the person's walking ability and the height of steps on the floor.
- FIG. 1 is a block diagram showing an example of the configuration of a server according to Embodiment 1 of the present disclosure.
- FIG. 2 is a diagram showing an example of the data structure of action information.
- FIG. 3 is a diagram showing an example of the data structure of the action pattern database.
- FIG. 4 is a diagram showing an example of the data structure of environment information.
- FIG. 5 is a diagram showing an example of the data structure of the environment pattern database.
- A diagram showing an example of the data structure of the dangerous behavior database.
- A diagram showing an example of the data structure of the dangerous environment database.
- FIG. 8 is a flowchart showing an example of server processing according to Embodiment 1 of the present disclosure.
- A diagram showing an example of a notification screen displayed on the display of the terminal in Embodiment 1.
- FIG. 9 is a flowchart showing the details of the process (risk determination) in step S11 of FIG. 8.
- A block diagram showing an example of the configuration of a server according to Embodiment 2 of the present disclosure.
- A flowchart showing an example of processing when the server determines the notification timing in Embodiment 2 of the present disclosure.
- A flowchart showing an example of processing when the server transmits notification information in Embodiment 2 of the present disclosure.
- A diagram showing an example of a notification screen displayed on the display of the terminal in Embodiment 2 of the present disclosure.
- A block diagram showing an example of the configuration of a server according to Embodiment 3 of the present disclosure.
- A flowchart showing an example of server processing according to Embodiment 3 of the present disclosure.
- A diagram showing an example of a display screen for a renovation proposal.
- A block diagram showing an example of the configuration of a server according to Embodiment 4.
- A diagram showing an example of a notification screen for training information.
- Patent Literature 1 merely detects obstacles that are positioned around people's flow lines and obstruct walking; it does not consider the relationship between a person's walking ability and the height of steps on the floor. Therefore, it cannot provide appropriate walking assistance.
- In an information processing method according to one aspect of the present disclosure, a processor of an information processing device acquires, based on sensing data of a person, leg-raising information indicating how the person raises their legs; acquires height information indicating the heights of steps on the floor of a space in which the person moves; determines a degree of risk associated with the person's walking based on the leg-raising information and the height information; generates walking assistance information according to the degree of risk; and outputs the walking assistance information.
- With this configuration, leg-raising information indicating how the person raises their legs is acquired, the degree of risk associated with the person's walking is determined based on the acquired leg-raising information and the height information indicating the heights of steps on the floor, and walking assistance information corresponding to the degree of risk is output. Therefore, appropriate walking assistance can be provided according to the person's walking ability and the height of steps on the floor.
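The overall flow of this aspect (acquire leg-raising and step-height information, determine risk, output assistance) can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the ratio-based risk score, the threshold value, and all function names are assumptions made for the example.

```python
# Minimal sketch of the disclosed flow: compare a person's leg-raise
# height against a step height, derive a risk score, and generate
# walking assistance information only when the score is high.
# All names and the scoring rule are hypothetical.

def determine_risk(leg_raise_height_cm, step_height_cm):
    """Risk score in [0, 1]: how close the step height comes to the
    person's leg-raise height (1.0 means the step is at or above it)."""
    if leg_raise_height_cm <= 0:
        return 1.0
    return min(step_height_cm / leg_raise_height_cm, 1.0)

def generate_walking_assistance(risk, threshold=0.8):
    """Generate walking assistance information only when the risk is
    at or above the threshold; otherwise return None."""
    if risk >= threshold:
        return "Caution: high tripping risk (score %.2f)." % risk
    return None

# Example: a 4 cm step against a 5 cm leg raise reaches the 0.8 threshold,
# while a 2 cm step does not.
message = generate_walking_assistance(determine_risk(5.0, 4.0))
```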
- In the information processing method, behavior pattern information indicating the person's behavior pattern in the space may be acquired and stored in a memory. In the determination of the degree of risk, when the degree of risk is determined to be equal to or greater than a threshold, dangerous behavior information may be generated by associating, with the degree of risk, the behavior pattern information including the person's behavior and location related to the timing of that determination. In generating the walking assistance information, notification information that notifies the behavior and location related to that timing may be generated as the walking assistance information based on the dangerous behavior information, and the notification information may be presented when the walking assistance information is output.
- the behavior related to the timing may include the behavior immediately before the timing
- the notification information may include the behavior immediately before the timing
- In the information processing method, environmental pattern information indicating a pattern of changes in the environment of the space may further be acquired and stored in a memory. In the determination of the degree of risk, when the degree of risk is determined to be equal to or greater than a threshold, dangerous environment information may be generated by associating, with the degree of risk, the environmental pattern information related to the timing of that determination. Notification information based on the dangerous environment information may be generated as the walking assistance information, and the notification information may be presented when the walking assistance information is output.
- With this configuration, the user is notified of environments in which there is a possibility of falling, and can therefore grasp environments in which the possibility of falling is high.
- In the information processing method, the presence or absence of a situation in which there is a high possibility of falling may further be estimated based on at least one of the dangerous behavior information, in which the behavior pattern information is associated with the degree of risk, and the dangerous environment information, in which the environmental pattern information is associated with the degree of risk.
- The notification information may be presented when it is estimated that a situation with a high possibility of falling exists.
- With this configuration, the presence or absence of a high-fall-risk situation is estimated and the notification information is presented accordingly, so the user can be notified of at least one of the dangerous behavior and the dangerous environment in that situation and alerted to it.
- The degree of risk may be the frequency at which the height of a step indicated by the height information is determined to be equal to or greater than the leg-raise height indicated by the leg-raising information.
- With this configuration, the frequency at which the step height is determined to be equal to or greater than the leg-raise height is adopted as the degree of risk, so overly frequent output of walking assistance information can be prevented.
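The frequency-based degree of risk described above can be sketched as the proportion of observed walking cycles in which the step height meets or exceeds the leg-raise height. The function name and data shapes are hypothetical.

```python
# Sketch: degree of risk as the frequency (proportion) of walking
# cycles in which a step's height is at or above the detected
# leg-raise height. Names and data shapes are hypothetical.

def risk_frequency(leg_raises_cm, step_height_cm):
    """leg_raises_cm: leg-raise heights observed over successive
    walking cycles. Returns the fraction of cycles in which the step
    is at least as high as the raised leg (0.0 for no data)."""
    if not leg_raises_cm:
        return 0.0
    risky = sum(1 for h in leg_raises_cm if step_height_cm >= h)
    return risky / len(leg_raises_cm)

# Example: in 2 of 4 cycles the 5 cm step meets or exceeds the leg raise.
freq = risk_frequency([4.0, 6.0, 5.0, 7.0], 5.0)  # -> 0.5
```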
- The walking assistance information may be output at the timing when the degree of risk is determined to be equal to or greater than a threshold.
- With this configuration, since the walking assistance information is output at the timing when the degree of risk is determined to be equal to or greater than the threshold, it can be output in real time at timings when the possibility of falling is high.
- In the information processing method, the height information may include the positions of steps on the floor. In acquiring the leg-raising information, the detected leg-raising information may be stored in a memory; how the person will raise their legs in future walking may be predicted based on the history of the stored leg-raising information; and a step position at which there is a possibility of falling in the future may be identified based on the predicted future leg raising and the height information.
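One way to sketch the prediction step above is a simple linear trend fitted to the stored leg-raise history, whose extrapolated value is then compared against each step position's height. The linear model, the data shapes, and all names are assumptions for illustration; the disclosure does not specify a particular prediction method.

```python
# Sketch: predict a future leg-raise height from its stored history
# with a least-squares linear trend, then flag step positions the
# predicted raise may no longer clear. Model and names are assumptions.

def predict_leg_raise(history_cm, steps_ahead=1):
    """Fit a least-squares line to the leg-raise history and
    extrapolate steps_ahead samples into the future."""
    n = len(history_cm)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history_cm) / n
    denom = sum((x - mean_x) ** 2 for x in xs) or 1.0
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, history_cm)) / denom
    return mean_y + slope * (n - 1 + steps_ahead - mean_x)

def risky_steps(predicted_raise_cm, steps):
    """steps: list of (position, height_cm) pairs; return positions
    whose height meets or exceeds the predicted leg raise."""
    return [pos for pos, h in steps if h >= predicted_raise_cm]

# A declining history: the predicted raise (4.5 cm) drops below the
# 5 cm doorway step, so that position is flagged.
pred = predict_leg_raise([8.0, 7.0, 6.0, 5.5])
positions = risky_steps(pred, [("hallway", 3.0), ("doorway", 5.0)])
```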
- Furthermore, a remodeling plan for the space for reducing the possibility of a future fall at the identified step position may be generated and output.
- With this configuration, a renovation plan for reducing the possibility of falling at a step that poses a future fall risk is presented, so renovation that reduces the fall risk can be encouraged.
- Training information for improving the person's walking ability may be generated according to the degree of risk, and the training information may be output.
- With this configuration, training information for improving the person's walking ability is presented according to the degree of risk, so the person can be prompted to perform training that improves their walking ability.
- the training information may include a training place in the space previously specified based on the height information.
- the training information includes the training location in the space, it is possible to more reliably encourage the person to perform training to improve walking ability.
- the training information may be presented when the risk level is determined to be equal to or greater than a threshold.
- the training information is presented when the degree of risk is determined to be equal to or higher than the threshold, so the training information can be presented to people with low walking ability.
- An information processing device according to another aspect of the present disclosure includes a processor, and the processor executes a process of: acquiring, based on sensing data of a person, leg-raising information indicating how the person raises their legs; acquiring height information indicating the heights of steps on the floor of a space in which the person moves; determining a degree of risk associated with the person's walking based on the leg-raising information and the height information; generating walking assistance information according to the degree of risk; and outputting the walking assistance information.
- An information processing program according to another aspect of the present disclosure causes a computer to function as an information processing device, and causes a processor to execute a process of: acquiring, based on sensing data of a person, leg-raising information indicating how the person raises their legs; acquiring height information indicating the heights of steps on the floor of a space in which the person moves; determining a degree of risk associated with the person's walking based on the leg-raising information and the height information; generating walking assistance information according to the degree of risk; and outputting the walking assistance information.
- The present disclosure can also be implemented as an information processing system operated by such an information processing program. It goes without saying that such a computer program can be distributed via a computer-readable non-transitory recording medium such as a CD-ROM, or via a communication network such as the Internet.
- FIG. 1 is a block diagram showing an example of the configuration of a server 1 according to Embodiment 1 of the present disclosure.
- the server 1 is an example of an information processing device.
- The server 1 is connected to, for example, a temperature sensor 2, an illuminance sensor 3, a body sensor 4, an in-home sensor 5, a cleaning robot 6, a terminal 7, and electrical equipment 8 via a network.
- the network is, for example, a wide area network including mobile telephone networks and the Internet.
- the server 1 is, for example, a cloud server.
- the temperature sensor 2, the illuminance sensor 3, the in-home sensor 5, the cleaning robot 6, and the electrical equipment 8 are placed in the user's home.
- a user's home is an example of a space.
- a user is an example of a person.
- the temperature sensors 2 are arranged at one or more locations in the house, measure the room temperature at each location, and transmit sensing data indicating the measured temperature at each location to the server 1 at a predetermined sampling cycle.
- the illuminance sensor 3 is arranged at one or more locations in the home, measures the illuminance at each location, and transmits sensing data indicating the measured illuminance at each location to the server 1 at a predetermined sampling cycle.
- the sensing data transmitted from the temperature sensor 2 and the illuminance sensor 3 includes, for example, a home ID indicating the home where the sensing was performed, a location ID indicating the sensing location, the sensing time, and the sensor value.
- the body sensor 4 is, for example, an acceleration sensor or a gyro sensor attached to the user's leg, and transmits sensing data indicating the movement of the user's leg to the server 1 at predetermined sampling intervals.
- the sensing data transmitted from the body sensor 4 includes, for example, the user ID of the user wearing the body sensor 4, the sensing time, and the sensor value.
- The body sensor 4 may be a smartwatch or a smartphone. If the body sensor 4 is a smartphone, it is placed in the user's pants pocket.
- the in-home sensors 5 are, for example, image sensors arranged at a plurality of locations (for example, the ceiling) in the user's home, and transmit image data representing the user's movements as sensing data to the server 1 at predetermined sampling intervals.
- The cleaning robot 6 is a self-propelled robot that cleans the inside of the user's home. The cleaning robot 6 photographs the user inside the home using an image sensor and transmits the image data as sensing data to the server 1.
- Sensing data of the image sensor includes, for example, home ID, place ID, sensing time, and sensor value.
- the in-home sensor 5 and the cleaning robot 6 may be distance sensors that capture distance images instead of image sensors. Examples of ranging sensors are LiDAR and laser range finders.
- The terminal 7 is, for example, an information terminal such as a mobile information terminal or a tablet computer, and is carried by the user.
- the terminal 7 receives notification information for notifying the risk of walking from the server 1 and displays the received notification information on the display.
- the electrical appliances 8 are, for example, household electrical appliances such as microwave ovens, water heaters, refrigerators, washing machines, televisions, and cookers.
- the electrical equipment 8 transmits the operation log to the server 1 at a predetermined sampling period.
- The server 1 includes a communication unit 11, a motion information extraction unit 12, an action information generation unit 13, an environment information generation unit 14, a leg-raising detection unit 15, a risk determination unit 16, an output unit 17, a blueprint storage unit 18, a height information extraction unit 19, a height database (DB) 20, a walking database (DB) 21, a behavior pattern database (DB) 22, an environment pattern database (DB) 23, a dangerous behavior database (DB) 24, and a dangerous environment database (DB) 25.
- The motion information extraction unit 12 to the height information extraction unit 19 are implemented by a processor executing an information processing program. However, this is only an example, and the motion information extraction unit 12 to the height information extraction unit 19 may be configured by dedicated hardware circuits such as ASICs.
- the walking database 21 to the dangerous environment database 25 are composed of nonvolatile rewritable storage devices such as hard disk drives and solid state drives.
- the communication unit 11 is a communication circuit that connects the server 1 to the network.
- the communication unit 11 inputs sensing data transmitted from the temperature sensor 2 and the illuminance sensor 3 to the environment information generation unit 14 .
- the communication unit 11 inputs sensing data transmitted from the body sensor 4 , the indoor sensor 5 , and the cleaning robot 6 to the motion information extraction unit 12 .
- the communication unit 11 transmits the notification information generated by the output unit 17 to the terminal 7 .
- The movement information extraction unit 12 extracts movement information indicating the movement of the user's body by analyzing the sensing data input from the communication unit 11, and inputs the extracted movement information in time series, at a predetermined sampling period, to the action information generation unit 13 and the leg-raising detection unit 15.
- the movement information extracted from the sensing data of the image sensor transmitted from the indoor sensor 5 and the cleaning robot 6 includes, for example, the user's skeleton information.
- Skeletal information is information in which characteristic parts such as the user's toes, heels, tips of arms, face, and joints are connected by links indicating arms, neck, legs, and torso.
- The motion information extraction unit 12 may extract skeleton information using a known skeleton detection algorithm such as OpenPose. Note that the skeleton information includes the home ID, the place ID, and the sensing time included in the source sensing data. Furthermore, the motion information extraction unit 12 may identify the user from the image data using face recognition technology and include the identified user's user ID in the skeleton information.
- The motion information extracted from the sensing data of the acceleration sensor or gyro sensor transmitted from the body sensor 4 includes, for example, leg height data indicating the height of the user's legs relative to the floor, the user ID, and the sensing time.
- The motion information extraction unit 12 may calculate the leg height data by, for example, integrating the sensor values of the acceleration sensor or gyro sensor. Note that the leg height data includes the user ID and the sensing time included in the source sensing data.
- the leg height data is, for example, two-dimensional or three-dimensional coordinate data indicating the height position of the leg with respect to the floor.
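The integration mentioned above can be sketched as a double integration of vertical acceleration samples. This is only an illustration of the integration step: real systems also need gravity removal and drift correction, which are omitted here, and all names are hypothetical.

```python
# Sketch: estimating leg height by twice integrating vertical
# acceleration samples with the trapezoidal rule. Gravity removal and
# drift correction, which a real system needs, are omitted.

def integrate(samples, dt):
    """Trapezoidal cumulative integration; returns a series the same
    length as samples, starting at 0."""
    out = [0.0]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

def leg_height(accel_mps2, dt):
    """Double-integrate vertical acceleration (gravity already
    removed) into height in metres."""
    velocity = integrate(accel_mps2, dt)
    return integrate(velocity, dt)

# A brief upward acceleration followed by deceleration raises the leg
# to about 0.04 m over the sampled interval.
heights = leg_height([0.0, 2.0, 0.0, -2.0, 0.0], dt=0.1)
```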
- In the present embodiment, the server 1 manages one user in one home; therefore, the home ID and the user ID are omitted from the explanation. However, this is just an example, and the server 1 may manage multiple homes and multiple users, in which case the homes and users may be identified using the home ID and the user ID.
- The skeleton information is input to the action information generation unit 13 and the leg-raising detection unit 15, and the leg height data is input to the leg-raising detection unit 15.
- the behavior information generation unit 13 generates behavior information indicating human behavior by analyzing movement information (skeletal information), and stores the history of the generated behavior information in a memory (not shown).
- FIG. 2 is a diagram showing an example of the data structure of action information.
- the action information includes "place”, “time”, and “action”.
- "Location” indicates the location where the user acted.
- a place is identified from a place ID included in the skeleton information.
- “Time” indicates the sensing time of the action.
- “Time” is specified from the sensing time included in the skeleton information.
- “Behavior” indicates the behavior obtained by analyzing the skeleton information. Actions to be analyzed are, for example, actions that the user performs in daily life, such as eating, moving, returning home, bathing, and exercising.
- The behavior information generation unit 13 may identify the user's behavior from the skeleton information using, for example, a pattern matching technique, or may identify it using a trained model that estimates user behavior from skeleton information. Note that, in addition to the skeleton information, the behavior information generation unit 13 may further use the operation logs transmitted from the electrical equipment 8 to estimate the user's behavior, may estimate the behavior from the image data of the image sensor, or may estimate the behavior based on information from the body sensor 4.
- the generated action information is stored in the action pattern database 22 .
- the action information generation unit 13 generates action pattern information indicating the user's action pattern from the generated action information, and stores it in the action pattern database 22 .
- FIG. 3 is a diagram showing an example of the data configuration of the behavior pattern database 22.
- the action pattern database 22 stores action pattern information including "place”, “time period”, and “behavior”.
- the behavior pattern information is information that indicates what kind of behavior the user takes in what time period, in what place, in a day.
- a "place” is a place where the user acted.
- “Time period” is the time period during which the user acted.
- An “action” is an action taken by a user.
- The action information generation unit 13 may generate the action pattern information by classifying the history of action information according to location and action, and identifying, from the classified history, the time period during which the user takes the classified action at the classified location.
- the first row of the behavior pattern database 22 stores behavior pattern information indicating that the user has a behavior pattern of eating in the kitchen between 19:00 and 20:00.
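The grouping described above can be sketched as follows: records are grouped by (place, action), and the span of observed hours gives each pattern's time period. The record shape and all names are hypothetical simplifications of the database rows.

```python
# Sketch: deriving behavior-pattern rows (place, time period, action)
# from an action-information history by grouping records on
# (place, action) and taking the span of observed hours.
# Record shapes and names are hypothetical.
from collections import defaultdict

def behavior_patterns(action_history):
    """action_history: list of (place, hour, action) records."""
    groups = defaultdict(list)
    for place, hour, action in action_history:
        groups[(place, action)].append(hour)
    patterns = []
    for (place, action), hours in groups.items():
        patterns.append({
            "place": place,
            "time_period": (min(hours), max(hours) + 1),  # e.g. 19:00-20:00
            "action": action,
        })
    return patterns

# Matches the example row: eating in the kitchen between 19:00 and 20:00.
history = [("kitchen", 19, "eating"), ("kitchen", 19, "eating")]
rows = behavior_patterns(history)
```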
- The environmental information generation unit 14 generates environmental information indicating the environment in the home by analyzing the sensing data input from the communication unit 11 (the sensing data of the temperature sensor 2 and the illuminance sensor 3), and stores the history of the generated environmental information in a memory.
- FIG. 4 is a diagram showing an example of the data configuration of environment information.
- the environment information includes "place”, “time”, and “illuminance”.
- “Place” indicates the place where the illuminance is sensed, and is specified from the place ID included in the sensing data.
- “Time” is the sensing time, which is specified from the sensing time included in the sensing data.
- “Illuminance” is the illuminance at the sensing location.
- the environmental information may include the temperature of the sensing location in addition to the illuminance.
- the example of FIG. 4 shows environment information indicating that the illuminance of the corridor is "1" at 22:00.
- the environmental information generation unit 14 generates environmental pattern information indicating patterns of changes in the home environment in one day from the environmental information history, and stores it in the environmental pattern database 23 .
- FIG. 5 is a diagram showing an example of the data configuration of the environment pattern database 23.
- the environmental pattern database 23 stores environmental pattern information including "location”, "time zone”, and "illuminance”.
- The environmental information generation unit 14 may generate environment pattern information by classifying the history of environmental information by location and illuminance, and specifying, from the classified history, the time zone during which the classified location has the classified illuminance.
- the first row of the environment pattern database 23 stores environment pattern information indicating that the illuminance of the corridor is "1" during the time period from 22:00 to 23:00.
- the rising direction detection unit 15 generates rising direction information indicating how the user's leg rises based on the motion information (skeletal structure information and leg height data) input from the motion information extracting unit 12 .
- The rising direction information is time-series data of the leg lift width, and also includes a location ID and the sensing time.
- The leg lift width is the maximum vertical distance, in one walking cycle, between the floor surface and the lowest point of the leg (for example, the toe). If the leg lift width differs between the left and right legs, the smaller width is adopted.
- the rising direction detection unit 15 may use either one of the input leg height data and skeletal information to calculate the extent of leg elevation in each walking cycle, or may use both of them to calculate the extent of leg elevation.
- The rising direction detection unit 15 basically calculates the leg lift width in each walking cycle from the leg height data, and when it cannot be calculated from the leg height data, may use the skeletal information instead. In this manner, the rising direction detection unit 15 may calculate the leg lift width for each walking cycle by interpolating one of the skeletal information and the leg height data with the other. The generated climbing direction information is input to the risk determination unit 16 and stored in the walking database 21.
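- The fallback scheme above can be sketched as follows; the use of None to mark cycles where a source is unusable, and the function names, are assumptions for illustration.

```python
def leg_lift_per_cycle(height_data, skeleton_data):
    """Per walking cycle, prefer the foot-height sensor value and fall back
    to the skeleton-based estimate when the sensor value is missing (None),
    i.e. interpolate one source with the other."""
    return [h if h is not None else s
            for h, s in zip(height_data, skeleton_data)]

def adopted_lift(left_lift, right_lift):
    """When the left and right lift widths differ, adopt the smaller
    (riskier) one, as described for the leg lift width."""
    return min(left_lift, right_lift)
```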
- The risk determination unit 16 acquires climbing direction information from the climbing direction detection unit 15 and the walking database 21, acquires height information from the height database 20, and determines the degree of risk regarding the user's walking based on the acquired climbing direction information and height information. For example, the risk determination unit 16 may calculate, as the degree of risk, the frequency (number of times) at which the height of the step at the location indicated by the location ID included in the climbing direction information is greater than or equal to the leg lift width included in the climbing direction information. The details of the processing of the risk determination unit 16 will be described later. Alternatively, the degree of risk may be the ratio of the number of times the step was passed without stumbling to the total number of times the step was passed. For example, when the height of the leg is higher than the height of the step, the risk determination unit 16 may determine that the user was able to pass the step without stumbling.
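- Both metrics just described can be sketched as follows; the function names and the sample lift values are illustrative assumptions.

```python
def risk_degree(step_height, lift_history):
    """Count the walking cycles in which the step height meets or exceeds
    the leg lift width: each such cycle is a potential trip."""
    return sum(1 for lift in lift_history if step_height >= lift)

def pass_ratio(step_height, lift_history):
    """Alternative metric: fraction of passes cleared without stumbling
    (leg raised higher than the step)."""
    passed = sum(1 for lift in lift_history if lift > step_height)
    return passed / len(lift_history)

lifts = [12.0, 9.5, 8.0, 11.0, 7.5]  # leg lift width (cm) per cycle
```

For a 10.0 cm step and these lifts, three cycles fail to clear the step, so the frequency-based risk is 3 and the pass ratio is 2/5.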
- The risk determination unit 16 acquires action pattern information related to the calculated degree of risk from the action pattern database 22, generates risky action information by associating the acquired action pattern information with the calculated degree of risk, and stores the generated risky action information in the risky action database 24.
- FIG. 6 is a diagram showing an example of the data configuration of the risky behavior database 24.
- the risky behavior database 24 stores risky behavior information.
- Dangerous action information is information indicating a place and a time period in which the user is likely to fall.
- the risky action information includes "location”, “time period”, “immediate action”, and "risk level”.
- “Place” indicates a place where the degree of danger is determined to be equal to or higher than the threshold.
- the "time period” is the time period in which the "immediate action” was performed.
- the “immediate action” is the action taken by the user immediately before the risk determination unit 16 determines that the risk is greater than or equal to the threshold.
- “Risk level” is the level of risk determined to be equal to or higher than the threshold.
- The risk determination unit 16 refers to the action pattern database 22 shown in FIG. 3, identifies the action pattern information related to the calculated degree of risk, generates risky action information by associating the identified action pattern information with the calculated degree of risk, and stores the generated risky action information in the risky action database 24.
- For example, when the risk determination unit 16 determines that the degree of risk of the user in the corridor at 20:02 is equal to or higher than the threshold, it identifies, from the action pattern database 22, the meal taken in the time period "19:00-20:00" as the immediately preceding action. The risk determination unit 16 then associates the user's current location "corridor", the time period "19:00-20:00" of the immediately preceding action, the immediately preceding action "eating", and the calculated degree of risk "10" with one another to generate risky action information, and stores it in the risky action database 24. Note that the risk determination unit 16 may identify the current location of the user from the action information corresponding to the current time.
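- The lookup of the immediately preceding action can be sketched as follows; the pattern record fields and the use of "HH:MM" string comparison are illustrative assumptions.

```python
def immediately_preceding_action(patterns, current_time):
    """Return the behavior pattern whose time period ends closest before
    (or at) the current time, i.e. the action taken just before the event."""
    candidates = [p for p in patterns if p["end"] <= current_time]
    return max(candidates, key=lambda p: p["end"]) if candidates else None

patterns = [
    {"place": "kitchen", "action": "eating", "start": "19:00", "end": "20:00"},
    {"place": "bedroom", "action": "nap", "start": "13:00", "end": "14:00"},
]
prev = immediately_preceding_action(patterns, "20:02")
```

At 20:02 the most recently finished pattern is the meal ending at 20:00, matching the corridor example above.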
- The risk determination unit 16 acquires environment pattern information related to the calculated degree of risk from the environment pattern database 23, generates dangerous environment information by associating the acquired environment pattern information with the calculated degree of risk, and stores it in the dangerous environment database 25.
- FIG. 7 is a diagram showing an example of the data configuration of the dangerous environment database 25.
- the dangerous environment database 25 stores dangerous environment information.
- Dangerous environment information is information indicating a place and a time zone where the user is likely to fall.
- the dangerous environment information includes "place”, “time period”, “illuminance”, and “risk level”.
- “Location”, “time zone”, and “illuminance” are the same as in the environment pattern information shown in FIG. 5.
- “Risk level” is the level of risk determined to be equal to or higher than the threshold.
- For example, when the risk determination unit 16 determines at 22:10 that the degree of risk of the user in the corridor is equal to or higher than the threshold, it acquires the related environment pattern information for the corridor in the time period "22:00-23:00", generates dangerous environment information by associating the acquired environment pattern information with the calculated degree of risk "10", and stores the generated dangerous environment information in the dangerous environment database 25.
- When the calculated degree of risk is equal to or greater than the threshold, the risk determination unit 16 generates walking assistance information according to the calculated degree of risk, and inputs the generated walking assistance information to the output unit 17.
- the walking assist information is, for example, notification information that notifies the user of at least one of a dangerous behavior and a dangerous environment related to walking of the user.
- The risk determination unit 16 acquires the related risky behavior from the risky behavior database 24 and the related dangerous environment from the dangerous environment database 25, and generates, based on the risky behavior and the dangerous environment, notification information for notifying the user of them.
- a related risky behavior is a risky behavior whose time zone and place are related to the calculated risk level.
- a related dangerous environment is a dangerous environment in which the time zone and location are related to the calculated degree of danger.
- the gait database 21 is a database that stores the climbing direction information generated by the climbing direction detection unit 15 in chronological order.
- the walking database 21 associates and stores the location ID, the sensing time, and the leg raise width.
- the blueprint storage unit 18 stores blueprint data indicating the structure of the user's home, including the floor plan.
- the blueprint data is composed of, for example, CAD data that three-dimensionally shows the structure of a house.
- The height information extraction unit 19 extracts height information indicating the height of the step at each location from the blueprint data stored in the blueprint storage unit 18.
- the height information extraction unit 19 may acquire the floor plan information of the house created by the cleaning robot 6 using techniques such as SLAM, and extract the height information using the acquired floor plan information.
- the height database 20 stores the height information extracted by the height information extraction unit 19.
- the height database 20 stores, for example, the location ID and the height of the step in association with each other.
- A place ID is an identifier that identifies a group of spaces into which the house is partitioned, such as a hallway or a living room. If there are multiple steps at one location, the height database 20 may store the height of each step. Note that the location ID may be coordinates indicating an arbitrary location within the home.
- the output unit 17 uses the communication unit 11 to transmit the notification information input from the risk determination unit 16 to the terminal 7 .
- FIG. 8 is a flowchart showing an example of processing of the server 1 according to Embodiment 1 of the present disclosure.
- the sensors refer to the temperature sensor 2, the illuminance sensor 3, the body sensor 4, the in-home sensor 5, and the cleaning robot 6.
- In step S1, the sensor transmits sensing data to the server 1.
- In step S2, the motion information extraction unit 12 acquires the image-sensor sensing data transmitted from the in-home sensor 5 and the cleaning robot 6, acquires the sensing data transmitted from the body sensor 4, and generates motion information using the acquired sensing data.
- In step S3, the behavior information generation unit 13 generates behavior information from the motion information generated in step S2, and generates behavior pattern information from the history of behavior information. As a result, the behavior pattern information shown in FIG. 3 is generated.
- In step S4, the environment information generation unit 14 creates environment information using the sensing data transmitted from the temperature sensor 2 and the illuminance sensor 3 in step S1, and creates environment pattern information from the history of the environment information. As a result, the environment pattern information shown in FIG. 5 is generated.
- In step S5, the behavior information generation unit 13 stores the behavior pattern information generated in step S3 in the behavior pattern database 22.
- In step S6, the environment information generation unit 14 stores the environment pattern information generated in step S4 in the environment pattern database 23.
- In step S7, the motion information extraction unit 12 inputs the motion information (skeletal information and leg height data) generated in step S2 to the climbing direction detection unit 15.
- In step S8, the rising direction detection unit 15 generates climbing direction information from the input motion information.
- In step S9, the climbing direction detection unit 15 inputs the generated climbing direction information to the risk determination unit 16.
- In step S10, the risk determination unit 16 acquires from the height database 20 the height information indicating the height of the step corresponding to the location ID included in the input climbing direction information.
- In step S11, the risk determination unit 16 calculates the degree of risk based on the climbing direction information acquired in step S9 and the height information acquired in step S10, and determines whether the calculated degree of risk is equal to or greater than the threshold. Here, it is assumed that the degree of risk is equal to or greater than the threshold.
- In step S12, the risk determination unit 16 acquires risky action information related to the calculated degree of risk from the risky action database 24.
- In step S13, the risk determination unit 16 acquires dangerous environment information related to the calculated degree of risk from the dangerous environment database 25.
- In step S14, the risk determination unit 16 generates notification information based on the related risky action information and dangerous environment information, and inputs the generated notification information to the output unit 17.
- the output unit 17 transmits the input notification information to the terminal 7 .
- the terminal 7 generates a notification screen from the received notification information and displays it on the display.
- FIG. 9 is a diagram showing an example of the notification screen G1 displayed on the display of the terminal 7 in the first embodiment.
- the notification screen G1 includes messages for notifying dangerous actions (walking in the corridor), dangerous environments (the corridor is dark), and risk factors (there are high steps). As a result, the user can notice the existence of the step when moving in the corridor and prevent the risk of falling.
- FIG. 10 is a flowchart showing the details of the process (risk determination process) in step S11 of FIG.
- In step S101, the risk determination unit 16 acquires climbing direction information from the climbing direction detection unit 15.
- In step S102, the risk determination unit 16 acquires from the walking database 21 the leg lift widths associated with the location ID included in the acquired climbing direction information, and calculates the statistical value of the acquired leg lift widths.
- the risk determination unit 16 may calculate the average value of the leg raises over a certain past period or the weighted average value of the leg raises over a certain past period as the statistic value of the leg raises.
- The weight used to calculate the weighted average is set larger, for example, for more recent leg lift values.
- The risk determination unit 16 may calculate the minimum value of the leg lift width over a certain past period, or the maximum value over that period, as the statistical value of the leg lift width.
- the risk determination unit 16 may calculate the statistic value of the leg raise while excluding the leg raise in a period in which the variance of the leg raise stored in the walking database 21 is greater than a certain amount.
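- The statistic options above (plain average, weighted average, and exclusion of high-variance windows) can be sketched as follows; the interface, including returning None for an excluded window, is an illustrative assumption.

```python
def lift_statistic(lifts, weights=None, variance_limit=None):
    """Statistical value of the leg lift width: plain or weighted average,
    returning None when the window's variance exceeds the given limit
    (the data is then considered unreliable and excluded)."""
    n = len(lifts)
    mean = sum(lifts) / n
    variance = sum((x - mean) ** 2 for x in lifts) / n
    if variance_limit is not None and variance > variance_limit:
        return None  # window excluded from the statistic
    if weights is None:
        return mean
    # Weighted average: larger weights typically given to recent values
    return sum(w * x for w, x in zip(weights, lifts)) / sum(weights)
```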
- In step S103, the risk determination unit 16 calculates, as the degree of risk, the frequency at which the height of the step at the location indicated by the corresponding location ID is greater than or equal to the calculated leg lift statistical value.
- In step S104, the risk determination unit 16 determines whether or not the degree of risk is equal to or greater than the threshold. If it is determined that the degree of risk is equal to or greater than the threshold (YES in step S104), the risk determination unit 16 generates risky action information by associating the calculated degree of risk with the action pattern information related to it, and stores the generated risky action information in the risky action database 24 (step S105). On the other hand, if the degree of risk is less than the threshold (NO in step S104), the process ends without executing steps S105 and S106.
- The threshold may be set smaller as the variance of the leg lift width increases. In this way, when the variance is large and the reliability of the leg lift data is low, the criterion for judging the degree of risk is made stricter.
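- One way to shrink the threshold with growing variance is a simple linear rule; the linear form and the sensitivity constant below are assumptions, since the disclosure only states that a larger variance should yield a smaller threshold.

```python
def risk_threshold(base_threshold, lift_variance, sensitivity=0.5):
    """Lower the decision threshold as the variance of the leg lift width
    grows, so that low-reliability data is judged more strictly.
    'sensitivity' is an assumed tuning constant."""
    return max(0.0, base_threshold - sensitivity * lift_variance)
```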
- In step S106, the risk determination unit 16 generates dangerous environment information by associating the calculated degree of risk with the environment pattern information related to the degree of risk, and stores the generated dangerous environment information in the dangerous environment database 25.
- In this manner, climbing direction information indicating how the user's legs are raised is acquired, the degree of risk regarding the user's walking is determined based on the acquired climbing direction information and the height information indicating the height of the step on the floor, and notification information corresponding to the degree of risk is presented to the user. Therefore, appropriate walking assistance can be performed according to the walking ability of the user and the height of the step on the floor.
- Embodiment 2 presents notification information to the user during a period when the degree of risk is equal to or greater than a threshold.
- FIG. 11 is a block diagram showing an example of the configuration of server 1A according to the second embodiment of the present disclosure.
- The same constituent elements as in the first embodiment are given the same reference numerals, and the description thereof is omitted.
- the server 1A further includes a risk determination prediction unit 26 and a notification timing determination unit 27 in addition to the server 1 .
- the risk determination prediction unit 26 estimates the presence or absence of a situation in which there is a high possibility of falling.
- the first row of the risky behavior database 24 stores risky behavior information indicating that it is dangerous for the user who ate in the time period of "19:00 to 20:00" to walk in the corridor after eating.
- the danger level of this dangerous action information is "10", which is higher than a predetermined reference value (for example, "6"), so there is a high possibility that the user will fall. Therefore, the risk determination prediction unit 26 estimates that there is a high possibility of falling when the user moves down the corridor after eating in the time period of "19:00-20:00".
- the second row of the dangerous behavior database 24 stores dangerous behavior information indicating that walking up stairs after taking a nap for the user who took a nap during the time period of "13:00-14:00" is dangerous. .
- The risk level of this dangerous action information is "5", which is lower than the reference value (for example, "6"). Therefore, the risk determination prediction unit 26 estimates that the possibility of falling is low when the user moves up the stairs after taking a nap during this time period.
- the first row of the dangerous environment database 25 stores dangerous environment information indicating that the environment of the corridor with the illuminance of "1" in the time period of "22:00-23:00" is dangerous. Since the risk level of this dangerous environment information is "10", which is higher than the reference value (for example, "6"), there is a high possibility that the user will fall. Therefore, the risk determination prediction unit 26 estimates that the environment indicated by this dangerous environment information is in a situation where there is a high possibility of falling.
- the second row of the dangerous environment database 25 stores dangerous environment information indicating that the environment of the stairs with the illuminance of "2" in the time period of "23:00-24:00" is dangerous.
- The risk level of this dangerous environment information is "5", which is lower than the reference value (for example, "6"). Therefore, the risk determination prediction unit 26 estimates that the environment indicated by this dangerous environment information is a situation in which the possibility of falling is low.
- the notification timing determination unit 27 determines notification timing of the notification information when the risk determination prediction unit 26 estimates that there is a high possibility of falling.
- the notification timing is, for example, the timing at which the user's situation becomes the same as the situation in which it is estimated that the possibility of falling is high.
- The risk determination prediction unit 26 may make a risk determination taking into consideration not only the temporally immediately preceding data contained in the dangerous environment database 25 but also past time-series behavior data. For example, if analysis of past data reveals that the risk around the stairs in the time period "22:00-23:00" increases when the user has dinner in the preceding time period "21:00-22:00", a correction may be made to increase the degree of risk.
- Conversely, the risk determination prediction unit 26 may make a correction to reduce the risk around the stairs during the time period "22:00-23:00" if the user takes a bath during the time period "21:00-22:00". How far back the past time-series data is analyzed may be determined according to the processing time, processing load, and analysis accuracy.
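- The correction based on the preceding behavior can be sketched as a multiplicative factor per behavior; the factor values and the table-lookup form are assumptions, since the disclosure only says the risk is raised or lowered.

```python
def corrected_risk(base_risk, preceding_action, corrections):
    """Scale the base risk by a factor tied to the preceding behavior
    (e.g. raised after dinner, lowered after a bath), with 1.0 (no change)
    for behaviors that have no learned effect."""
    return base_risk * corrections.get(preceding_action, 1.0)

# Illustrative factors, not from the disclosure
corrections = {"dinner": 1.2, "bath": 0.8}
```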
- the notification timing determination unit 27 may determine the timing at which the user enters the corridor during a predetermined period (for example, 10 minutes) following "20:00" as the notification timing.
- the notification timing determination unit 27 may determine the timing at which the user enters the corridor during the time period "22:00-23:00" as the notification timing.
- the notification timing may be a timing when the user is relaxing, such as watching television after eating.
- Alternatively, the notification timing determination unit 27 may determine the timing at which the user enters the place as the notification timing.
- FIG. 12 is a flowchart showing an example of processing when the server 1A determines notification timing according to the second embodiment of the present disclosure. This process is periodically executed at a predetermined cycle, for example, once a day or once a week.
- The risk determination prediction unit 26 analyzes the risky action information stored in the risky action database 24 and the dangerous environment information stored in the dangerous environment database 25 to estimate whether there is a situation in which the possibility of falling is high.
- If it is determined that there is a situation with a high possibility of falling (YES in step S302), the notification timing determination unit 27 determines the notification timing (step S303). For example, in the example of the risky action information described above, the timing at which the user enters the corridor during a predetermined period following "19:00-20:00" is determined as the notification timing. On the other hand, if it is determined that there is no such situation (NO in step S302), the process ends.
- FIG. 13 is a flowchart showing an example of processing when the server 1A transmits notification information according to the second embodiment of the present disclosure. This flowchart is always executed.
- In step S311, the notification timing determination unit 27 monitors the behavior information and determines, based on the monitoring result, whether or not the notification timing has arrived. For example, when the action information generation unit 13 generates action information that satisfies the conditions of the location and time zone defined by the notification timing, the notification timing determination unit 27 determines that the notification timing has arrived.
- When the notification timing has arrived (YES in step S311), the notification timing determination unit 27 generates notification information (step S312). On the other hand, if the notification timing has not arrived (NO in step S311), the process waits in step S311.
- In step S313, the output unit 17 uses the communication unit 11 to transmit the notification information to the terminal 7.
- FIG. 14 is a diagram showing an example of the notification screen G2 displayed on the display of the terminal 7 according to Embodiment 2 of the present disclosure.
- the notification screen G2 contains a message that there is a possibility of falling. Further, the notification screen G2 includes a message to the effect that there is a possibility of falling while walking in the corridor after eating. As a result, for example, when the user enters the corridor after eating, the user is notified of the possibility of falling through the notification screen G2, so that the user can be prevented from falling.
- the notification to the user may be notification using voice, vibration of a body sensor, or the like instead of the notification screen.
- According to the server 1A of the second embodiment, the presence or absence of a situation in which the possibility of falling is high is estimated based on at least one of the dangerous behavior information and the dangerous environment information, and the notification information is presented at the timing when such a situation occurs. The user can thus be notified of at least one of the dangerous behavior and the dangerous environment in that situation, calling the user's attention.
- Embodiment 3 specifies a place where the user may fall in the future, and notifies the user of a renovation proposal for the specified place.
- FIG. 15 is a block diagram showing an example of the configuration of the server 1B according to Embodiment 3 of the present disclosure.
- the server 1B further includes a climbing direction prediction unit 28 and a dangerous spot determination unit 29 in addition to the server 1A.
- the same reference numerals are assigned to the same constituent elements as in the first and second embodiments, and the description thereof is omitted.
- The climbing direction prediction unit 28 predicts the future leg lift width during walking based on the history of climbing direction information stored in the walking database 21.
- the future refers to a point in the future, such as one year, three years, or ten years from now, and is not particularly limited.
- The rising direction prediction unit 28 calculates, for each place, the moving average value of the leg lift width indicated by the climbing direction information stored in the walking database 21, and predicts the future leg lift width at each place based on the temporal transition of the calculated moving average values.
- the rising direction prediction unit 28 may predict the future rise of the leg by linearly interpolating the time-series data of the moving average value. As the time width of the moving average value, an appropriate value such as one day, one month, or one year can be adopted.
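- A minimal sketch of the moving-average-plus-linear-fit prediction, assuming an evenly sampled lift history; measuring the window and horizon in index steps, and using a least-squares line for the extrapolation, are assumptions.

```python
def predict_lift(series, window, horizon):
    """Smooth the lift history with a moving average, fit a least-squares
    line to the smoothed points, and extrapolate 'horizon' steps ahead."""
    smoothed = [sum(series[i:i + window]) / window
                for i in range(len(series) - window + 1)]
    n = len(smoothed)
    mean_x = (n - 1) / 2
    mean_y = sum(smoothed) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(smoothed))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    # Extrapolate from the last smoothed index to 'horizon' steps beyond it
    return mean_y + slope * (n - 1 + horizon - mean_x)
```

For a lift history declining by 1 cm per step, the fitted slope is -1 and the prediction continues that decline.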
- the rising direction prediction unit 28 may predict the future leg lift of the user by multiplying the current leg lift of the user by the reduction rate of the leg lift.
- the rate of decline is determined based on a decay function that defines age-dependent leg raises and that is obtained from a medical point of view. By inputting the user's current age and a future point in time into this decay function, the decay rate is determined.
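- The decay-function approach can be sketched as follows. The 1%-per-year decay below is a placeholder for illustration only; the disclosure says the actual function defining age-dependent leg lift is obtained from a medical point of view.

```python
def future_lift(current_lift, current_age, years_ahead, decay):
    """Multiply the current leg lift width by the decline rate the decay
    function gives for aging from the current age to the future age."""
    rate = decay(current_age, current_age + years_ahead)
    return current_lift * rate

# Placeholder decay: lifts assumed to shrink about 1% per year of aging
one_percent_per_year = lambda age_now, age_future: 0.99 ** (age_future - age_now)
```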
- The dangerous place determination unit 29 identifies places where there is a possibility of a future fall by comparing the future leg lift width at each location predicted by the climbing direction prediction unit 28 with the height of the step at each location. For example, if the future leg lift width at a certain place (corridor) is lower than the step at that place, the step in the corridor is determined to be a step that may cause a fall in the future.
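- The comparison can be sketched as follows; the dictionary layout keyed by place, and flagging when the predicted lift does not exceed the step height, are illustrative assumptions.

```python
def dangerous_spots(predicted_lifts, step_heights):
    """Flag every place whose predicted future leg lift width no longer
    clears the step height recorded there."""
    return [place for place, lift in predicted_lifts.items()
            if place in step_heights and lift <= step_heights[place]]

predicted = {"corridor": 9.0, "entrance": 14.0}  # cm, predicted future lifts
steps = {"corridor": 10.0, "entrance": 12.0}     # cm, step heights
```

With these sample values, only the corridor step is flagged: the predicted 9.0 cm lift no longer clears its 10.0 cm step.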
- The dangerous spot determination unit 29 generates a renovation proposal to reduce the possibility of future falls at steps determined to carry that possibility.
- the remodeling proposal includes, for example, a message that encourages remodeling to lower the level difference in a place where there is a possibility of falling.
- FIG. 16 is a flowchart showing an example of processing of the server 1B according to Embodiment 3 of the present disclosure. This flowchart is executed when, for example, the terminal 7 sends a request to create a reform proposal to the server 1B.
- In step S401, the climbing direction prediction unit 28 acquires the history of climbing direction information from the walking database 21.
- In step S402, the rising direction prediction unit 28 calculates, for each location, the moving average value of the leg lift width indicated by the acquired history of climbing direction information, and calculates the future leg lift width at each location by linearly interpolating the time-series data of the calculated moving average values. As the future point in time, a value designated by the user, included in the reform plan creation request transmitted from the terminal 7, may be adopted.
- In step S403, the dangerous spot determination unit 29 identifies steps at which there is a possibility of a future fall by comparing the future leg lift width at each location with the height of the step at each location.
- In step S404, the dangerous spot determination unit 29 generates a reform plan to reduce the possibility of future falls at the identified steps.
- In step S405, the output unit 17 uses the communication unit 11 to transmit the generated reform plan to the terminal 7. As a result, the terminal 7 displays a reform proposal notification screen on the display.
- FIG. 17 is a diagram showing an example of a reform proposal notification screen G3.
- Since the step in the corridor is identified as a step where there is a possibility of falling in the future, the notification screen G3 includes a message to the effect that there is a risk of falling at the step in the corridor in the future. Furthermore, the notification screen G3 includes a message encouraging renovation to lower that step.
- The notification screen G3 displays the reform proposal as a message, but an image of the reform proposal may also be included.
- An example of the image of the remodeling plan is an image in which a display object translucently displaying the shape of the remodeled hallway from which steps have been removed is superimposed on an overhead image of the hallway.
- In this way, a step at which there is a possibility of a future fall is identified, and a remodeling plan that reduces the possibility of falling at that step is created, so the user can be encouraged to carry out remodeling that reduces the possibility of future falls.
- Embodiment 4 generates training information for improving a person's walking ability.
- FIG. 18 is a block diagram showing an example of the configuration of server 1C according to the fourth embodiment.
- the same reference numerals are given to the same constituent elements as in the first to third embodiments, and the description thereof is omitted.
- the server 1C further includes a training information database (DB) 30 and a training information presentation unit 31 in addition to the server 1B.
- the training information database 30 preliminarily stores training information defining a training place at the user's home and a training method at the training place.
- An example of a training location is a location in the home with steps of suitable height to improve walking ability, such as stairs, hallways, and the like.
- the training information presentation unit 31 presents training information when the risk determination unit 16 determines that the risk is equal to or greater than the threshold. Specifically, when the risk determination unit 16 determines that the risk is equal to or greater than the threshold, the training information presentation unit 31 acquires the training information from the training information database 30 and inputs the training information to the output unit 17 .
- the training information presentation unit 31 may present the training information at a timing other than the timing at which the risk determination unit 16 determines that the risk is equal to or greater than the threshold. As this timing, a time zone in which the user is in a relaxed state can be adopted. The training information presentation unit 31 may monitor the behavior information generated by the behavior information generation unit 13 and determine that the user is in a relaxed state. An example of a relaxing state is a state in which the user is watching television. Note that the timing of presenting the training information is not limited to this, and may be, for example, the idle time after waking up until breakfast, and is not particularly limited.
- The output unit 17 transmits the input training information to the terminal 7 using the communication unit 11.
- The terminal 7 generates a training information notification screen and displays it on its display.
- FIG. 19 is a diagram showing an example of a training information notification screen G4.
- The notification screen G4 includes a message prompting the user to exercise to improve walking ability, because that ability has weakened. The screen G4 also includes a message such as "Use the step at the entrance" to convey a place in the home where the walking exercise can be performed.
- As described above, with the server 1C of the fourth embodiment, training information for improving the user's walking ability is presented according to the degree of risk, so the user can be prompted to perform training that improves walking ability.
- The assist information is not limited to notification information; it may be, for example, a control signal for a walking assist suit worn by the user.
- When a user wearing the walking assist suit is in a situation with a high possibility of falling, the control signal causes the suit to assist the user's walking.
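As a sketch, such a control signal might be built as follows; the field names and the proportional-assist rule are assumptions, since the disclosure only states that the suit assists walking when a fall is likely:

```python
def make_assist_command(risk: float, threshold: float, lift_deficit_cm: float) -> dict:
    """Build a hypothetical control signal for a walking assist suit.

    When the estimated fall risk reaches the threshold, ask the suit to add
    lift proportional to how far the user's leg lift falls short of the
    step height (lift_deficit_cm).  All field names are illustrative.
    """
    if risk < threshold:
        return {"assist": False, "extra_lift_cm": 0.0}
    return {"assist": True, "extra_lift_cm": max(0.0, lift_deficit_cm)}
```

Clamping the deficit at zero keeps the suit from reducing lift when the user already clears the step.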
- The information on how the leg is raised includes the leg-lift height, but may also include the width of the leg's lateral movement.
- The width of lateral movement refers to how far the leg moves sideways during walking.
- Here, the lateral direction is perpendicular to both the traveling direction and the vertical direction. Since the risk of falling while walking increases as the left-right sway of the legs increases, the risk determination unit 16 may correct the degree of risk so that it increases with the sway width.
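Combining the frequency-based risk of claim 6 with this sway correction, one possible computation is the following sketch; the linear sway weighting is an assumption, as the disclosure does not specify how the correction is applied:

```python
def walking_risk(step_height_cm, leg_lifts_cm, sway_widths_cm, sway_weight=0.05):
    """Estimate the degree of risk for walking.

    Base risk is the frequency with which the step height is greater than
    or equal to the observed leg-lift height (claim 6).  The risk is then
    raised in proportion to the mean lateral sway width of the legs; the
    linear weighting factor is an illustrative assumption.
    """
    if not leg_lifts_cm:
        return 0.0
    base = sum(1 for lift in leg_lifts_cm if step_height_cm >= lift) / len(leg_lifts_cm)
    mean_sway = sum(sway_widths_cm) / len(sway_widths_cm) if sway_widths_cm else 0.0
    return min(1.0, base + sway_weight * mean_sway)
```

With a 10 cm step and lift heights of 12, 9, and 8 cm, two of three lifts fail to clear the step, so the base risk is 2/3 before any sway correction.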
- The technology according to the present disclosure is useful for preventing people from falling while walking.
Description
Assuming an era of 100-year lifespans, walking ability declines with age or injury, which raises the likelihood of accidents such as falls in the home and thereby the risk of shortening a person's healthy life span. Residents, unaware that their fall risk has risen due to their declining walking ability and location-specific environmental factors, live in situations where sudden accidents easily occur. Once an accident occurs, it can in the worst case lead to severe health impairment, such as becoming bedridden due to a bone fracture, so countermeasures must be taken before an accident occurs.
FIG. 1 is a block diagram showing an example of the configuration of the server 1 according to the first embodiment of the present disclosure. The server 1 is an example of an information processing device. The server 1 is connected, for example via a network, to a temperature sensor 2, an illuminance sensor 3, a body sensor 4, an in-home sensor 5, a cleaning robot 6, a terminal 7, and an electric appliance 8. The network is a wide-area network including, for example, a mobile phone network and the Internet. The server 1 is, for example, a cloud server.
The second embodiment presents notification information to the user in a time period in which the degree of risk is equal to or greater than the threshold. FIG. 11 is a block diagram showing an example of the configuration of the server 1A according to the second embodiment of the present disclosure. In the second embodiment, the same reference numerals are given to the same constituent elements as in the first embodiment, and their description is omitted.
The third embodiment identifies places where the user may fall in the future and notifies the user of a remodeling plan for the identified places. FIG. 15 is a block diagram showing an example of the configuration of the server 1B according to the third embodiment of the present disclosure. The server 1B includes, in addition to the components of the server 1A, a leg-lift prediction unit 28 and a dangerous-place determination unit 29. In this embodiment, the same reference numerals are given to the same constituent elements as in the first and second embodiments, and their description is omitted.
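The future leg-lift prediction of the third embodiment could be sketched as a simple trend extrapolation over the stored history; the least-squares linear model and the function name are assumptions, as the disclosure does not specify the prediction method:

```python
def predict_future_lift_cm(history_cm, days_ahead=30):
    """Predict a future leg-lift height from a daily history of lift heights.

    Fits a least-squares linear trend to the history and extrapolates
    days_ahead days past the last measurement.  A step can then be flagged
    as dangerous when the predicted lift drops below the step height.
    """
    n = len(history_cm)
    if n == 0:
        return 0.0
    if n == 1:
        return history_cm[0]
    mean_x = (n - 1) / 2
    mean_y = sum(history_cm) / n
    sxx = sum((x - mean_x) ** 2 for x in range(n))
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(history_cm))
    slope = sxy / sxx
    return mean_y + slope * (n - 1 + days_ahead - mean_x)
```

Steps whose height exceeds the predicted lift would be the ones the dangerous-place determination unit reports, and for which a remodeling plan is proposed.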
The fourth embodiment generates training information for improving a person's walking ability. FIG. 18 is a block diagram showing an example of the configuration of the server 1C according to the fourth embodiment. In this embodiment, the same reference numerals are given to the same constituent elements as in the first to third embodiments, and their description is omitted.
Claims (14)
- An information processing method, wherein a processor of an information processing device:
acquires leg-lift information indicating how a person raises their legs, based on sensing data of the person;
acquires height information indicating the height of a step on the floor of a space in which the person moves;
determines a degree of risk relating to the person's walking, based on the leg-lift information and the height information;
generates walking assist information according to the degree of risk; and
outputs the walking assist information.
- The information processing method according to claim 1, further comprising acquiring behavior pattern information indicating a behavior pattern of the person in the space and storing it in a memory, wherein
in determining the degree of risk, when the degree of risk is determined to be equal to or greater than a threshold, dangerous behavior information is generated by associating the degree of risk with the behavior pattern information, including the person's behavior and location, related to the timing determined to be dangerous;
in generating the walking assist information, notification information notifying the behavior and location related to the timing is generated as the walking assist information, based on the dangerous behavior information; and
in outputting the walking assist information, the notification information is presented.
- The information processing method according to claim 2, wherein the behavior related to the timing includes the behavior immediately before the timing, and
the notification information includes that immediately preceding behavior.
- The information processing method according to claim 1, further comprising acquiring environmental pattern information indicating a pattern of environmental change in the space and storing it in a memory, wherein
in determining the degree of risk, when the degree of risk is determined to be equal to or greater than a threshold, dangerous environment information is generated by associating the degree of risk with the environmental pattern information related to the timing determined to be dangerous;
in generating the walking assist information, notification information notifying an environment in which a fall is possible is generated based on the dangerous environment information; and
in outputting the walking assist information, the notification information is presented.
- The information processing method according to claim 4, further comprising estimating whether a situation with a high possibility of a fall exists, based on at least one of dangerous behavior information in which behavior pattern information and the degree of risk are associated, and dangerous environment information in which environmental pattern information and the degree of risk are associated, wherein
in outputting the walking assist information, the notification information is presented when a situation with a high possibility of a fall is estimated to exist.
- The information processing method according to claim 1, wherein the degree of risk is the frequency with which the step height indicated by the height information is determined to be equal to or greater than the leg-lift height indicated by the leg-lift information.
- The information processing method according to claim 1, wherein in outputting the walking assist information, the walking assist information is output at the timing when the degree of risk is determined to be equal to or greater than a threshold.
- The information processing method according to claim 1, wherein the height information includes the position of the step on the floor,
in acquiring the leg-lift information, the detected leg-lift information is stored in a memory,
the method further comprising predicting the person's future leg lift during walking, based on the history of leg-lift information stored in the memory, and
identifying, based on the predicted future leg lift and the height information, a step at which the person may fall in the future.
- The information processing method according to claim 8, further comprising generating a remodeling plan for the space for reducing the possibility of a future fall at the position of the identified step, and
outputting the remodeling plan.
- The information processing method according to claim 1, further comprising presenting training information for improving the person's walking ability according to the degree of risk, and
outputting the training information.
- The information processing method according to claim 10, wherein the training information includes a training place in the space identified in advance based on the height information.
- The information processing method according to claim 10 or 11, wherein in presenting the training information, the training information is presented when the degree of risk is determined to be equal to or greater than the threshold.
- An information processing device comprising a processor, wherein the processor executes processing to:
acquire leg-lift information indicating how a person raises their legs, based on sensing data of the person;
acquire height information indicating the height of a step on the floor of a space in which the person moves;
determine a degree of risk relating to the person's walking, based on the leg-lift information and the height information;
generate walking assist information according to the degree of risk; and
output the walking assist information.
- An information processing program that causes a computer to function as an information processing device, the program causing a processor to execute processing to:
acquire leg-lift information indicating how a person raises their legs, based on sensing data of the person;
acquire height information indicating the height of a step on the floor of a space in which the person moves;
determine a degree of risk relating to the person's walking, based on the leg-lift information and the height information;
generate walking assist information according to the degree of risk; and
output the walking assist information.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023535168A JPWO2023286469A1 (ja) | 2021-07-12 | 2022-05-26 | |
CN202280044853.8A CN117546198A (zh) | 2021-07-12 | 2022-05-26 | Information processing method, information processing device, and information processing program
US18/407,098 US20240144840A1 (en) | 2021-07-12 | 2024-01-08 | Information processing method, information processing device, and non-transitory computer readable recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-115098 | 2021-07-12 | ||
JP2021115098 | 2021-07-12 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/407,098 Continuation US20240144840A1 (en) | 2021-07-12 | 2024-01-08 | Information processing method, information processing device, and non-transitory computer readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023286469A1 true WO2023286469A1 (ja) | 2023-01-19 |
Family
ID=84919976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/021650 WO2023286469A1 (ja) | 2021-07-12 | 2022-05-26 | Information processing method, information processing device, and information processing program
Country Status (4)
Country | Link |
---|---|
US (1) | US20240144840A1 (ja) |
JP (1) | JPWO2023286469A1 (ja) |
CN (1) | CN117546198A (ja) |
WO (1) | WO2023286469A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010026235A (ja) * | 2008-07-18 | 2010-02-04 | Panasonic Electric Works Co Ltd | Camera angle adjustment device |
JP2014059208A (ja) * | 2012-09-18 | 2014-04-03 | Toshiba Corp | Movement assistance device and movement assistance method |
JP6539845B2 (ja) * | 2015-03-31 | 2019-07-10 | 株式会社日本総合研究所 | Self-propelled traveling device, management device, and walking-impediment location determination system |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023286469A1 (ja) | 2023-01-19 |
CN117546198A (zh) | 2024-02-09 |
US20240144840A1 (en) | 2024-05-02 |
Legal Events
- 121 — EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22841816; Country of ref document: EP; Kind code of ref document: A1)
- WWE — WIPO information: entry into national phase (Ref document number: 2023535168; Country of ref document: JP)
- WWE — WIPO information: entry into national phase (Ref document number: 202280044853.8; Country of ref document: CN)
- NENP — Non-entry into the national phase (Ref country code: DE)
- 122 — EP: PCT application non-entry in European phase (Ref document number: 22841816; Country of ref document: EP; Kind code of ref document: A1)