WO2022224833A1 - Living assistance system - Google Patents

Living assistance system

Info

Publication number
WO2022224833A1
Authority
WO
WIPO (PCT)
Prior art keywords
resident
information
behavior
action
sensor
Prior art date
Application number
PCT/JP2022/017241
Other languages
French (fr)
Japanese (ja)
Inventor
貴史 市原
勝人 小村
拓也 工藤
祐美子 須藤
邦明 田中
和貴 山本
敦 佐々木
潤哉 杉本
Original Assignee
日立グローバルライフソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立グローバルライフソリューションズ株式会社
Publication of WO2022224833A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/22 Social work or social welfare, e.g. community support activities or counselling services
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00 Economic sectors
    • G16Y 10/60 Healthcare; Welfare

Definitions

  • The present invention relates to a living assistance system.
  • For example, Patent Literature 1 describes a person estimation unit 730 that estimates a person captured in moving image data 610 (see FIG. 3), an action and article extraction unit 740 that extracts, from the moving image data 610, the action of the estimated person 320 (see FIG. 2) and articles related to the action, and an action data generation unit 750 that generates action data 620 (see FIG. 1) based on the extracted information (see paragraph 0028); that is, it discloses generating "action data" that associates a "person" with the "person's actions."
  • The purpose of the present invention is to provide, as one form of monitoring service, a living assistance system that engages with residents so as to promote the health of residents who are not in a frail state and to help residents who are in a frail state recover to a healthy state according to their behavior.
  • To solve this problem, the living assistance system of the present invention includes a moving body that moves autonomously within a home, and an action support device that controls the moving body in accordance with resident behavior information or in-home environment information recognized on the basis of sensor information acquired by a sensor of the moving body or a sensor in the home.
  • According to the present invention, it is possible to contribute to frailty countermeasures for residents and to the recovery of frail residents to a healthy state, and to provide a useful monitoring service for residents.
  • FIG. 1 is a configuration diagram of the living assistance system of the embodiment.
  • FIG. 2 is a configuration diagram of the hardware on which the action support device of the living assistance system of the embodiment operates.
  • FIG. 3 is a flowchart of the processing of Example 1 periodically performed by the action support device.
  • FIG. 4 is a flowchart of the processing of Example 2 periodically performed by the action support device.
  • FIG. 5 is a flowchart of the processing of Example 3 periodically performed by the action support device.
  • FIG. 6 is a flowchart of the processing of Example 4 periodically performed by the action support device.
  • FIG. 7 is a flowchart of the processing of Example 5 periodically performed by the action support device.
  • FIG. 8 is a flowchart of the processing of Example 6 periodically performed by the action support device.
  • FIG. 9 is a flowchart of the processing of Example 7 periodically performed by the action support device.
  • FIG. 1 is a configuration diagram of the living assistance system and the action support device according to the embodiment.
  • The living assistance system is configured by connecting, via an in-home LAN 7: moving bodies 2 that move within the home, such as robot vacuum cleaners, robot air purifiers and other home-appliance robots, and pet robots; a plurality of in-home sensors 3, such as cameras and motion sensors, that detect environmental information such as indoor temperature, humidity, and airflow and detect the behavior of residents; household appliances such as a washing machine 4, a refrigerator 5, and an air conditioner 6; and an action support device 1.
  • The action support device 1 also communicates via the Internet 8 with a resident terminal 9a, such as the resident's smartphone, and a watcher terminal 9b of a watcher such as a family member or care supporter.
  • Although details will be described later, the action support device 1 controls the moving body 2 according to the resident's behavior information or the in-home environment information recognized on the basis of sensor information acquired by the sensor 21 of the moving body 2, which moves autonomously within the home, or by the in-home sensors 3.
  • The moving body 2 is configured to move autonomously using in-home map information and self-position detection means, moves to a designated position based on instructions from the action support device 1, and detects the resident with its onboard sensor 21. The moving body 2 can also approach the resident and communicate with them.
  • Specifically, the moving body 2 is equipped with sensors 21 such as a camera, a floor ranging sensor, a ranging sensor, a millimeter-wave radar, and a thermography sensor, and a control unit 23 controls driving means 22 such as motors based on map information 25 so that the moving body travels autonomously in the home while avoiding surrounding obstacles detected by the sensors 21.
  • The sensors 21 of the moving body 2 not only detect obstacles during travel; the camera images from the sensors 21 also make it possible to identify the resident, follow the resident's movement, turn toward the resident, detect the resident's behavior, and so on. Further, a sound sensor, a temperature sensor, a humidity sensor, an illuminance sensor, a microphone, or the like may be installed as a sensor 21 to detect in-home environmental information.
  • The moving body 2 is also equipped with a housing heater 24 that raises the surface temperature of its housing, so that the temperature can be adjusted when the resident touches it and so that it can indicate that the resident's body temperature is high.
  • A speaker or a light-emitting unit may also be installed to output messages and serve as a human interface with the resident.
  • The moving body 2 notifies the action support device 1 of sensor information detected by the sensors 21, such as camera images and temperature distribution images, so that the resident's behavior information and environmental information can be detected, and it moves to a designated position in the home in response to movement instructions from the action support device 1.
  • The action support device 1 comprises: an action recognition unit 11 that recognizes the resident's actions and obtains behavior information; a time-series behavior information storage unit 12 that stores the resident's behavior information recognized by the action recognition unit 11 in time series; a moving body instruction unit 13 that analyzes the resident's state from the current behavior information recognized by the action recognition unit 11 and the time-series behavior information in the time-series behavior information storage unit 12, and instructs the moving body 2 to perform an action that encourages behavior change by the resident or a support action; and a watching information notification unit 14 that notifies the terminal of the resident's watcher of information on the resident's behavior.
  • The person identification unit 111 of the action recognition unit 11 identifies the resident (obtains person identification information) by recognizing the resident's face image captured by the sensor 21 (camera) of the moving body 2. Identification of the resident is not limited to face authentication and may be performed by biometric authentication such as iris authentication using the face image.
  • The human behavior detection unit 112 of the action recognition unit 11 estimates, for each resident identified by the person identification unit 111, the resident's actions on the basis of the sensor information of the sensor 21 of the moving body 2 and of the in-home sensors 3, and obtains behavior information for that specific resident. Specifically, the behavior information is obtained by recognizing the person's movements from changes in the positions of body parts such as the resident's head, hands, arms, feet, and fingers, and from changes in body orientation and posture, and estimating actions such as "getting up," "lying down," and "sitting" from combinations of the recognized movements.
  • The human behavior detection unit 112 then identifies the bedroom, the kitchen, and day or night from the acquisition position and acquisition time of the sensor information, and estimates actions such as "waking up," "going to bed," "sitting at the table," and "relaxing" as behavior information.
  • Furthermore, the human behavior detection unit 112 acquires in-home environmental information based on the sensor information from the sensor 21 of the moving body 2 and from the in-home sensors 3, so that the environmental information can be referenced when analyzing the resident's behavior. Specifically, the operating states of the washing machine 4 and the air conditioner 6 are acquired as environmental information.
  • The human behavior detection unit 112 also obtains the resident's amount of activity, walking speed, moving speed, standing-up speed, and falling state from the recognized movements and uses them as behavior information.
  • The time-series behavior information storage unit 12 is a storage unit that stores, in chronological order, the resident's behavior information recognized by the action recognition unit 11 for each resident (person identification information) identified by the person identification unit 111.
  • The action recognition unit 11 periodically obtains the resident's behavior information, and the recognition time and behavior information are recorded sequentially. Alternatively, the recognition time and behavior information may be recorded in the time-series behavior information storage unit 12 only when the behavior information obtained by the action recognition unit 11 changes.
  • In the following, the behavior information in the time-series behavior information storage unit 12 is called time-series behavior information, and the most recent behavior information recognized by the action recognition unit 11 is called current behavior information.
  • The behavior estimation unit 131 of the moving body instruction unit 13 compares the time-series behavior information and the current behavior information for each resident and analyzes the resident's state. The behavior estimation unit 131 also analyzes the resident's state taking into account the environmental information acquired based on the sensor information of the sensor 21 of the moving body 2 and of the in-home sensors 3.
  • The instruction information generation unit 132 generates, according to the analysis result of the behavior estimation unit 131, instruction information for an action of the moving body 2 that encourages behavior change by the resident or a support action.
  • The instruction information notification unit 133 notifies the moving body 2 of the instruction information generated by the instruction information generation unit 132 via the in-home LAN 7.
  • The watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b, via the Internet 8, of information related to the analysis result of the behavior estimation unit 131 or of resident information acquired by instructing the moving body 2.
  • FIG. 2 is a configuration diagram of hardware on which the action support device 1 of FIG. 1 operates.
  • The action support device 1 is configured as a computer 900 having a CPU 901, a RAM 902, a ROM 903, an HDD 904, a communication I/F 905, an input/output I/F 906, and a media I/F 907.
  • The communication I/F 905 is connected to an external communication device 915.
  • The input/output I/F 906 is connected to an input/output device 916.
  • The media I/F 907 reads and writes data from and to a recording medium 917.
  • The CPU 901 controls each processing unit by executing a program (also called an application or app) read into the RAM 902. This program can be distributed via a communication line or recorded on a recording medium 917 such as a CD-ROM for distribution.
  • The CPU 901 executes the program to implement the functions of the action recognition unit 11, the moving body instruction unit 13, and the watching information notification unit 14, and the time-series behavior information storage unit 12 is configured in the HDD 904.
  • In this embodiment, the time-series behavior information storage unit 12 stores behavior information such as the amount of activity, walking speed, moving speed, standing-up speed, and falls as time-series behavior information for each resident.
  • By comparing the time-series behavior information and the current behavior information, the behavior estimation unit 131 can analyze a lack of exercise or an increase in exercise, for example an increase or decrease of ○% in exercise time compared with ○ months ago, or an increase or decrease of ○% in frequency compared with ○ months ago.
  • The action support device 1 determines that exercise is lacking or has increased when the amount of change in the behavior information is equal to or greater than the threshold. According to the analysis result of the resident's lack of exercise or increase in exercise, the action support device 1 generates, as instruction information, a voice action in which the moving body 2 encourages exercise, for example saying "You should do gymnastics.", and notifies the moving body 2 of the instruction information. Alternatively, instruction information may be sent so that the moving body 2 itself performs an exercise-like movement to prompt the resident to change their behavior.
  • The action support device 1 also notifies the resident terminal 9a or the watcher terminal 9b of the change in the resident's behavior information and the need for behavior change as watching information. Furthermore, the resident terminal 9a or the watcher terminal 9b may be notified, as watching information, of the result of the behavior change brought about by the action of the moving body 2.
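The percentage-based comparison described above can be pictured with a small sketch. The following Python fragment is not from the patent; the activity metric, the 30% threshold, and the roughly three-month-old baseline are placeholder assumptions standing in for the unspecified ○% and ○-month values.

```python
from statistics import mean

def activity_change_ratio(current_values, baseline_values):
    """Relative change of the recent average against a baseline average
    (e.g. the same metric recorded about three months earlier)."""
    baseline = mean(baseline_values)
    if baseline == 0:
        return 0.0
    return (mean(current_values) - baseline) / baseline

def classify_exercise_trend(current_values, baseline_values, threshold=0.30):
    """Return 'lack_of_exercise', 'increased_exercise' or None depending on
    whether the change exceeds the (assumed) 30% threshold."""
    ratio = activity_change_ratio(current_values, baseline_values)
    if ratio <= -threshold:
        return "lack_of_exercise"
    if ratio >= threshold:
        return "increased_exercise"
    return None

# Example: daily activity scores now vs. roughly three months ago.
print(classify_exercise_trend([2100, 1900, 2000], [3200, 3100, 3000]))  # lack_of_exercise
```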
  • FIG. 3 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment.
  • In step S31, the action recognition unit 11 acquires sensor information from the moving body 2 and the in-home sensors 3.
  • In step S32, the person identification unit 111 recognizes the face image in the sensor information, obtains the resident's person identification information, and identifies the resident.
  • In step S33, the human behavior detection unit 112 estimates the behavior of each resident identified by the person identification information based on the sensor information of the moving body 2 and the in-home sensors 3, and obtains the behavior information (current behavior information).
  • In step S34, the time-series behavior information storage unit 12 accumulates the behavior information acquired in step S33 as time-series behavior information.
  • In step S35, the behavior estimation unit 131 compares the time-series behavior information and the current behavior information for each resident and analyzes the amount of change in the resident's behavior information.
  • In step S36, the instruction information generation unit 132 determines whether or not the amount of change between the time-series behavior information and the current behavior information is equal to or greater than the threshold. If it is (Yes in S36), the resident is judged to lack exercise or to have increased exercise, and the process proceeds to step S37. If the amount of change is smaller than the threshold (No in S36), the process ends.
  • In step S37, the instruction information generation unit 132 sets, as the instruction information for the moving body 2, a voice action (for example, "You should do gymnastics.") that encourages a behavior change to overcome the lack of exercise, according to the analysis result of the resident's lack of exercise or increase in exercise.
  • In step S38, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S37.
  • At this time, the moving body 2 may also be instructed to move to the resident's location so that it acts in the vicinity of the resident.
  • In step S39, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b of the amount of change in the resident's behavior information and the need for behavior change as watching information, and ends the process.
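For readers who prefer code to flowcharts, the loop of FIG. 3 (steps S31 to S39) can be condensed into the sketch below. It is a hedged illustration only: `TimeSeriesStore`, the scalar activity score, and the 30% threshold are assumptions; the actual units 11, 12, and 131 to 133 operate on much richer sensor data.

```python
from dataclasses import dataclass, field

@dataclass
class TimeSeriesStore:
    """Minimal stand-in for the time-series behavior information storage unit 12."""
    records: dict = field(default_factory=dict)

    def append(self, resident_id, activity_score):
        self.records.setdefault(resident_id, []).append(activity_score)

    def change_amount(self, resident_id, current_score):
        history = self.records.get(resident_id, [])
        if len(history) < 2:
            return 0.0
        baseline = sum(history[:-1]) / (len(history) - 1)
        return (current_score - baseline) / baseline if baseline else 0.0

def periodic_cycle(sensor_info, store, notify_mobile, notify_watcher, threshold=0.30):
    resident_id = sensor_info["face_id"]                  # S32: person identification (face recognition assumed done upstream)
    current = sensor_info["activity_score"]               # S33: current behavior information (here just an activity score)
    store.append(resident_id, current)                    # S34: accumulate time-series behavior information
    change = store.change_amount(resident_id, current)    # S35: compare time series vs. current
    if abs(change) < threshold:                           # S36: change below threshold -> end
        return
    notify_mobile({"action": "speak", "text": "You should do gymnastics."})   # S37/S38
    notify_watcher({"resident": resident_id, "change": change})               # S39

store = TimeSeriesStore()
for score in (3000, 3100, 2900):
    store.append("resident_A", score)
periodic_cycle({"face_id": "resident_A", "activity_score": 1800},
               store, notify_mobile=print, notify_watcher=print)
```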
  • As described above, the living assistance system prompts the resident to maintain a certain amount of exercise, and can therefore help maintain the resident's health.
  • In this embodiment, since time-series behavior information for each resident is stored in the time-series behavior information storage unit 12, the resident's life rhythm can be recognized (for example, every Sunday: wake up, turn on the microwave in the kitchen, then sit at the dining table in the living room and have breakfast). Also, from the time-series behavior information and the current behavior information, a sign of frailty can be detected by analyzing a decrease in the resident's walking speed or amount of physical activity.
  • When the action support device 1 recognizes that the resident's estimated action involves movement, it moves the moving body 2 ahead of the resident to the destination (for example, the dining table) to support the resident. In particular, when a sign of frailty is detected in the resident, the moving body 2 is instructed to perform an operation of photographing the resident's mouth after the move, so that the resident's state can be confirmed. In addition to the mouth, body movements in which signs of frailty appear may also be photographed.
  • Furthermore, the action support device 1 instructs the moving body 2 to confirm that the resident arrives at the destination after the moving body 2 does, and to perform a predetermined operation (such as a non-arrival notification) if the resident does not arrive within a predetermined time.
  • The arrival of the resident is detected by the in-home sensor 3 (motion sensor) or by image recognition with the sensor 21 (camera) of the moving body 2.
  • The action support device 1 then notifies the resident terminal 9a or the watcher terminal 9b of the confirmation result of the resident's movement and the photographed image of the resident as watching information.
  • FIG. 4 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 4 are the same as those in FIG. 3, so their description is omitted here.
  • In step S41, the instruction information generation unit 132 analyzes the time-series behavior information and the current behavior information to determine, from the resident's life rhythm, whether or not the estimated future behavior will involve movement; if it will not (No in S41), the process ends.
  • In step S42, the instruction information generation unit 132 analyzes decreases in the resident's walking speed and amount of physical activity from the time-series behavior information and the current behavior information, and determines whether the resident shows signs of frailty.
  • In step S43, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, a photographing operation that includes the resident's mouth after the move.
  • In step S44, the instruction information generation unit 132 estimates the destination from the time-series behavior information and the current behavior information. For example, if, within the life rhythm recognized from the time-series behavior information, it is estimated from the current behavior information (the range is on in the kitchen) that the resident will next "sit at the dining table in the living room and have breakfast," the living room is estimated as the destination.
  • In step S45, the instruction information generation unit 132 sets movement to the estimated destination (the living room) in the instruction information for the moving body 2.
  • The instruction information for the moving body 2 is notified to the moving body 2 in step S47, described later.
  • In step S46, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an operation for confirming the resident's movement: confirming that the resident arrives at the destination after the moving body 2 does, and performing a predetermined operation (such as a non-arrival notification) if the resident does not arrive within a predetermined time.
  • In step S47, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in steps S43, S45, and S46.
  • In step S48, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b of the confirmation result of the resident's movement and the photographed image of the resident as watching information, and ends the process.
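The destination estimation and frailty-sign check of steps S41 to S47 could look roughly like the sketch below. The rhythm table, the 20% walking-speed drop, and all function names are assumptions for illustration, not the patent's implementation.

```python
LIFE_RHYTHM = [  # assumed rhythm learned from time-series behavior information
    ("wake_up", "bedroom"),
    ("turn_on_microwave", "kitchen"),
    ("eat_breakfast", "living_room"),
]

def estimate_destination(current_action):
    """Return the location of the action that usually follows the current one."""
    actions = [a for a, _ in LIFE_RHYTHM]
    if current_action in actions:
        idx = actions.index(current_action)
        if idx + 1 < len(LIFE_RHYTHM):
            return LIFE_RHYTHM[idx + 1][1]
    return None

def frailty_sign(past_walking_speeds, current_walking_speed, drop_ratio=0.2):
    """Flag a possible sign of frailty when walking speed dropped by the assumed ratio."""
    baseline = sum(past_walking_speeds) / len(past_walking_speeds)
    return current_walking_speed < baseline * (1.0 - drop_ratio)

destination = estimate_destination("turn_on_microwave")            # -> "living_room" (S44)
instructions = [{"action": "move", "to": destination}]             # S45
if frailty_sign([1.1, 1.0, 1.05], 0.7):                             # S42
    instructions.append({"action": "photograph", "target": "mouth"})  # S43
instructions.append({"action": "confirm_arrival", "timeout_s": 600})  # S46
print(instructions)                                                  # S47: notified to the moving body
```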
  • As described above, the living assistance system can anticipate the resident's estimated destination, so the moving body 2 can complement the in-home sensors 3 in grasping the resident's state.
  • By photographing the mouth of a resident who shows signs of frailty, it is possible to monitor whether the resident aspirates.
  • Furthermore, by analyzing the number of chewing motions and the chewing speed from the photographed images, it is possible to determine whether or not the resident's muscles are weakening.
  • In this way, the action support device 1 anticipates the resident's destination and instructs the moving body 2 to confirm the resident's movement and to perform the photographing operation that includes the resident's mouth.
  • In addition, the action support device 1 may detect the room temperature at the destination with the sensor 21 (temperature sensor, thermography) of the moving body 2 and control an air conditioner installed at the destination.
  • Alternatively, the moving body 2 may perform an action that prompts the resident who has moved there to operate the air conditioner.
  • In this embodiment, since the time-series behavior information storage unit 12 stores time-series behavior information for each resident, changes in the resident's physical condition (for example, fever, a decrease in meal frequency, or not getting up) can be recognized from the current behavior information.
  • When the action support device 1 determines that the resident is in poor physical condition, it instructs the moving body 2 to measure the resident's condition with the sensor 21 (for example, thermography or camera) mounted on the moving body 2. In addition, the action support device 1 instructs the moving body 2 to change the set temperature of its housing heater 24 according to the measured body temperature of the resident.
  • Furthermore, when the action support device 1 determines that the resident is in poor physical condition, it notifies the resident terminal 9a or the watcher terminal 9b of the resident's physical condition information as watching information.
  • FIG. 5 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 5 are the same as those in FIG. 3, so their description is omitted here.
  • In step S51, the instruction information generation unit 132 determines whether or not the resident is in poor physical condition by analyzing the time-series behavior information and the current behavior information; if not, the process ends.
  • In step S52, the instruction information generation unit 132 sets, in the instruction information, an operation in which the moving body 2 measures the resident's condition with its onboard sensor 21 (thermography, camera). This allows the action support device 1 to learn the resident's body temperature and observe the resident's state.
  • In step S53, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S52.
  • In step S54, when the resident's body temperature is reported from the moving body 2, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, a change of the set temperature of the housing heater 24 according to the resident's body temperature. For example, if the resident's body temperature is higher than normal, the surface temperature of the moving body 2 is raised.
  • In step S55, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S54.
  • In step S56, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b of the resident's physical condition information as watching information.
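A compact sketch of the Example 3 decisions follows. The unwellness heuristic, the temperature values, and the heater setpoints are assumed placeholders; the patent only specifies that the housing heater 24 setting is changed according to the measured body temperature.

```python
NORMAL_BODY_TEMP_C = 36.5

def looks_unwell(behavior_today, usual_behavior):
    """Assumed heuristic: fewer meals than usual, or the resident did not get up."""
    return (behavior_today.get("meals", 0) < usual_behavior.get("meals", 3)
            or not behavior_today.get("got_up", True))

def heater_setpoint(body_temp_c):
    """Raise the housing surface temperature when the resident's temperature is elevated."""
    return 40.0 if body_temp_c > NORMAL_BODY_TEMP_C + 0.5 else 30.0

if looks_unwell({"meals": 1, "got_up": True}, {"meals": 3}):                        # S51
    print({"action": "measure_body_temperature", "sensor": "thermography"})          # S52/S53
    measured = 37.8                                    # value reported back by the moving body
    print({"action": "set_housing_heater", "temp_c": heater_setpoint(measured)})     # S54/S55
    print({"watching_info": "possible fever", "body_temp_c": measured})              # S56
```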
  • As described above, the living assistance system can report the resident's poor physical condition (fever) without forcing the resident to take any action.
  • In this embodiment, the moving body 2, which anticipates the resident's destination, photographs the resident, and confirms the movement, may also be made to provide living assistance. Specifically, when meal preparation is estimated as the resident's upcoming action from the current behavior information, based on the life rhythm recognized from the time-series behavior information, the action support device 1 moves the moving body 2 to the kitchen ahead of the meal preparation time, has it photograph the opening and closing of the door of the refrigerator whose position is recorded in the map information 25, or photograph the inside of the refrigerator, and can thereby provide assistance such as foodstuff management and automatic ordering.
  • The position information of the refrigerator installed in the kitchen is acquired by photographing, with the sensor 21 (camera), a QR code attached to the refrigerator that indicates its identification information. More specifically, the refrigerator's identification information is recognized by analyzing the photographed image of the QR code, and the position information of the refrigerator is calculated.
  • The refrigerator whose position information has been identified in this way is recorded in the map information 25 of the moving body 2.
  • The installation positions of other household appliances in the home, such as televisions, air conditioners, and microwave ovens, can be acquired in the same way.
  • Alternatively, the resident may input the installation position of each home appliance into the map information 25 via the resident terminal 9a without using a QR code.
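The QR-code-based registration described above might be sketched as follows. `decode_qr` is a hypothetical stand-in (a library such as pyzbar or OpenCV's QRCodeDetector could play that role), and using the robot's own pose as the appliance position is a simplification of the position calculation mentioned in the text.

```python
from dataclasses import dataclass

@dataclass
class MapInfo:
    appliances: dict

    def register(self, appliance_id, position):
        self.appliances[appliance_id] = position

def decode_qr(camera_image):
    """Hypothetical stand-in for an actual QR decoder."""
    return camera_image.get("qr_payload")  # e.g. "refrigerator_5"

def register_appliance(map_info, camera_image, robot_pose):
    appliance_id = decode_qr(camera_image)
    if appliance_id:
        # Simplification: take the robot's pose when the code was read as the appliance position.
        map_info.register(appliance_id, robot_pose)

m = MapInfo(appliances={})
register_appliance(m, {"qr_payload": "refrigerator_5"}, robot_pose=(3.2, 1.4))
print(m.appliances)  # {'refrigerator_5': (3.2, 1.4)}
```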
  • FIG. 6 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 6 are the same as those in FIG. 3, so their description is omitted here.
  • In step S61, the instruction information generation unit 132 analyzes the time-series behavior information and the current behavior information to determine whether or not the future behavior estimated from the resident's life rhythm is a move to the kitchen; if it is not (No in S61), the process ends.
  • In step S62, the instruction information generation unit 132 sets a movement operation to the kitchen in the instruction information for the moving body 2.
  • In step S63, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, a photographing operation of the opening and closing of the door of the refrigerator installed in the kitchen, or a photographing operation of the inside of the refrigerator.
  • In step S64, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in steps S62 and S63.
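Steps S61 to S64 reduce to a very small decision, sketched below with assumed names.

```python
def kitchen_instructions(estimated_next_action, fridge_position):
    if estimated_next_action != "move_to_kitchen":      # S61: not a move to the kitchen -> end
        return []
    return [
        {"action": "move", "to": "kitchen"},            # S62
        {"action": "photograph", "target": "refrigerator_door", "at": fridge_position},  # S63
    ]

print(kitchen_instructions("move_to_kitchen", fridge_position=(3.2, 1.4)))  # S64: sent to moving body 2
```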
  • As described above, the living assistance system can provide residents with living assistance such as foodstuff management and automatic ordering.
  • In the embodiments described above, instruction information is notified to the moving body 2 based on the resident's behavior information.
  • In this embodiment, a case of notifying the moving body 2 of instruction information based additionally on environmental information will be described.
  • When the action support device 1 acquires the operating state of the washing machine 4 as environmental information and estimates that washing has ended, it obtains the resident's estimated behavior from the time-series behavior information and the current behavior information, determines whether or not the resident's behavior can be changed to hanging the laundry, and, if the change is possible, has the moving body 2 perform an action that prompts the resident to change their behavior and hang the laundry.
  • FIG. 7 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S34 in FIG. 7 are the same as those in FIG. 3, so their description is omitted here.
  • Next, for each resident identified by the person identification information, the human behavior detection unit 112 estimates the operating state of the washing machine 4 installed in the home on the basis of the sensor information of the moving body 2 and the in-home sensors 3, and acquires environmental information (current environmental information) such as the end of washing.
  • For example, the operating sound and notification sound of the washing machine 4 are recognized from the indoor environmental sound detected by the in-home sensor 3 (microphone), and the washing state and the end of washing are estimated.
  • In step S72, the behavior estimation unit 131 compares the time-series behavior information for each resident with the current behavior information and environmental information, and analyzes the resident's state.
  • In step S73, the instruction information generation unit 132 determines whether or not the current environmental information indicates that washing has ended; if washing has not ended (No in S73), the process ends.
  • In step S74, the instruction information generation unit 132 determines whether the resident's estimated future behavior, analyzed based on the time-series behavior information and the current behavior information, is the behavior of hanging the laundry; if it is (Yes in S74), the process ends.
  • In step S75, the instruction information generation unit 132 determines whether the resident's current behavior can be changed to hanging the laundry; if the change is not possible (No in S75), the process ends.
  • In this determination, the priorities of the current action and of hanging the laundry are compared, and if the current action has the higher priority, the process ends. For example, if the current behavior is a relaxed state, it is determined that continuing the relaxed state has the higher priority, and the process ends. Whether or not to give priority to the relaxed state may be decided from the resident's past behavior information stored in the time-series behavior information storage unit 12; for example, if the resident has not been in a relaxed state recently, the relaxed state may be prioritized.
  • In step S76, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an action that prompts the resident to change their behavior and hang the laundry. Specifically, the moving body 2 is made to output the message "Washing is finished. Please hang the laundry to dry."
  • The instruction information notification unit 133 then notifies the moving body 2 of the instruction information set in step S76.
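The laundry prompt of steps S73 to S76, including the priority check, can be sketched as follows. The priority table and its values are assumptions; the patent only states that a relaxed state may be given priority over hanging the laundry.

```python
PRIORITY = {"relaxing": 2, "watching_tv": 1, "idle": 0}  # assumed priority table

def laundry_prompt(washing_finished, estimated_next_action, current_action):
    if not washing_finished:                              # S73: washing has not ended
        return None
    if estimated_next_action == "hang_laundry":           # S74: the resident will do it anyway
        return None
    if PRIORITY.get(current_action, 0) > PRIORITY.get("hang_laundry", 1):  # S75: do not interrupt
        return None
    return {"action": "speak",                            # S76
            "text": "Washing is finished. Please hang the laundry to dry."}

print(laundry_prompt(True, "watch_tv", "idle"))
print(laundry_prompt(True, "watch_tv", "relaxing"))  # None: relaxing is given priority
```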
  • As described above, the living assistance system can prevent the resident from forgetting to take the laundry out of the washing machine 4.
  • The action support device 1 may likewise be applied to a clothes dryer to prevent the resident from forgetting to take the dried clothes out of the dryer when drying ends.
  • The action support device 1 may also be applied to a microwave oven to prevent the resident from forgetting to take the heated food out of the microwave oven when heating ends.
  • In the following, operations of home appliances such as a washing machine, a clothes dryer, and a microwave oven are referred to as home appliance tasks, and the end of washing, the end of drying, the end of heating, and the like are referred to as home appliance task completion.
  • FIG. 8 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S34 in FIG. 8 are the same as those in FIG. 3, so their description is omitted here.
  • Next, for each resident identified by the person identification information, the human behavior detection unit 112 estimates the operating state of the air conditioner 6 installed in the home on the basis of the sensor information of the moving body 2 and the in-home sensors 3, and acquires environmental information (current environmental information) such as whether the air conditioner is in operation. For example, the operating sound of the air conditioner 6 is recognized from the room temperature and the environmental sound detected by the in-home sensor 3 (microphone, temperature sensor), and the cooling or heating operating state is estimated.
  • In step S82, the behavior estimation unit 131 compares the time-series behavior information for each resident with the current behavior information and environmental information, and analyzes the resident's state.
  • In step S83, the instruction information generation unit 132 determines whether the resident's estimated future behavior, analyzed based on the time-series behavior information and the current behavior information, is behavior related to operating the air conditioner 6, and whether that behavior is already reflected in the current operating state of the air conditioner 6 indicated by the current environmental information. Specifically, if the estimated action is to turn on cooling and the air conditioner 6 is already performing the cooling operation, the process ends.
  • In step S84, the instruction information generation unit 132 determines whether or not the resident's current behavior is a relaxed state; if it is (Yes in S84), the process ends. In other words, if the resident is currently relaxing, step S85 does not instruct the moving body 2 to prompt the behavior change of operating the air conditioner 6.
  • In step S85, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an action that prompts the resident to operate the air conditioner. Specifically, the moving body 2 outputs the message "Please operate the air conditioner."
  • The instruction information notification unit 133 then notifies the moving body 2 of the instruction information set in step S85.
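The corresponding checks for the air conditioner (steps S83 to S85) might look like this sketch; the state names are assumptions.

```python
def aircon_prompt(estimated_action, aircon_state, current_action):
    if estimated_action not in ("cooling", "heating"):        # S83: not an air-conditioner-related action
        return None
    if estimated_action == aircon_state:                      # S83: already operating as needed
        return None
    if current_action == "relaxing":                          # S84: do not disturb a relaxed resident
        return None
    return {"action": "speak", "text": "Please operate the air conditioner."}  # S85

print(aircon_prompt("cooling", "off", "cooking"))
print(aircon_prompt("cooling", "cooling", "cooking"))  # None
```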
  • In this embodiment, the action support device 1 compares the estimated behavior derived from the resident's time-series behavior information with the current behavior, and if the behavior differs or is abnormal, it instructs the moving body 2 to photograph the resident's behavior with the sensor 21 of the moving body 2.
  • For example, the action support device 1 compares the estimated behavior of "sleeping," derived from the time-series behavior information of relaxing at night, then moving to the bedroom, then sleeping, with the recognized current behavior information of "going to the toilet," such as waking up outside of normal sleeping hours or trying to get out of bed; because the behavior differs, it instructs the moving body 2 to photograph the resident's behavior with the sensor 21 (camera) of the moving body 2.
  • Furthermore, the action support device 1 compares the resident's behavior information photographed by instructing the moving body 2 with the resident's past behavior information, and when it determines that there is a certain change over time (decline or improvement), it notifies the communication terminal of the resident or of the resident's watcher to that effect.
  • A certain change over time is, for example, a change in posture motion or walking motion.
  • FIG. 9 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 9 are the same as those in FIG. 3, so their description is omitted here.
  • In step S91, the instruction information generation unit 132 determines whether or not the resident's estimated behavior, derived from the time-series behavior information, differs from the current behavior information; if it does not differ, the process ends. Specifically, the estimated behavior of "sleeping" is compared with the current behavior information of "going to the toilet."
  • In step S92, the instruction information generation unit 132 sets an operation of photographing the resident in the instruction information for the moving body 2.
  • In step S93, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S92.
  • In step S94, the action recognition unit 11 acquires the sensor information captured by the sensor 21 (camera) of the moving body 2 as an image of the resident's behavior.
  • In step S95, the behavior estimation unit 131 compares the resident's behavior information acquired in step S94 with the resident's past behavior information.
  • In step S96, the instruction information generation unit 132 judges the resident's frailty risk from this comparison of behavior information; if it determines that there is no constant change over time (decline or improvement) (No in S96), the process ends.
  • In step S97, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b of the resident's change over time as watching information.
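A hedged sketch of the Example 7 cycle is shown below. The scalar "motion score" and the 15% change threshold are invented for illustration; the patent leaves the measure of "a certain change over time" (for example in posture or walking motion) unspecified.

```python
def example7_cycle(estimated_action, current_action, past_motion_scores, request_photo):
    if estimated_action == current_action:                 # S91: behavior as expected -> end
        return None
    observed = request_photo()                             # S92-S94: photograph and analyse the resident
    baseline = sum(past_motion_scores) / len(past_motion_scores)
    change = (observed - baseline) / baseline if baseline else 0.0  # S95: compare with past behavior
    if abs(change) < 0.15:                                 # S96: no constant change over time
        return None
    return {"watching_info": "change_over_time", "change": round(change, 2)}  # S97

result = example7_cycle("sleeping", "going_to_toilet",
                        past_motion_scores=[1.0, 1.05, 0.95],
                        request_photo=lambda: 0.7)
print(result)  # {'watching_info': 'change_over_time', 'change': -0.3}
```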
  • In this way, living assistance for the resident can be provided through the moving body 2 on the basis of the time-series behavior information and the current behavior information.
  • For example, with a robot cleaner as the moving body 2, cleaning route information is accumulated, the position of litter that could cause the resident to fall is identified from the detection status of obstacles during travel, and the litter is photographed by the onboard sensor 21 (camera). The moving body 2 then calls out to the resident to draw attention to it. In addition, the resident terminal 9a or the watcher terminal 9b is notified of the position information of the litter and the photographed image as watching information.
  • Furthermore, the change in the degree of dirtiness in the house is obtained from the time-series information on the amount of dust collected by the robot cleaner serving as the moving body 2, and when the degree of dirtiness increases, it is determined that the resident shows a sign of frailty.
  • Similarly, the change in how well the house is tidied up is obtained from the time-series information on obstacle detection by the robot cleaner, and when the degree of tidiness declines, it is determined that the resident shows signs of frailty.
  • The resident terminal 9a or the watcher terminal 9b is then notified of the degree of dirtiness and the degree of tidiness as watching information.
  • The present invention is not limited to the above-described examples and includes various modifications.
  • The above embodiments have been described in detail to facilitate understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Theoretical Computer Science (AREA)
  • Child & Adolescent Psychology (AREA)
  • Bioethics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Accounting & Taxation (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

This living assistance system is provided with a mobile body (2) that moves autonomously inside a house, and a behavior assistance device (1) that controls the mobile body in accordance with behavior information about a resident or environment information about the inside of the house, recognized on the basis of sensor information from a sensor (21) of the mobile body or a sensor (3) inside the house. In particular, the behavior assistance device is provided with a behavior recognition unit that recognizes, as current behavior information, the resident's current behavior on the basis of the sensor information and sequentially stores the recognized behavior information as time-series behavior information, and a mobile body instruction unit that determines that the current behavior information has changed with respect to the time-series behavior information and instructs the mobile body to carry out an operation that encourages a modification of the resident's behavior. As one form of watching service, the system thereby engages with the resident so as to take frailty countermeasures for the resident, or to restore a resident in a frail state to a healthy state.

Description

Living assistance system
The present invention relates to a living assistance system.
Elderly people decline from a healthy state into a state requiring nursing care. In modern medicine, the intermediate state between being healthy and requiring nursing care is called "frailty," and it is said that frail elderly people can be restored to a healthy state through appropriate intervention.
Currently, as part of home security and monitoring services, various sensor devices are installed in homes and can detect the behavior of people in the home. For example, Patent Literature 1 describes a person estimation unit 730 that estimates a person captured in moving image data 610 (see FIG. 3), an action and article extraction unit 740 that extracts, from the moving image data 610, the action of the estimated person 320 (see FIG. 2) and articles related to the action, and an action data generation unit 750 that generates action data 620 (see FIG. 1) based on the extracted information (see paragraph 0028); that is, it discloses generating "action data" that associates a "person" with the "person's actions."
Patent Literature 1: JP 2019-185184 A
According to the above prior art, the behavior of people in the home can be detected; however, it merely discloses providing information about the behavior to a contact associated with the "action data," and recovery from a frail state to a healthy state cannot be expected.
The purpose of the present invention is to provide, as one form of monitoring service, a living assistance system that engages with residents so as to promote the health of residents who are not in a frail state and to help residents who are in a frail state recover to a healthy state according to their behavior.
To solve this problem, the living assistance system of the present invention includes a moving body that moves autonomously within a home, and an action support device that controls the moving body in accordance with resident behavior information or in-home environment information recognized on the basis of sensor information acquired by a sensor of the moving body or a sensor in the home.
According to the present invention, it is possible to contribute to frailty countermeasures for residents and to the recovery of frail residents to a healthy state, and to provide a useful monitoring service for residents.
FIG. 1 is a configuration diagram of the living assistance system of the embodiment.
FIG. 2 is a configuration diagram of the hardware on which the action support device of the living assistance system of the embodiment operates.
FIG. 3 is a flowchart of the processing of Example 1 periodically performed by the action support device.
FIG. 4 is a flowchart of the processing of Example 2 periodically performed by the action support device.
FIG. 5 is a flowchart of the processing of Example 3 periodically performed by the action support device.
FIG. 6 is a flowchart of the processing of Example 4 periodically performed by the action support device.
FIG. 7 is a flowchart of the processing of Example 5 periodically performed by the action support device.
FIG. 8 is a flowchart of the processing of Example 6 periodically performed by the action support device.
FIG. 9 is a flowchart of the processing of Example 7 periodically performed by the action support device.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a configuration diagram of the living assistance system and the action support device of the embodiment.
The living assistance system is configured by connecting, via an in-home LAN 7: moving bodies 2 that move within the home, such as robot vacuum cleaners, robot air purifiers and other home-appliance robots, and pet robots; a plurality of in-home sensors 3, such as cameras and motion sensors, that detect environmental information such as indoor temperature, humidity, and airflow and detect the behavior of residents; household appliances such as a washing machine 4, a refrigerator 5, and an air conditioner 6; and an action support device 1. In addition, the action support device 1 communicates via the Internet 8 with a resident terminal 9a, such as the resident's smartphone, and a watcher terminal 9b of a watcher such as a family member or care supporter.
Although details will be described later, the action support device 1 controls the moving body 2 according to the resident's behavior information or the in-home environment information recognized on the basis of sensor information acquired by the sensor 21 of the moving body 2, which moves autonomously within the home, or by the in-home sensors 3.
Next, the configurations of the moving body 2 and the action support device 1 will be described with reference to FIG. 1.
The moving body 2 is configured to move autonomously using in-home map information and self-position detection means, moves to a designated position based on instructions from the action support device 1, and detects the resident with its onboard sensor 21. The moving body 2 can also approach the resident and communicate with them.
Specifically, the moving body 2 is equipped with sensors 21 such as a camera, a floor ranging sensor, a ranging sensor, a millimeter-wave radar, and a thermography sensor, and a control unit 23 controls driving means 22 such as motors based on map information 25 so that the moving body travels autonomously in the home while avoiding surrounding obstacles detected by the sensors 21.
The sensors 21 of the moving body 2 not only detect obstacles during travel; the camera images from the sensors 21 also make it possible to identify the resident, follow the resident's movement, turn toward the resident, detect the resident's behavior, and so on. Furthermore, a sound sensor, a temperature sensor, a humidity sensor, an illuminance sensor, a microphone, or the like may be installed as a sensor 21 to detect in-home environmental information.
The moving body 2 is also equipped with a housing heater 24 that raises the surface temperature of its housing, so that the temperature can be adjusted when the resident touches it and so that it can indicate that the resident's body temperature is high. A speaker or a light-emitting unit (not shown) may also be installed to output messages and serve as a human interface with the resident.
The moving body 2 notifies the action support device 1 of sensor information detected by the sensors 21, such as camera images and temperature distribution images, so that the resident's behavior information and environmental information can be detected, and it moves to a designated position in the home in response to movement instructions from the action support device 1.
 行動支援装置1は、居住者の行動を認識して行動情報を求める行動認識部11と、行動認識部11で認識した居住者の行動情報を時系列に記憶する時系列行動情報記憶部12と、行動認識部11で認識した居住者の現在の行動情報と時系列行動情報記憶部12の時系列行動情報とから居住者の状態を分析し、居住者の行動変容を促す動作又は支援動作を移動体2に指示する移動体指示部13と、居住者の見守り者の端末に居住者の行動情報に関する情報を通知する見守り情報通知部14と、から構成する。 The action support device 1 includes an action recognition unit 11 that recognizes actions of a resident and obtains action information, and a time-series action information storage unit 12 that stores the resident action information recognized by the action recognition unit 11 in time series. Analyzes the state of the resident from the current behavior information of the resident recognized by the behavior recognition unit 11 and the time-series behavior information of the time-series behavior information storage unit 12, and selects an action that encourages behavior change of the resident or a support action. It is composed of a moving object instruction unit 13 for instructing the moving object 2 and a watching information notification unit 14 for notifying the terminal of the watching person of the resident of information on the behavior information of the resident.
Specifically, the person identification unit 111 of the behavior recognition unit 11 identifies the resident (obtains person identification information) by recognizing the resident's face image captured by the sensor 21 (camera) of the moving body 2. The resident may be identified not only by face authentication but also by biometric authentication such as iris authentication using the face image.
The human behavior detection unit 112 of the behavior recognition unit 11 estimates the behavior of each resident identified by the person identification unit 111, based on the sensor information from the sensor 21 of the moving body 2 and from the in-home sensors 3, and acquires behavior information for that resident. In detail, human motions are recognized from changes in the position of body parts such as the resident's head, hands, arms, legs, and fingers, and from changes in body orientation and posture, and behaviors such as "getting up", "lying down", and "sitting" are estimated from combinations of the recognized motions. The human behavior detection unit 112 then identifies the room (bedroom, kitchen) and the time of day from the acquisition position and acquisition time of the sensor information, and estimates behaviors such as "waking up", "going to bed", "sitting at the dining table", and "relaxing" as the behavior information.
Furthermore, the human behavior detection unit 112 acquires environment information in the home based on the sensor information from the sensor 21 of the moving body 2 and from the in-home sensors 3, so that the environment information can be referred to when analyzing the resident's behavior. Specifically, the operating states of the washing machine 4 and the air conditioner 6 are acquired as environment information.
In addition, the human behavior detection unit 112 acquires the resident's activity amount, walking speed, moving speed, rising speed, and fall state from the recognized motions, and uses them as behavior information.
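As a rough illustration of this two-stage estimation (motion primitives first, then a context-dependent behavior label), the rule table below is an assumption made only for this sketch; the embodiment does not fix a concrete rule set.

```python
# Hedged sketch: map recognized motion primitives plus room/time context to a behavior label.
from datetime import datetime
from typing import Sequence


def label_behavior(motions: Sequence[str], room: str, when: datetime) -> str:
    """Illustrative rules only; real rules would be tuned per household."""
    is_night = when.hour >= 21 or when.hour < 5
    if "get_up" in motions and room == "bedroom" and not is_night:
        return "wake_up"
    if "lie_down" in motions and room == "bedroom" and is_night:
        return "go_to_bed"
    if "sit" in motions and room == "living_room":
        return "sit_at_table" if 6 <= when.hour <= 9 else "relax"
    return "unknown"


print(label_behavior(["get_up"], "bedroom", datetime(2022, 4, 7, 7, 0)))  # -> "wake_up"
```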
The time-series behavior information storage unit 12 is a storage unit that stores, in chronological order, the resident's behavior information recognized by the behavior recognition unit 11, for each resident (person identification information) identified by the person identification unit 111. The behavior recognition unit 11 periodically obtains the resident's behavior information, and the recognition time and the behavior information are successively recorded in the time-series behavior information storage unit 12. Alternatively, the recognition time and the behavior information may be recorded in the time-series behavior information storage unit 12 only when the behavior information obtained by the behavior recognition unit 11 changes.
In the following description, the behavior information held in the time-series behavior information storage unit 12 is called time-series behavior information, and the most recent behavior information recognized by the behavior recognition unit 11 is called current behavior information.
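A minimal sketch of the "record only on change" option mentioned above; the tuple representation of a record is an assumption of this sketch.

```python
# Hedged sketch: keep a record only when the recognized behavior label changes.
from typing import List, Tuple

Record = Tuple[float, str]   # (recognition time, behavior label)


def append_on_change(history: List[Record], now: float, label: str) -> bool:
    """Append (now, label) only if the label differs from the last stored one."""
    if history and history[-1][1] == label:
        return False                  # unchanged behavior: nothing recorded
    history.append((now, label))      # changed (or first) behavior: record time and label
    return True


log: List[Record] = []
for t, lab in [(0.0, "relax"), (60.0, "relax"), (120.0, "sit_at_table")]:
    append_on_change(log, t, lab)
print(log)   # -> [(0.0, 'relax'), (120.0, 'sit_at_table')]
```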
The behavior estimation unit 131 of the moving body instruction unit 13 compares the time-series behavior information with the current behavior information for each resident and analyzes the resident's state. The behavior estimation unit 131 also adds, to this analysis, the environment information acquired from the sensor information of the sensor 21 of the moving body 2 and of the in-home sensors 3.
The instruction information generation unit 132 generates, according to the analysis result of the behavior estimation unit 131, instruction information for an action of the moving body 2 that encourages the resident to change behavior, or for a support action.
The instruction information notification unit 133 notifies the moving body 2 of the instruction information generated by the instruction information generation unit 132 via the home LAN 7.
The watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b, via the Internet 8, of information related to the analysis results of the behavior estimation unit 131 or of information on the resident obtained by instructing the moving body 2.
The detailed operation of the moving body instruction unit 13 is described later with reference to the flowcharts.
FIG. 2 is a configuration diagram of the hardware on which the action support device 1 of FIG. 1 operates.
The action support device 1 is configured as a computer 900 having a CPU 901, a RAM 902, a ROM 903, an HDD 904, a communication I/F 905, an input/output I/F 906, and a media I/F 907.
The communication I/F 905 is connected to an external communication device 915. The input/output I/F 906 is connected to an input/output device 916. The media I/F 907 reads and writes data from and to a recording medium 917. The CPU 901 controls each processing unit by executing a program (also called an application, or an app for short) loaded into the RAM 902. The program can be distributed via a communication line, or recorded on a recording medium 917 such as a CD-ROM and distributed in that form.
In the action support device 1, the CPU 901 implements the functions of the behavior recognition unit 11, the moving body instruction unit 13, and the watching information notification unit 14 by executing the program, and the time-series behavior information storage unit 12 is configured on the HDD 904.
The operation of the action support device 1 of the embodiment is described in detail below.
<<Example 1>>
Since the time-series behavior information storage unit 12 stores, as time-series behavior information for each resident, behavior information such as activity amount, walking speed, moving speed, rising speed, and falls, the behavior estimation unit 131 can analyze a lack of exercise or an increase in exercise by comparing the time-series behavior information with the current behavior information, for example that the time spent has increased or decreased by a certain percentage compared with a certain number of months ago, or that the number of occurrences has increased or decreased by a certain percentage compared with a certain number of months ago.
In this embodiment, the action support device 1 determines that there is a lack of exercise or an increase in exercise when the amount of change in the behavior information is equal to or greater than a threshold. According to the analysis result on the resident's lack of exercise or increase in exercise, the action support device 1 generates, as instruction information, an action in which the moving body 2 calls out to the resident, for example "You should do some exercises", as an action that encourages the resident to change behavior in order to resolve the lack of exercise, and notifies the moving body 2 of the instruction information. Alternatively, the moving body 2 may be notified of instruction information such that the moving body 2 itself performs an exercise-related motion to encourage the resident to change behavior.
At this time, the action support device 1 notifies the resident terminal 9a or the watcher terminal 9b, as watching information, of the change in the resident's behavior information and of the fact that a behavior change is needed. Furthermore, the result of the behavior change brought about by the motion of the moving body 2 may be notified to the resident terminal 9a or the watcher terminal 9b as watching information.
FIG. 3 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment.
In step S31, the behavior recognition unit 11 acquires sensor information from the moving body 2 and the in-home sensors 3.
In step S32, the person identification unit 111 recognizes the face image in the sensor information, obtains the resident's person identification information, and identifies the resident.
In step S33, the human behavior detection unit 112 estimates, for each resident identified by the person identification information, the resident's behavior based on the sensor information from the moving body 2 and the in-home sensors 3, and acquires the behavior information (current behavior information).
In step S34, the time-series behavior information storage unit 12 accumulates the behavior information acquired in step S33 as time-series behavior information.
In step S35, the behavior estimation unit 131 compares the time-series behavior information with the current behavior information for each resident and analyzes the amount of change in the resident's behavior information.
In step S36, the instruction information generation unit 132 determines whether the amount of change between the time-series behavior information and the current behavior information is equal to or greater than the threshold. If it is (Yes in S36), the analysis result is that the resident lacks exercise or has increased exercise, and the process proceeds to step S37. If the amount of change in the behavior information is smaller than the threshold (No in S36), the process ends.
In step S37, the instruction information generation unit 132 sets, as the instruction information for the moving body 2, a calling-out action that encourages a behavior change to resolve the lack of exercise (for example, calling out "You should do some exercises"), according to the analysis result on the resident's lack of exercise or increase in exercise.
In step S38, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S37. At this time, the moving body 2 may also be notified of a movement instruction to the resident's position so that it moves to the vicinity of the resident.
In step S39, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b, as watching information, of the amount of change in the resident's behavior information and of the fact that a behavior change is needed, and the process ends.
According to the above processing of the action support device 1, the life support system prompts the resident to maintain a certain amount of exercise, so the resident's health can be maintained.
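A minimal sketch of the decision in steps S35 to S37 above, assuming a simple relative-change threshold over one activity metric; the concrete metric, comparison window, threshold value, and prompt strings are not fixed by the embodiment and are assumptions of this sketch.

```python
# Hedged sketch of the threshold check in steps S35-S37 (Fig. 3).
from statistics import mean
from typing import Optional, Sequence


def exercise_prompt(history: Sequence[float], current: float,
                    threshold: float = 0.3) -> Optional[str]:
    """Compare the current daily activity amount against its historical mean.

    history -- past activity amounts (time-series behavior information)
    current -- most recent activity amount (current behavior information)
    Returns a voice-prompt instruction for the moving body 2, or None.
    """
    if not history:
        return None
    baseline = mean(history)
    if baseline == 0:
        return None
    change = (current - baseline) / baseline
    if abs(change) < threshold:        # S36: change below threshold -> no instruction
        return None
    if change < 0:                     # activity dropped -> encourage exercise
        return "You should do some exercises."
    return "You have been quite active lately."   # activity increased


print(exercise_prompt([5000, 5200, 4800], 3000))  # -> "You should do some exercises."
```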
<<Example 2>>
Since time-series behavior information for each resident is stored in the time-series behavior information storage unit 12, the resident's life rhythm can be recognized (for example, every Sunday: getting up, then turning on the microwave in the kitchen, then sitting at the dining table in the living room and having breakfast).
In addition, signs of frailty can be detected by analyzing a decrease in the resident's walking speed or a decrease in the amount of physical activity from the time-series behavior information and the current behavior information.
In this embodiment, when the action support device 1 recognizes that the resident's estimated behavior involves movement, it moves the moving body 2 ahead of the resident to the destination (for example, the dining table) so that the resident can be supported there. In particular, when a sign of frailty is detected in the resident, the moving body 2 is instructed to perform, after moving, an action of photographing the resident so that the image includes the resident's mouth, so that the resident's condition can be checked. Besides the mouth, body movements in which signs of frailty appear may also be photographed.
Furthermore, the action support device 1 confirms that the resident has arrived at the destination to which the moving body 2 went ahead, and instructs the moving body 2 to perform a predetermined action (such as a non-arrival notification) if the resident has not arrived after a predetermined time has elapsed.
The arrival of the resident is detected by the in-home sensor 3 (a human presence sensor) or by image recognition using the sensor 21 (camera) of the moving body 2.
When the moving body 2 detects the arrival of the resident, it may further perform a calling-out action to confirm with the resident, for example asking "Have you been out until now?" or "Were you in another room?".
The action support device 1 also notifies the resident terminal 9a or the watcher terminal 9b, as watching information, of the result of confirming the resident's movement and of the photographed image of the resident.
FIG. 4 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 4 are the same as in FIG. 3, so their description is omitted here.
In step S41, the instruction information generation unit 132 determines, by analyzing the time-series behavior information and the current behavior information, whether the behavior estimated from the resident's life rhythm will involve movement; if it does not (No in S41), the process ends.
In step S42, the instruction information generation unit 132 analyzes decreases in the resident's walking speed and physical activity in the time-series behavior information and the current behavior information, and determines whether the resident shows signs of frailty.
If there are signs of frailty (Yes in S42), in step S43 the instruction information generation unit 132 sets, as instruction information for the moving body, a photographing action that includes the resident's mouth after the movement.
In step S44, the instruction information generation unit 132 estimates the destination from the time-series behavior information and the current behavior information. For example, if, in the life rhythm recognized from the time-series behavior information, it is estimated from the current behavior information (the microwave being turned on in the kitchen) that the resident will next sit at the dining table in the living room and have breakfast, the living room is estimated as the destination.
In step S45, the instruction information generation unit 132 sets the movement to the estimated destination (the living room) in the instruction information for the moving body 2. This instruction information is notified to the moving body 2 in step S47, described later.
In step S46, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an action of confirming the resident's movement: having gone ahead to the destination, the moving body 2 confirms that the resident has arrived, and performs a predetermined action (such as a non-arrival notification) if the resident has not arrived after a predetermined time has elapsed.
In step S47, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in steps S43, S45, and S46.
In step S48, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b, as watching information, of the result of confirming the resident's movement and of the photographed image of the resident, and the process ends.
With the action support device 1 described above, the life support system moves ahead to the resident's estimated destination, and can therefore complement the in-home sensors 3 in grasping the resident's state. In particular, by photographing a resident with signs of frailty so that the image includes the mouth, it is possible to watch whether the resident aspirates. Moreover, by estimating the number of chewing motions and the chewing speed from the photographed images, it is possible to judge whether the muscles are weakening.
In the above, the action support device 1 instructs the moving body 2 to go ahead to the resident's destination, confirm the resident's movement, and photograph the resident including the mouth. In addition, the action support device 1 may detect the room temperature at the destination with the sensor 21 (temperature sensor, thermography) of the moving body 2 and control the air conditioner installed at the destination. The moving body 2 may also perform an action that prompts the resident who has moved there to operate the air conditioner.
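The following sketch combines the frailty check (S42) and the destination estimation from the life rhythm (S44) of Fig. 4; the transition table and the decline ratio are illustrative assumptions, not values given in the embodiment.

```python
# Hedged sketch of steps S42 and S44 (Fig. 4).
from statistics import mean
from typing import Dict, Optional, Sequence, Tuple

# Illustrative life-rhythm table: current behavior -> (next behavior, destination room).
LIFE_RHYTHM: Dict[str, Tuple[str, str]] = {
    "microwave_on_kitchen": ("breakfast_at_table", "living_room"),
    "wake_up":              ("microwave_on_kitchen", "kitchen"),
}


def frailty_sign(walk_speed_history: Sequence[float], walk_speed_now: float,
                 decline_ratio: float = 0.2) -> bool:
    """True if the current walking speed is clearly below its historical mean."""
    if not walk_speed_history:
        return False
    return walk_speed_now < (1.0 - decline_ratio) * mean(walk_speed_history)


def estimate_destination(current_behavior: str) -> Optional[str]:
    """Destination room predicted from the recognized life rhythm, if any."""
    nxt = LIFE_RHYTHM.get(current_behavior)
    return nxt[1] if nxt else None


if frailty_sign([1.1, 1.0, 1.05], 0.7):
    print("instruct moving body 2: photograph including the mouth after moving")
print(estimate_destination("microwave_on_kitchen"))  # -> "living_room"
```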
<<Example 3>>
Since time-series behavior information for each resident is stored in the time-series behavior information storage unit 12, changes in the resident's physical condition (for example, fever, reduced meal frequency, or not getting up) can be recognized from the current behavior information.
In this embodiment, when the action support device 1 determines that the resident is in poor physical condition, it instructs the moving body 2 to measure the resident's condition with the sensor 21 (for example, thermography or a camera) provided on the moving body 2.
The action support device 1 also instructs the moving body 2 to change the set temperature of the housing heater 24 of the moving body 2 according to the measured body temperature of the resident.
Furthermore, when the action support device 1 determines that the resident is in poor physical condition, it notifies the resident terminal 9a or the watcher terminal 9b of the resident's physical condition information as watching information.
FIG. 5 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 5 are the same as in FIG. 3, so their description is omitted here.
In step S51, the instruction information generation unit 132 determines, by analyzing the time-series behavior information and the current behavior information, whether the resident is in poor physical condition; if not (No in S51), the process ends.
If the resident is in poor physical condition (Yes in S51), in step S52 the instruction information generation unit 132 sets, as the instruction information, an action in which the moving body 2 measures the resident's condition with the sensor 21 (thermography, camera) mounted on the moving body 2. This allows the action support device 1 to learn the resident's body temperature and observe the resident's state.
In step S53, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S52.
In step S54, when the resident's body temperature is reported from the moving body 2, the instruction information generation unit 132 sets, in the instruction information for the moving body, an action of changing the set temperature of the housing heater 24 of the moving body 2 according to the resident's body temperature. For example, if the resident's body temperature is higher than normal, the surface temperature of the moving body 2 is raised.
In step S55, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S54.
In step S56, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b of the resident's physical condition information as watching information.
With the action support device 1 described above, the life support system can make the resident aware of poor physical condition (fever) without forcing the resident to act.
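A minimal sketch of the adjustment in step S54; the reference body temperature of 36.5 °C, the base surface temperature, and the scaling factor are assumptions made only for illustration.

```python
# Hedged sketch of step S54 (Fig. 5): adapt the housing heater to the measured body temperature.
def heater_set_temperature(body_temp_c: float,
                           normal_temp_c: float = 36.5,
                           base_surface_c: float = 30.0) -> float:
    """Return a surface set temperature for the housing heater 24.

    If the resident's body temperature is above normal, the surface temperature is raised
    so that the resident notices the difference when touching the housing.
    """
    fever_excess = max(0.0, body_temp_c - normal_temp_c)
    return base_surface_c + 2.0 * fever_excess   # illustrative scaling only


print(heater_set_temperature(38.0))  # -> 33.0
```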
<<Example 4>>
In the embodiment of the action support device 1 described with reference to FIG. 4, the moving body 2 goes ahead to the destination and provides life support by photographing the resident and confirming the movement; environment information of the destination may also be acquired. Specifically, when the resident's estimated behavior of preparing a meal is derived from the current behavior information on the basis of the life rhythm recognized from the time-series behavior information, the moving body 2 goes ahead to the kitchen at meal-preparation time and photographs the opening and closing of the door of the refrigerator installed in the kitchen, whose position information is recorded in the map information 25, or photographs the inside of the refrigerator, so that life support such as food management and automatic ordering can be provided.
The position information of the refrigerator installed in the kitchen is acquired by photographing, with the sensor 21 (camera), a QR code attached to the refrigerator that indicates the refrigerator's identification information. In detail, the photographed image of the QR code is analyzed to recognize the refrigerator's identification information, and the refrigerator's position information is calculated from the distance between the refrigerator and the moving body 2 detected by the sensor 21 (camera or distance sensor) and from the position information of the moving body 2. The refrigerator whose position information has been determined is recorded in the map information 25 of the moving body 2.
Not only the refrigerator but also home appliances such as a television, an air conditioner, and a microwave oven can have their installation positions in the home acquired in the same way. Instead of using a QR code, the resident may enter the installation position of each appliance into the map information 25 via the resident terminal 9a.
FIG. 6 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 6 are the same as in FIG. 3, so their description is omitted here.
In step S61, the instruction information generation unit 132 determines, by analyzing the time-series behavior information and the current behavior information, whether the behavior estimated from the resident's life rhythm is the resident moving to the kitchen; if it is not (No in S61), the process ends.
If the resident is moving to the kitchen (Yes in S61), in step S62 the instruction information generation unit 132 sets the movement to the kitchen in the instruction information for the moving body 2.
In step S63, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an action of photographing the opening and closing of the door of the refrigerator installed in the kitchen, or an action of photographing the inside of the refrigerator.
In step S64, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in steps S62 and S63.
With the action support device 1 described above, the life support system can provide the resident with life support such as food management and automatic ordering.
<<Example 5>>
In the action support device 1 described above, examples were given in which instruction information is notified to the moving body 2 based on the resident's behavior information. In this embodiment, a case is described in which instruction information is notified to the moving body 2 based on the resident's behavior information and the environment information in the home.
Specifically, when the action support device 1 acquires the operating state of the washing machine 4 as environment information and estimates that washing has finished, it derives the resident's estimated behavior from the time-series behavior information and the behavior information, determines whether the resident's estimated behavior can be changed to the behavior of hanging out the laundry, and, if it can, has the moving body 2 perform an action that encourages the resident to change behavior and hang out the laundry.
FIG. 7 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S34 in FIG. 7 are the same as in FIG. 3, so their description is omitted here.
In step S71, the human behavior detection unit 112 estimates, for each resident identified by the person identification information, the operating state of the washing machine 4 installed in the home based on the sensor information from the moving body 2 and the in-home sensors 3, and acquires environment information (current environment information) such as the end of washing. For example, the operating sound and notification sound of the washing machine 4 are recognized from the environmental sounds in the home detected by the in-home sensor 3 (microphone), and the washing state and the end of washing are estimated.
In step S72, the behavior estimation unit 131 compares the time-series behavior information for each resident with the current behavior information and environment information, and analyzes the resident's state.
In step S73, the instruction information generation unit 132 determines whether the current environment information indicates the end of washing; if it does not (No in S73), the process ends.
If washing has finished (Yes in S73), in step S74 the instruction information generation unit 132 determines whether the resident's future estimated behavior, analyzed from the time-series behavior information and the current behavior information, is the behavior of hanging out the laundry; if it is (Yes in S74), the process ends.
In step S75, the instruction information generation unit 132 determines whether the resident's current behavior can be changed to the behavior of hanging out the laundry; if it cannot (No in S75), the process ends. In other words, the priorities of the current behavior and of the laundry-hanging behavior are compared, and if the current behavior has the higher priority, the process ends. For example, if the current behavior is a relaxed state, it is judged that continuing the relaxed state has the higher priority, and the process ends. Whether to give priority to the relaxed state may be decided based on the resident's past behavior information stored in the time-series behavior information storage unit 12; for example, if the resident has not been in a relaxed state recently, the relaxed state may be given priority.
In step S76, the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an action that encourages the resident to change behavior and hang out the laundry. Specifically, the moving body 2 outputs the message "Washing is finished. Please hang out the laundry."
In step S77, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S76.
With the action support device 1 described above, the life support system can prevent the resident from forgetting to take the laundry out of the washing machine 4.
The action support device 1 may also be applied to a clothes dryer to prevent the resident from forgetting to take the dried clothes out of the dryer when drying is finished.
The action support device 1 may also be applied to a microwave oven to prevent the resident from forgetting to take the heated food out of the microwave oven when heating is finished.
In the action support device 1, the operations of home appliances such as the washing machine, clothes dryer, and microwave oven are referred to as appliance tasks, and the end of washing for the washing machine, the end of drying for the clothes dryer, the end of heating for the microwave oven, and the like are referred to as the end of an appliance task.
<<Example 6>>
Next, an embodiment is described in which, when the environmental state in the home that would result from the resident's estimated behavior analyzed from the time-series behavior information and the current behavior information (for example, turning on the cooling) differs from the environmental state indicated by the current environment information, the moving body 2 is instructed to perform an action that prompts the resident to carry out the estimated behavior.
Specifically, when the estimated behavior is a behavior related to operating the air conditioner 6 and the air conditioner 6 is not running, the moving body 2 encourages the resident to change behavior and operate the air conditioner 6.
FIG. 8 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S34 in FIG. 8 are the same as in FIG. 3, so their description is omitted here.
In step S81, the human behavior detection unit 112 estimates, for each resident identified by the person identification information, the operating state of the air conditioner 6 installed in the home based on the sensor information from the moving body 2 and the in-home sensors 3, and acquires environment information (current environment information) such as whether it is running. For example, the operating sound of the air conditioner 6 is recognized from the environmental sounds and room temperature detected by the in-home sensors 3 (microphone, temperature sensor), and the operating state of the cooling or heating operation is estimated.
In step S82, the behavior estimation unit 131 compares the time-series behavior information for each resident with the current behavior information and environment information, and analyzes the resident's state.
In step S83, when the resident's future estimated behavior, analyzed from the time-series behavior information and the current behavior information, is a behavior related to the operation of the air conditioner 6, the instruction information generation unit 132 determines whether the estimated behavior corresponds to the current operating state of the air conditioner 6 indicated by the current environment information; if it does (Yes in S83), the process ends. Specifically, if the estimated behavior is turning on the cooling and the air conditioner 6 is already in cooling operation, the process ends.
In step S84, the instruction information generation unit 132 determines whether the resident's current behavior information indicates a relaxed state; if it does (Yes in S84), the process ends. That is, if the resident's current behavior information indicates a relaxed state, the moving body 2 is not instructed to perform the action of step S85 that encourages the behavior change of operating the air conditioner 6.
If the resident is not in a relaxed state (No in S84), in step S85 the instruction information generation unit 132 sets, in the instruction information for the moving body 2, an action that prompts the resident to operate the air conditioner. Specifically, the moving body 2 outputs the message "Please operate the air conditioner."
In step S86, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S85.
With the action support device 1 described above, the resident can be prevented from forgetting to turn on the air conditioner 6.
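A minimal sketch of the decision in steps S83 to S85 (Fig. 8); the behavior and state labels are illustrative assumptions.

```python
# Hedged sketch of steps S83-S85 (Fig. 8).
from typing import Optional


def aircon_prompt(estimated_behavior: str, aircon_state: str,
                  current_behavior: str) -> Optional[str]:
    """Return a prompt for the moving body 2, or None if no prompt is needed."""
    if estimated_behavior not in ("turn_on_cooling", "turn_on_heating"):
        return None
    expected_state = "cooling" if estimated_behavior == "turn_on_cooling" else "heating"
    if aircon_state == expected_state:      # S83: behavior and appliance state already match
        return None
    if current_behavior == "relax":         # S84: do not disturb a relaxed resident
        return None
    return "Please operate the air conditioner."   # S85


print(aircon_prompt("turn_on_cooling", "off", "sit_at_table"))
# -> "Please operate the air conditioner."
```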
<<Example 7>>
In this embodiment, the action support device 1 compares the estimated behavior derived from the resident's time-series behavior information with the current behavior state and, in the case of a different or abnormal behavior, instructs the moving body 2 to photograph the resident's behavior with the sensor 21 of the moving body 2.
Specifically, the action support device 1 compares the estimated behavior of "sleeping", derived from the time-series behavior information of relaxing at night, moving to the bedroom, and then being in the bedroom, with the current behavior information of "going to the toilet", recognized from the resident getting up outside the usual sleeping hours or trying to get up from the bed; since the behaviors differ, it instructs the moving body 2 to photograph the resident's behavior with the sensor 21 (camera) of the moving body 2.
Furthermore, the action support device 1 compares the resident's behavior information photographed by instructing the moving body 2 with the resident's past behavior information and, when it judges that there is a certain change over time (decline or improvement), notifies the communication terminal of the resident or of the resident's watcher to that effect. A certain change over time is, for example, a change in posture motions or walking motions.
FIG. 9 is a flowchart of the processing periodically performed by the action support device 1 of this embodiment. Steps S31 to S35 in FIG. 9 are the same as in FIG. 3, so their description is omitted here.
In step S91, the instruction information generation unit 132 determines whether the resident's estimated behavior derived from the time-series behavior information differs from the current behavior information; if they are the same or equivalent behaviors (No in S91), the process ends. Specifically, the estimated behavior of "sleeping" is compared with the current behavior information of "going to the toilet".
In step S92, the instruction information generation unit 132 sets the action of photographing the resident in the instruction information for the moving body 2.
In step S93, the instruction information notification unit 133 notifies the moving body 2 of the instruction information set in step S92.
In step S94, the behavior recognition unit 11 acquires the sensor information captured by the sensor 21 (camera) of the moving body 2 as behavior images of the resident.
In step S95, the behavior estimation unit 131 compares the resident's behavior information acquired in step S94 with the resident's past behavior information.
In step S96, the instruction information generation unit 132 judges the resident's frailty risk from the comparison of the behavior information; if it judges that there is no certain change over time (decline or improvement) (No in S96), the process ends.
In step S97, the watching information notification unit 14 notifies the resident terminal 9a or the watcher terminal 9b of the resident's change over time as watching information.
With the action support device 1 described above, a certain change over time in the resident can be notified as watching information.
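A sketch of the comparison in steps S91 and S95 to S96 (Fig. 9); representing the photographed behavior by a single walking-speed measurement, and the change-ratio threshold, are assumptions made only to keep the example concrete.

```python
# Hedged sketch of steps S91 and S95-S96 (Fig. 9).
from statistics import mean
from typing import Optional, Sequence


def watch_over(estimated: str, current: str,
               past_walk_speeds: Sequence[float], new_walk_speed: float,
               change_ratio: float = 0.15) -> Optional[str]:
    """Return watching information text, or None when nothing needs to be reported."""
    if estimated == current:                 # S91: behavior matches the life rhythm
        return None
    # S92-S94 would photograph the resident here; assume the image yields a walking speed.
    if not past_walk_speeds:
        return None
    baseline = mean(past_walk_speeds)
    change = (new_walk_speed - baseline) / baseline
    if abs(change) < change_ratio:           # S96: no clear change over time
        return None
    trend = "decline" if change < 0 else "improvement"
    return f"Temporal change detected in walking motion: {trend}"   # S97 watching information


print(watch_over("sleeping", "going_to_toilet", [1.0, 1.1, 1.05], 0.8))
# -> "Temporal change detected in walking motion: decline"
```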
In the action support system described above, the moving body 2 provides life support for the resident based on the time-series behavior information and the current behavior information; life support can also be provided based on time-series environment information of the home.
For example, a robot cleaner serving as the moving body 2 accumulates cleaning route information, identifies, from the detection of obstacles during travel, the position of debris that could cause the resident to fall, and photographs the debris with its on-board sensor 21 (camera). The moving body 2 then calls out to the resident to alert them. The position information of the debris and the photographed image of the debris are also notified to the resident terminal 9a or the watcher terminal 9b as watching information.
In addition, a change in the degree of dirtiness of the home is obtained from the time-series information of the amount of dust sucked up by the robot cleaner serving as the moving body 2, and if the degree of dirtiness is increasing, it is determined that the resident shows signs of frailty. Likewise, a change in the degree of tidiness of the home is obtained from the time-series information of obstacle detection by the robot cleaner, and if the degree of tidiness is decreasing, it is determined that the resident shows signs of frailty. The degree of dirtiness and the degree of tidiness are notified to the resident terminal 9a or the watcher terminal 9b as watching information.
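A sketch of the trend check just described; the use of a simple least-squares slope over daily dust amounts, and the slope threshold, are assumptions of this sketch.

```python
# Hedged sketch: flag a possible frailty sign when the daily dust amount trends upward.
from typing import Sequence


def dirtiness_increasing(daily_dust_g: Sequence[float], min_slope: float = 0.5) -> bool:
    """Least-squares slope of dust amount per day; a clearly positive slope suggests
    the home is getting dirtier, which the embodiment treats as a sign of frailty."""
    n = len(daily_dust_g)
    if n < 2:
        return False
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(daily_dust_g) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, daily_dust_g))
    den = sum((x - x_mean) ** 2 for x in xs)
    return (num / den) > min_slope


print(dirtiness_increasing([3.0, 4.0, 6.0, 7.5, 9.0]))  # -> True
```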
The present invention is not limited to the examples described above and includes various modifications. The above examples have been described in detail to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
1 action support device
2 moving body
3a in-home sensor
3b in-home sensor
4 washing machine
5 refrigerator
6 air conditioner
7 home LAN
8 Internet
9a resident terminal
9b watcher terminal
21 sensor
22 driving means
23 control unit
24 housing heater
25 map information
11 behavior recognition unit
111 person identification unit
112 human behavior detection unit
12 time-series behavior information storage unit
13 moving body instruction unit
131 behavior estimation unit
132 instruction information generation unit
133 instruction information notification unit
14 watching information notification unit

Claims (15)

  1.  宅内を自律移動する移動体と、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて認識した、居住者の行動情報又は宅内の環境情報に応じて、前記移動体を制御する行動支援装置と、
    を備えることを特徴とする生活支援システム。
    a moving body that moves autonomously in the home;
    an action support device that controls the mobile object according to behavior information of the resident or environment information inside the home, which is recognized based on sensor information acquired by a sensor of the mobile object or a sensor in the home;
    A life support system characterized by comprising:
  2.  請求項1に記載の生活支援システムにおいて、
     前記行動支援装置は、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて居住者の現在の行動情報を現在行動情報として認識すると共に、認識した行動情報を時系列行動情報として逐次記憶する行動認識部と、
     前記時系列行動情報に対して前記現在行動情報が変化したと分析し、行動情報の変化量が閾値以上の場合に、前記居住者の行動変容を促す動作を前記移動体に指示する移動体指示部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 1,
    The action support device is
    A behavior recognition unit that recognizes current behavior information of a resident as current behavior information based on sensor information acquired by a sensor of the moving body or a sensor in the home, and sequentially stores the recognized behavior information as time-series behavior information. When,
    A moving body instruction for analyzing that the current behavior information has changed with respect to the time-series behavior information, and instructing the moving body to perform an action that prompts behavior modification of the resident when the amount of change in the behavior information is equal to or greater than a threshold. Department and
    A life support system characterized by comprising:
  3.  請求項2に記載の生活支援システムにおいて、
     前記行動支援装置は、さらに、
     前記居住者の前記行動情報の変化に関する情報や前記行動変容に関する情報を、前記居住者又は前記居住者の見守り者の通信端末に通知する見守り情報通知部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 2,
    The action support device further includes:
    a watching information notification unit that notifies the communication terminal of the resident or the watcher of the resident of the information about the change in the behavior information of the resident and the information about the behavior change;
    A life support system characterized by comprising:
  4.  請求項1に記載の生活支援システムにおいて、
     前記行動支援装置は、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて居住者の現在の行動情報を現在行動情報として認識すると共に、認識した行動情報を時系列行動情報として逐次記憶する行動認識部と、
     前記時系列行動情報と現在行動情報に基づいて、居住者が移動すると推定される場所への移動を移動体に指示する移動体指示部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 1,
    The action support device is
    A behavior recognition unit that recognizes current behavior information of a resident as current behavior information based on sensor information acquired by a sensor of the moving body or a sensor in the home, and sequentially stores the recognized behavior information as time-series behavior information. When,
    a moving object instruction unit that instructs the moving object to move to a place where the resident is estimated to move based on the time-series behavior information and the current behavior information;
    A life support system characterized by comprising:
  5.  請求項4に記載の生活支援システムにおいて、
     前記移動体指示部は、
     前記時系列行動情報と現在行動情報に基づいて居住者のフレイル兆候を検知した場合に、居住者が移動すると推定される場所への移動を移動体に指示する
    ことを特徴とする生活支援システム。
    In the life support system according to claim 4,
    The moving body instruction unit
    A life support system, wherein when signs of frailty of a resident are detected based on the time-series behavior information and the current behavior information, a moving object is instructed to move to a place where the resident is estimated to move.
  6.  請求項4又は5に記載の生活支援システムにおいて、
     前記移動体指示部は、移動体の移動後に所定時間経過しても、前記居住者を移動先で認識できない場合には、前記居住者に対して所定の動作をするように前記移動体に指示する
    ことを特徴とする生活支援システム。
    In the life support system according to claim 4 or 5,
    The moving body instruction unit instructs the moving body to perform a predetermined action for the resident when the resident cannot be recognized at the destination even after a predetermined time has passed after the moving body moves. A life support system characterized by:
  7.  請求項1に記載の生活支援システムにおいて、
     前記行動支援装置は、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて居住者の現在の行動情報を現在行動情報として認識すると共に、認識した行動情報を時系列行動情報として逐次記憶する行動認識部と、
     前記時系列行動情報と現在行動情報に基づいて居住者の体調不良を判定した場合に、前記移動体が備える前記センサで前記居住者の状態を計測する動作を移動体に指示をする移動体指示部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 1,
    The action support device is
    A behavior recognition unit that recognizes current behavior information of a resident as current behavior information based on sensor information acquired by a sensor of the moving body or a sensor in the home, and sequentially stores the recognized behavior information as time-series behavior information. When,
    A moving body instruction for instructing the moving body to measure the condition of the resident with the sensor provided in the moving body when the physical condition of the resident is determined based on the time-series behavior information and the current behavior information. Department and
    A life support system characterized by comprising:
  8.  請求項7に記載の生活支援システムにおいて、
     前記移動体指示部は、さらに、
     前記居住者の体温に応じて、移動体の筐体ヒータの設定温度を変更するように移動体に指示を出す
    ことを特徴とする生活支援システム。
    In the life support system according to claim 7,
    The moving body instruction unit further
    A life support system, wherein an instruction is given to a moving body to change a set temperature of a housing heater of the moving body according to the body temperature of the resident.
  9.  請求項4に記載の生活支援システムにおいて、
     前記移動体指示部は、
     冷蔵庫の設置場所への移動動作を移動体に指示をすると共に、
     移動後に、前記移動体の前記センサにより前記居住者の前記冷蔵庫の開閉動作を撮影する動作、又は、前記冷蔵庫の庫内を撮影する動作を含む所定の行動を移動体に指示する
    ことを特徴とする生活支援システム。
    In the life support system according to claim 4,
    The moving body instruction unit
    In addition to instructing the moving object to move to the installation location of the refrigerator,
    After the movement, the moving body is instructed to perform a predetermined action including an action of photographing the opening/closing action of the refrigerator by the resident or an action of photographing the inside of the refrigerator by the sensor of the moving body. life support system.
  10.  請求項1に記載の生活支援システムにおいて、
     前記行動支援装置は、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて、居住者の現在の行動情報を認識すると共に、家電のタスク状態を宅内の環境情報として認識し、認識した居住者の行動情報を時系列行動情報として逐次記憶する行動認識部と、
     前記環境情報が家電のタスク終了を示す場合に、居住者の現在の行動情報が終了した家電のタスクに伴う行動に変更可能かを判定し、終了した家電のタスクに伴う行動に前記居住者の行動変容を促す動作を前記移動体に指示する移動体指示部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 1,
    The action support device is
    Based on the sensor information acquired by the sensor of the mobile object or the sensor in the house, the current behavior information of the resident is recognized, and the task state of the home appliance is recognized as environmental information in the house, and the recognized behavior of the resident an action recognition unit that sequentially stores information as time-series action information;
    When the environment information indicates that the task of the home appliance is completed, it is determined whether the current behavior information of the resident can be changed to the behavior associated with the completed home appliance task, and the behavior of the resident is changed to the behavior associated with the completed home appliance task. a moving object instruction unit that instructs the moving object to perform an action that encourages behavior modification;
    A life support system characterized by comprising:
  11.  請求項10に記載の生活支援システムにおいて、
     前記移動体指示部は、居住者の現在の行動情報がリラックス状態であれば、前記居住者の行動変容を促す動作を前記移動体に指示しない
    ことを特徴とする生活支援システム。
    In the life support system according to claim 10,
    The life support system, wherein the moving body instruction unit does not instruct the moving body to perform an action that encourages the behavior change of the resident if the current behavior information of the resident is in a relaxed state.
  12.  請求項1に記載の生活支援システムにおいて、
     前記行動支援装置は、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて、居住者の現在の行動情報を認識すると共に、空気調和機の動作状態を宅内の環境情報として認識し、認識した居住者の行動情報を時系列行動情報として逐次記憶する行動認識部と、
     前記時系列行動情報と現在行動情報に基づいて分析した今後の居住者の推定行動が空気調和機の動作に関する行動であった場合に、現在の空気調和機の動作状態を示す環境情報に応じて、前記空気調和機を操作する行動を促す動作を前記移動体に指示する移動体指示部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 1,
    The action support device is
    Based on the sensor information acquired by the sensor of the moving body or the sensor in the house, the resident who recognizes the current behavior information of the resident and recognizes the operating state of the air conditioner as environmental information in the house. an action recognition unit that sequentially stores the action information of as time-series action information;
    If the future estimated behavior of the resident analyzed based on the time-series behavior information and the current behavior information is behavior related to the operation of the air conditioner, depending on the environmental information indicating the current operating state of the air conditioner , a moving object instruction unit for instructing the moving object to perform an action prompting an action to operate the air conditioner;
    A life support system characterized by comprising:
  13.  請求項12に記載の生活支援システムにおいて、
     前記移動体指示部は、居住者の現在の行動情報がリラックス状態であれば、前記空気調和機を稼働する行動を促す動作を前記移動体に指示しない
    ことを特徴とする生活支援システム。
    In the life support system according to claim 12,
    The life support system, wherein the mobile body instruction unit does not instruct the mobile body to perform an action to activate the air conditioner if the current behavior information of the resident is in a relaxed state.
  14.  請求項1に記載の生活支援システムにおいて、
     前記行動支援装置は、
     前記移動体が有するセンサ又は宅内のセンサで取得したセンサ情報に基づいて居住者の現在の行動情報を現在行動情報として認識すると共に、認識した行動情報を時系列行動情報として逐次記憶する行動認識部と、
     前記時系列行動情報から推定される推定行動と前記現在行動情報とが異なると判定した場合に、前記居住者を撮影する動作を前記移動体に指示する移動体指示部と、
    を備えることを特徴とする生活支援システム。
    In the life support system according to claim 1,
    The action support device is
    A behavior recognition unit that recognizes current behavior information of a resident as current behavior information based on sensor information acquired by a sensor of the moving body or a sensor in the home, and sequentially stores the recognized behavior information as time-series behavior information. When,
    a moving object instruction unit that instructs the moving object to take an image of the resident when it is determined that the estimated behavior estimated from the time-series behavior information is different from the current behavior information;
    A life support system characterized by comprising:
  15.  The life support system according to claim 14, characterized in that the behavior support device further comprises:
     a watching information notification unit that compares the behavior information of the resident photographed by instructing the moving body with the resident's past behavior information and, when it determines that a certain change over time has occurred, notifies a communication terminal of the resident or of a person watching over the resident to that effect.
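
    For claim 15, the following sketch illustrates one possible reading of the change-over-time comparison; the walking-speed metric, the 20% threshold, and the print-based notification stand in for whatever behavior features, criteria, and communication channel an actual system would use.

```python
from statistics import mean
from typing import List

CHANGE_THRESHOLD = 0.2   # assumed: a 20% relative change counts as "a certain change over time"

def relative_change(past_values: List[float], recent_values: List[float]) -> float:
    """Relative difference of the recent average against the past average."""
    past_avg, recent_avg = mean(past_values), mean(recent_values)
    return abs(recent_avg - past_avg) / past_avg

def notify_watcher(terminal_id: str, message: str) -> None:
    """Stand-in for pushing a message to the resident's or watcher's communication terminal."""
    print(f"[to {terminal_id}] {message}")

def check_and_notify(past_walk_speed: List[float], recent_walk_speed: List[float],
                     terminal_id: str) -> None:
    change = relative_change(past_walk_speed, recent_walk_speed)
    if change >= CHANGE_THRESHOLD:
        notify_watcher(terminal_id,
                       f"walking speed changed by {change:.0%} compared with past records")

if __name__ == "__main__":
    check_and_notify(past_walk_speed=[1.1, 1.0, 1.05],
                     recent_walk_speed=[0.8, 0.75, 0.82],
                     terminal_id="watcher-phone-001")
```
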
PCT/JP2022/017241 2021-04-19 2022-04-07 Living assistance system WO2022224833A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-070481 2021-04-19
JP2021070481A JP2022165216A (en) 2021-04-19 2021-04-19 life support system

Publications (1)

Publication Number Publication Date
WO2022224833A1 true WO2022224833A1 (en) 2022-10-27

Family

ID=83722923

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/017241 WO2022224833A1 (en) 2021-04-19 2022-04-07 Living assistance system

Country Status (2)

Country Link
JP (1) JP2022165216A (en)
WO (1) WO2022224833A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018086201A (en) * 2016-11-29 2018-06-07 株式会社メニコン Health management type pet robot
JP2019079204A (en) * 2017-10-23 2019-05-23 佐藤 良治 Information input-output control system and method
JP2019197509A (en) * 2018-05-11 2019-11-14 フューブライト・コミュニケーションズ株式会社 Nursing-care robot, nursing-care robot control method and nursing-care robot control program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KUMAGAI KAZUMI, TOKUNAGA SEIKI, MIYAKE NORIHISA, TAMURA KAZUHIRO, MIZUUCHI IKUO, OTAKE-MATSUURA MIHOKO: "Analysis of Responses and Evaluation of Impressions by Older Adults to the Sensing/Voice-Calling Robot", NIHON ROBOTTO GAKKAISHI - JOURNAL OF THE ROBOTICS SOCIETY OF JAPAN, vol. 39, no. 9, 1 January 2021 (2021-01-01), JP , pages 866 - 869, XP055978505, ISSN: 0289-1824, DOI: 10.7210/jrsj.39.866 *

Also Published As

Publication number Publication date
JP2022165216A (en) 2022-10-31

Similar Documents

Publication Publication Date Title
US11711235B2 (en) Information providing method and information providing apparatus
US20210160326A1 (en) Utilizing context information of environment component regions for event/activity prediction
JP5548139B2 (en) How to interact with a robot that maintains a facility in real time or near real time
CN111886633B (en) Infant monitoring with intelligent audio cues based on analyzed video streams
CN111988424A (en) Intelligent sleep monitoring bed, system and method
US10921763B1 (en) Baby monitoring using a home monitoring system
CN113168756A (en) Abnormality detection system
WO2019199365A2 (en) Utilizing context information of environment component regions for event/activity prediction
Mori et al. One-room-type sensing system for recognition and accumulation of human behavior
WO2022224833A1 (en) Living assistance system
US20230333075A1 (en) Air quality sensors
KR102612827B1 (en) Controlling method for Artificial intelligence Moving robot
JP5473750B2 (en) Information processing apparatus, information processing method, and program
EP3992987A1 (en) System and method for continously sharing behavioral states of a creature
AU2020205651B2 (en) Carbon monoxide purge system for a property
CN114782704A (en) Method and device for determining state information, storage medium and electronic device
CN110794702A (en) Control method of household appliance, household appliance and computer readable storage medium
JP2020177447A (en) Housekeeping allotment system
JP6899358B2 (en) Home management system, home management program, and home management method
WO2023037415A1 (en) Behavior monitoring system and power saving method thereof
JP7465644B2 (en) Surveillance system and surveillance method
JP2021171359A (en) Server, dust collection movable body system, and dust collection planning method
JP2004041387A (en) Self-supporting ability determination device and method, residential facility, medium, and program
JP2023087246A (en) Mental state management system, housing, program, and mental state management method
JP2022059193A (en) Behavior estimation apparatus, behavior estimation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22791607

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22791607

Country of ref document: EP

Kind code of ref document: A1