WO2020003952A1 - Computer executable program, information processing device, and computer execution method - Google Patents

Computer executable program, information processing device, and computer execution method

Info

Publication number
WO2020003952A1
Authority
WO
WIPO (PCT)
Prior art keywords
resident
coefficient
time
activity
behavior
Application number
PCT/JP2019/022468
Other languages
French (fr)
Japanese (ja)
Inventor
寛 古川
武士 阪口
海里 姫野
恵美子 寄崎
遠山 修
藤原 浩一
Original Assignee
コニカミノルタ株式会社
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Priority to JP2020527343A priority Critical patent/JP7327397B2/en
Publication of WO2020003952A1 publication Critical patent/WO2020003952A1/en

Classifications

    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/22Ergometry; Measuring muscular strength or the force of a muscular blow
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/22Social work
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/04Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop

Definitions

  • the present disclosure relates to data processing, and more specifically, to data processing based on walking trajectories.
  • Patent Literature 1 discloses an "independent activity measurement system" for grasping the activity of a person, such as one who needs nursing care.
  • The system disclosed in Patent Literature 1 "detects the position of a measurement subject A, such as an elderly person in need of nursing care, on a bed 94a with a bed sensor 12h; based on the position detected by the bed sensor 12h, detects and stores, as a detection result, the state of the subject and the start time and duration of that state; detects use of equipment located at least a predetermined distance from the bed, for example a portable toilet 94b, with a portable toilet sensor 12i, storing the use start time and use duration as a detection result; and displays the detection result of the bed sensor 12h and the detection result of the portable toilet sensor 12i."
  • Patent Literature 2 describes "a monitoring system that can watch relative changes in the behavior and ecology of a resident in a manner close to the resident's daily life, perform relative evaluations according to the resident's abilities and environment, and present watching specifications matching the evaluation" (see [Issues] in [Summary]).
  • Nursing care certification is performed based on, for example, interviews with the target person and the caregiver and the opinions of the attending physician; it cannot be measured quantitatively, and determining the degree of need for nursing care may take time. There is therefore a need for a technique that enables quantitative measurement and shortens the determination time.
  • The present disclosure has been made in view of the above background, and an object in one aspect is to provide a technology that makes it easy to grasp a change in the behavioral state of a person who needs care. An object in another aspect is to provide a technique for deriving quantitative information that can be used for certification of a care level.
  • According to one aspect, a computer-executable program for determining a degree of care causes a computer to execute: a step of acquiring a plurality of trajectory data, acquired on different days, each representing a walking trajectory of a resident; a step of identifying a plurality of behavioral states of the resident based on the plurality of trajectory data; a step of calculating the time of each of the identified behavioral states; a step of calculating a product by multiplying the time of each behavioral state by a predetermined coefficient; and a step of calculating a daily activity amount of the resident by adding the products.
  • the program causes the computer to further execute a step of deriving a change in the activity of the resident by comparing each of the amounts of activity calculated for each of the plurality of days.
  • the plurality of behavioral states include at least two of walking, sitting, lying down, and moving in a wheelchair.
  • the plurality of behavioral states include walking, sitting, and lying down.
  • the magnitude relationship among the first coefficient for the walking time, the second coefficient for the sitting time, and the third coefficient for the lying time is: first coefficient> second coefficient> third coefficient.
  • the step of acquiring includes aggregating the trajectory data acquired in the room of the resident.
  • an information processing apparatus includes a memory and a processor coupled to the memory.
  • The processor is configured to acquire a plurality of trajectory data, acquired on different days, each representing a walking trajectory of a resident; identify a plurality of behavioral states of the resident based on the plurality of trajectory data; calculate the time of each of the identified behavioral states; calculate a product by multiplying the time of each behavioral state by a predetermined coefficient; and calculate a daily activity amount of the resident by adding the products.
  • the processor is further configured to derive a change in the occupant's activity by comparing each of the calculated amounts of activity for each of the plurality of days.
  • the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
  • the plurality of behavior states include walking, sitting, and lying down.
  • the magnitude relationship among the first coefficient for the walking time, the second coefficient for the sitting time, and the third coefficient for the lying time is: first coefficient> second coefficient> third coefficient.
  • acquiring includes aggregating the trajectory data in the resident's room.
  • According to another aspect, a computer-implemented method for determining a care level includes: acquiring a plurality of trajectory data, acquired on different days, each representing a walking trajectory of a resident; identifying a plurality of behavioral states of the resident based on the plurality of trajectory data; calculating the time of each of the identified behavioral states; calculating a product by multiplying the time of each behavioral state by a predetermined coefficient; and calculating a daily activity amount of the resident by adding the products.
  • the method further includes deriving a change in the resident's activity state by comparing the activity amounts calculated for each of the plurality of days.
  • the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
  • the plurality of behavioral states include walking, sitting, and lying down.
  • the magnitude relationship among the first coefficient for the walking time, the second coefficient for the sitting time, and the third coefficient for the lying time is: first coefficient> second coefficient> third coefficient.
  • the step of acquiring includes aggregating the trajectory data in the room of the resident.
  • FIG. 1 is a diagram illustrating an example of a configuration of a watching system.
  • FIG. 2 is a block diagram illustrating an outline of the configuration of the watching system.
  • FIG. 3 is a block diagram illustrating a hardware configuration of a computer system 300 functioning as a cloud server 150.
  • FIG. 4 is a diagram showing an example of the schematic device configuration of the watching system 100 using the sensor box 119.
  • FIG. 5 is a diagram illustrating one mode of data storage in a hard disk 5 included in the cloud server 150.
  • FIG. 6 is a diagram conceptually illustrating one mode of data storage in the hard disk 5 of the cloud server 150.
  • FIG. 7 is a flowchart illustrating a part of a process executed by CPU 1 of cloud server 150 according to an embodiment.
  • FIG. 8 is a diagram showing the transition of the resident's activity results and the indoor activity amount displayed on the monitor 8 according to an embodiment.
  • The system measures the movement path of a watching target (for example, a resident of a nursing care facility) and grasps the target's behavioral state by image analysis of image data sent from a camera arranged in the living room. The system then measures the target's walking time, sitting time (time sitting at the bed edge or in a chair), and lying time (time lying in bed). Since the exercise load varies with the target's posture and activity, a coefficient reflecting the exercise load is set in advance for each behavioral state. The system calculates a product by multiplying each measured time by the coefficient set for the corresponding behavioral state, and calculates the sum of the products as the activity amount.
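As an illustration, the activity-amount calculation described above can be sketched as follows. The coefficient values here are assumptions for illustration only; the patent fixes only their ordering, with walking weighted highest and lying lowest:

```python
# Illustrative coefficients (assumed values; the patent only requires
# lying < sitting < walking).
COEFFICIENTS = {"walking": 3.0, "sitting": 2.0, "lying": 1.0}

def activity_amount(times_by_state):
    """Sum of (time in state x coefficient) over all behavioral states."""
    return sum(COEFFICIENTS[state] * minutes
               for state, minutes in times_by_state.items())

# Example: 30 min walking, 120 min sitting, 480 min lying
day = {"walking": 30, "sitting": 120, "lying": 480}
print(activity_amount(day))  # 3*30 + 2*120 + 1*480 = 810.0
```

Because only the ordering of the coefficients is specified, any monotone weighting of the three states yields an activity amount comparable across days for the same resident.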
  • the system calculates the amount of activity of the same resident on different days, and detects a change in the amount of activity of the resident by comparing the amounts of activity.
  • FIG. 1 is a diagram illustrating an example of the configuration of the watching system 100.
  • the watching target is, for example, a resident in each living room provided in the living room area 180 of the facility.
  • living rooms 110 and 120 are provided in a living room area 180.
  • the living room 110 is assigned to the resident 111.
  • the living room 120 is assigned to the resident 121.
  • the number of living rooms included in the watching system 100 is two, but the number is not limited to this.
  • Network 190 may include both an intranet and the Internet.
  • the mobile terminal 143 carried by the caregiver 141 and the mobile terminal 144 carried by the caregiver 142 can be connected to the network 190 via the access point 140. Further, the sensor box 119, the management server 200, and the access point 140 can communicate with the cloud server 150 via the network 190.
  • Each of the living rooms 110 and 120 includes a closet 112, a bed 113, and a toilet 114 as facilities.
  • the door of the living room 110 is provided with a door sensor 118 that detects opening and closing of the door.
  • a toilet sensor 116 for detecting the opening and closing of the toilet 114 is installed on the door of the toilet 114.
  • the bed 113 is provided with an odor sensor 117 for detecting the odor of each of the residents 111 and 121.
  • Each resident 111, 121 is equipped with a vital sensor 290 for detecting vital information of the resident 111, 121.
  • the detected vital information includes the resident's body temperature, respiration, heart rate, and the like.
  • the residents 111 and 121 can operate the care call slave 115, respectively.
  • the sensor box 119 has a built-in sensor for detecting the behavior of an object in the living rooms 110 and 120.
  • a sensor is a Doppler sensor for detecting the movement of an object.
  • a camera is another example.
  • the sensor box 119 may include both a Doppler sensor and a camera as sensors.
  • FIG. 2 is a block diagram showing an outline of the configuration of the watching system 100.
  • the sensor box 119 includes a control device 101, a read only memory (ROM) 102, a random access memory (RAM) 103, a communication interface 104, a camera 105, a Doppler sensor 106, a wireless communication device 107, and a storage device 108.
  • the control device 101 controls the sensor box 119.
  • the control device 101 is composed of, for example, at least one integrated circuit.
  • The integrated circuit includes, for example, at least one CPU (Central Processing Unit), MPU (Micro Processing Unit), or other processor, at least one ASIC (Application Specific Integrated Circuit), at least one FPGA (Field Programmable Gate Array), or a combination thereof.
  • An antenna (not shown) and the like are connected to the communication interface 104.
  • the sensor box 119 exchanges data with an external communication device via the antenna.
  • External communication devices include, for example, management server 200, mobile terminals 143, 144 and other terminals, access point 140, cloud server 150, and other communication terminals.
  • the camera 105 is a near-infrared camera in one implementation.
  • the near-infrared camera includes an IR (Infrared) projector that emits near-infrared light.
  • In another implementation, camera 105 is a surveillance camera that receives only visible light.
  • a 3D sensor or a thermographic camera may be used as camera 105.
  • the sensor box 119 and the camera 105 may be configured integrally or may be configured separately.
  • the Doppler sensor 106 is, for example, a microwave Doppler sensor, which emits and receives radio waves to detect the behavior (movement) of objects in the living rooms 110 and 120. The biological information of the residents 111 and 121 in the living rooms 110 and 120 can thereby be detected.
  • the Doppler sensor 106 emits microwaves in the 24 GHz band toward the beds 113 of the rooms 110 and 120, and receives reflected waves reflected by the residents 111 and 121. The reflected waves are Doppler shifted by the actions of the residents 111 and 121.
  • the Doppler sensor 106 can detect the respiratory state and heart rate of the residents 111 and 121 from the reflected waves.
  • the wireless communication device 107 receives signals from the care call slave device 240, the door sensor 118, the toilet sensor 116, the odor sensor 117, and the vital sensor 290, and transmits the signals to the control device 101.
  • the care call slave unit 240 includes a care call button 241. When the button is operated, the care call slave unit 240 transmits a signal indicating the operation to the wireless communication device 107.
  • the door sensor 118, the toilet sensor 116, the odor sensor 117, and the vital sensor 290 transmit respective detection results to the wireless communication device 107.
  • the storage device 108 is, for example, a fixed storage device such as a flash memory or a hard disk, or a recording medium such as an external storage device.
  • the storage device 108 stores a program executed by the control device 101 and various data used for executing the program.
  • the various data may include behavior information of the residents 111 and 121. The details of the action information will be described later.
  • At least one of the above-mentioned programs and data may be stored in a storage device other than the storage device 108 (for example, a storage area of the control device 101 such as a cache memory, the ROM 102, the RAM 103, or an external device such as the management server 200 or the portable terminals 143 and 144).
  • the action information is, for example, information indicating that the residents 111 and 121 have performed a predetermined action.
  • The predetermined actions include four actions: "wake up," indicating that the resident 111, 121 has risen; "get out of bed," indicating that the resident has left the bedding; "fall," indicating that the resident has fallen from the bedding; and "fall down," indicating that the resident 111 or 121 has fallen over.
  • the control device 101 generates the behavior information of each resident 111, 121, associated with the respective room 110, 120, based on images captured by the camera 105 installed in each room 110, 120.
  • The control device 101 detects, for example, the heads of the residents 111 and 121 from the images and, based on temporal changes in the detected head sizes, detects "wake up," "get out of bed," "fall," and "fall down."
  • the storage device stores the area of the bed 113 in each living room 110, 120, a first threshold Th1, a second threshold Th2, and a third threshold Th3.
  • the first threshold Th1 distinguishes, within the area where the bed 113 is located, the head size of a resident in the lying posture from that in the sitting posture.
  • the second threshold value Th2 identifies whether or not the resident is in the standing posture, based on the size of the resident's head in the living rooms 110 and 120 excluding the area where the bed 113 is located.
  • the third threshold value Th3 identifies whether or not the resident is in the recumbent posture in the living rooms 110 and 120 excluding the area where the bed 113 is located, based on the size of the resident's head.
  • the control device 101 extracts a moving-object region from the target image as a region of the residents 111 and 121 by, for example, the background subtraction method or the frame difference method.
  • The control device 101 further extracts the head regions of the residents 111 and 121 from the extracted moving-object region by, for example, a circular or elliptical Hough transform, pattern matching against a prepared head model, or a neural network trained for head detection.
  • the control device 101 detects "wake up," "get out of bed," "fall," and "fall down" from the position and size of the extracted head.
  • For example, when the control device 101 detects that the position of the extracted head is within the area where the bed 113 is located and that the size of the extracted head has changed, relative to the first threshold Th1, from the lying-posture size to the sitting-posture size, it may determine that the action "wake up" has occurred.
  • Similarly, the control device 101 may determine "get out of bed" and "fall" based on whether the position of the extracted head has moved out of the area where the bed 113 is located and on changes in the head size evaluated against the second threshold Th2.
  • When the control device 101 detects that the position of the extracted head is in the living room 110, 120 outside the area where the bed 113 is located and that the size of the extracted head has changed, relative to the third threshold Th3, to the recumbent-posture size, it may determine that the action "fall down" has occurred.
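A minimal sketch of the threshold-based "wake up" determination described above, assuming a ceiling-mounted camera (a head appears larger the closer it is to the camera, so a sitting head reads larger than a lying one). The bed rectangle and the value of Th1 are hypothetical, not taken from the patent:

```python
# Hypothetical bed area and first threshold Th1; real values would be
# configured per room and camera and held in the storage device.
BED_AREA = (100, 50, 300, 200)  # x_min, y_min, x_max, y_max in pixels
TH1 = 40.0                      # head size separating lying from sitting

def in_bed_area(x, y):
    x_min, y_min, x_max, y_max = BED_AREA
    return x_min <= x <= x_max and y_min <= y <= y_max

def detect_wake_up(prev_head, curr_head):
    """Each head is (x, y, size). 'Wake up' is inferred when a head that
    stays inside the bed area grows from lying size (< Th1) to sitting
    size (>= Th1)."""
    px, py, p_size = prev_head
    cx, cy, c_size = curr_head
    return (in_bed_area(px, py) and in_bed_area(cx, cy)
            and p_size < TH1 <= c_size)
```

The "get out of bed" and "fall down" determinations would follow the same pattern, substituting the second and third thresholds and the position of the head relative to the bed area.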
  • the control device 101 of the sensor box 119 generates the behavior information of the residents 111 and 121.
  • An element other than the control device 101 (for example, the cloud server 150) may generate the behavior information of the residents 111 and 121 using the images of the living rooms 110 and 120.
  • the mobile terminals 143 and 144 include a control device 221, a ROM 222, a RAM 223, a communication interface 224, a display 226, a storage device 228, and an input device 229.
  • the mobile terminals 143 and 144 are realized as, for example, a smartphone, a tablet terminal, a wristwatch-type terminal, or other wearable devices.
  • the control device 221 controls the mobile terminals 143 and 144.
  • the control device 221 is configured by, for example, at least one integrated circuit.
  • the integrated circuit includes, for example, at least one CPU, at least one ASIC, at least one FPGA, or a combination thereof.
  • An antenna (not shown) and the like are connected to the communication interface 224.
  • the mobile terminals 143 and 144 exchange data with an external communication device via the antenna and the access point 140.
  • External communication devices include, for example, the sensor box 119, the management server 200, and the like.
  • the display 226 is realized by, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, or the like.
  • the input device 229 is realized by, for example, a touch sensor provided on the display 226. The touch sensor receives a touch operation on the mobile terminals 143 and 144, and outputs a signal corresponding to the touch operation to the control device 221.
  • the storage device 228 is realized by, for example, a flash memory, a hard disk or another fixed storage device, or a removable data recording medium.
  • FIG. 3 is a block diagram illustrating a hardware configuration of computer system 300 functioning as cloud server 150.
  • the computer system 300 includes, as main components, a CPU 1 that executes programs; a mouse 2 and a keyboard 3 that receive instructions input by a user of the computer system 300; a RAM 4 for volatilely storing data generated by execution of a program by the CPU 1 or data input via the mouse 2 or the keyboard 3; a hard disk 5 for nonvolatilely storing data; an optical disk drive 6; a communication interface (I/F) 7; and a monitor 8.
  • Each component is mutually connected by a data bus.
  • the optical disk drive 6 is loaded with a CD-ROM 9 and other optical disks.
  • the processing in the computer system 300 is realized by each hardware and software executed by the CPU 1.
  • Such software may be stored in the hard disk 5 in advance.
  • the software may be stored on the CD-ROM 9 or another recording medium and distributed as a computer program.
  • the software may be provided as a downloadable application program by an information provider connected to the Internet.
  • Such software is temporarily stored in the hard disk 5 after being read from the recording medium by the optical disk drive 6 or another reading device, or downloaded via the communication interface 7.
  • the software is read from the hard disk 5 by the CPU 1 and stored in the RAM 4 in the form of an executable program.
  • CPU 1 executes the program.
  • Each component of the computer system 300 shown in FIG. 3 is a general component. Therefore, it can be said that one of the essential parts of the technical idea according to the present disclosure is software stored in the RAM 4, the hard disk 5, the CD-ROM 9, or other recording media, or software downloadable via a network.
  • the storage medium may include a non-transitory, computer-readable data storage medium. Since the operation of each piece of hardware of computer system 300 is well known, detailed description will not be repeated.
  • the recording medium is not limited to a CD-ROM, FD (Flexible Disk), or hard disk; it may be any medium that fixedly carries the program, such as a magnetic tape, a cassette tape, an optical disc (MO (Magneto-Optical Disc) / MD (Mini Disc) / DVD (Digital Versatile Disc)), an IC (Integrated Circuit) card (including a memory card), an optical card, or a semiconductor memory such as a mask ROM, EPROM (Electronically Programmable Read-Only Memory), EEPROM (Electronically Erasable Programmable Read-Only Memory), or flash ROM.
  • the program here includes not only a program directly executable by the CPU but also a program in a source program format, a compressed program, an encrypted program, and the like.
  • FIG. 4 is a diagram illustrating an example of a schematic device configuration of the watching system 100 using the sensor box 119.
  • the watching system 100 is used for watching the residents 111 and 121 who are the monitoring target (monitoring target) and other residents. As shown in FIG. 4, a sensor box 119 is attached to the ceiling of the living room 110. Sensor boxes 119 are similarly attached to other rooms.
  • a range 410 represents a detection range of the sensor box 119.
  • the Doppler sensor detects a person's behavior that has occurred within the range 410.
  • when the sensor box 119 has a camera as a sensor, the camera captures images of the range 410.
  • the sensor box 119 is installed in, for example, a nursing care facility, a medical facility, or a home.
  • the sensor box 119 is attached to the ceiling, and the resident 111 and the bed 113 are imaged from the ceiling.
  • the place where the sensor box 119 is mounted is not limited to the ceiling, and may be mounted on the side wall of the living room 110.
  • the watching system 100 detects a danger occurring to the resident 111 based on a series of images (that is, videos) obtained from the camera 105.
  • the detectable danger includes a fall of the resident 111 and a state where the resident 111 is at a danger location (for example, a bed fence).
  • When the watching system 100 detects that the resident 111 is in danger, it notifies the caregivers 141 and 142 of that fact. As an example of the notification method, the watching system 100 notifies the mobile terminals 143 and 144 of the caregivers 141 and 142 of the danger to the resident 111. Upon receiving the notification, the mobile terminals 143 and 144 alert the caregivers 141 and 142 by a message, voice, vibration, or the like. The caregivers 141 and 142 can thus immediately recognize that the resident 111 is in danger and rush to the resident 111 quickly.
  • FIG. 4 illustrates an example in which the watching system 100 includes one sensor box 119, but the watching system 100 may include a plurality of sensor boxes 119.
  • FIG. 4 shows an example in which the watching system 100 includes a plurality of mobile terminals 143 and 144. However, the watching system 100 can be realized by one mobile terminal.
  • FIG. 5 is a diagram illustrating one mode of data storage in the hard disk 5 included in the cloud server 150.
  • the hard disk 5 holds the table 60.
  • the table 60 sequentially stores data transmitted from each sensor provided in each living room. More specifically, the table 60 includes a room ID 61, a date and time 62, an X coordinate value 63, and a Y coordinate value 64.
  • the room ID 61 identifies the room of the resident.
  • the date and time 62 identifies the date and time when the signal sent from the sensor was acquired.
  • the X coordinate value 63 indicates the point detected at the date and time, that is, the X coordinate value of the position of the resident.
  • the Y coordinate value 64 represents the point detected at the date and time, that is, the Y coordinate value of the position of the resident.
  • the coordinate axes that are the basis of the X coordinate value and the Y coordinate value are defined, for example, with reference to the end point of the living room (for example, one corner of the room).
  • the coordinate axis may be defined based on a certain point in the facility where each living room is provided.
  • the hard disk 5 holds the image data sent from the sensor box 119.
  • Image data is acquired at predetermined time intervals.
  • the CPU 1 can identify the state of the resident by performing image analysis using each image data. For example, the CPU 1 can extract a head from each image data and extract a time when the resident is lying down, a time when the resident is sitting on the bed 113, and a time when the resident is walking.
  • the CPU 1 can exclude a walking locus of a person other than the resident.
  • the CPU 1 may calculate a walking speed for each walking trajectory and exclude, from the image analysis targets, any trajectory whose speed is equal to or higher than a certain speed estimated to be the walking speed of a healthy person other than the resident.
  • a walking speed measured in advance or a walking speed measured at the time of image analysis can be used as a walking speed of a healthy person.
  • the CPU 1 can calculate the walking speed from the X and Y coordinate values of consecutive points and the time taken to move from one point to the next.
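The per-segment speed computation can be sketched as follows. Units are assumptions for illustration (coordinates in meters, timestamps in seconds), and the 1.4 m/s healthy-walking threshold is likewise an illustrative value, not one specified in the disclosure:

```python
import math

def walking_speeds(points):
    """Speed between each pair of consecutive (x, y, t) points, in m/s."""
    speeds = []
    for (x1, y1, t1), (x2, y2, t2) in zip(points, points[1:]):
        dist = math.hypot(x2 - x1, y2 - y1)  # straight-line distance
        speeds.append(dist / (t2 - t1))
    return speeds

def is_resident_trajectory(points, healthy_speed=1.4):
    """Trajectories faster than a healthy-adult walking speed could be
    excluded as belonging to a visitor or caregiver, not the resident."""
    return max(walking_speeds(points)) < healthy_speed
```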
  • FIG. 6 is a diagram conceptually illustrating an aspect of data storage in the hard disk 5 of the cloud server 150.
  • the hard disk 5 stores a table 600.
  • the table 600 holds daily data observed for each resident.
  • Table 600 includes resident ID 610, date 620, walking time 630, sitting time 640, lying time 650, and indoor activity 660.
  • the resident ID 610 identifies the resident who has been the target of observation.
  • the date 620 indicates the date when the movement was observed.
  • the walking time 630 indicates a time calculated as a time during which the resident is walking on that date.
  • the sitting time 640 indicates the time calculated as the time in which the resident is sitting at that date.
  • the lying time 650 indicates the time calculated as the time when the resident is lying on that date.
  • the indoor activity amount 660 represents the indoor activity amount of the resident.
  • in one aspect, the indoor activity amount 660 is calculated using the walking time 630, the sitting time 640, and the lying time 650.
  • the CPU 1 multiplies each of the walking time 630, the sitting time 640, and the lying time 650 by a preset coefficient, and calculates the sum of the products as the indoor activity amount 660.
  • each coefficient may be set so as to satisfy, for example, the following magnitude relationship.
  • the coefficient of the lying time < the coefficient of the sitting time < the coefficient of the walking time
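The coefficient ordering above can be validated before the coefficients are used. The numeric values below are illustrative assumptions; the description fixes only the order, not the values.

```python
# Hypothetical coefficients satisfying the stated ordering
# (lying < sitting < walking); the values themselves are not from the text.
COEFFICIENTS = {"lying": 1.0, "sitting": 1.4, "walking": 2.0}

def ordering_holds(coeffs):
    """Check the magnitude relation lying < sitting < walking."""
    return coeffs["lying"] < coeffs["sitting"] < coeffs["walking"]

valid = ordering_holds(COEFFICIENTS)
```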
  • the states of the resident are not limited to the above three; more states, with coefficients corresponding to each state, may be used.
  • a coefficient may be set for a state of moving in a wheelchair, a state of standing, a state of exercising, and the like, and the coefficient may be used for calculating the indoor activity amount.
  • the exercise includes, for example, exercises suited to the exercise capacity of the facility's residents, such as heel raises and bending and stretching.
  • the magnitude relationship among the coefficients is as follows: lying time coefficient < sitting time coefficient < standing time coefficient < wheelchair moving time coefficient < walking time coefficient < exercise time coefficient.
[Operation outline of CPU 1] (1)
  • the CPU 1 acquires, from the hard disk 5, a plurality of trajectory data that represent the walking trajectory of a resident and were acquired on different days.
  • the CPU 1 identifies a plurality of behavioral states of the resident for a day based on a plurality of trajectory data.
  • the action state includes, for example, a walking state, a sitting state, a lying state, and the like.
  • the walking state may include independent walking, walking with assistive devices, and moving in a wheelchair.
  • the CPU 1 calculates the duration of each identified action state.
  • the CPU 1 calculates the product by multiplying the time of each action state by a predetermined coefficient, and calculates the daily activity amount of the resident by adding each product.
  • the CPU 1 derives a change in the activity state of the resident by comparing each indoor activity amount calculated for each of a plurality of days.
  • the user of the management server 200 may specify one or more past dates as the comparison target of the indoor activity amount.
  • the management server 200 transmits the specified date to the cloud server 150.
  • the CPU 1 reads out walking locus data and image data on the designated date from the hard disk 5.
  • the CPU 1 performs an image analysis using the image data and classifies the state of the person.
  • when the CPU 1 detects an object that moves together with the image of a person, it determines that the resident is moving with a walking assist device or in a wheelchair.
  • the plurality of behavior states include walking, sitting, and lying down.
  • the magnitude relationship among the first coefficient for the walking time, the second coefficient for the sitting time, and the third coefficient for the lying time is: first coefficient > second coefficient > third coefficient.
  • the first coefficient, the second coefficient, and the third coefficient can be defined, for example, in a relationship in which each behavior state and the consumed energy have a positive correlation.
  • Each coefficient can be changed according to the purpose of the judgment within a range that maintains the magnitude relation. For example, each coefficient can be set according to the age of each resident, the level of the degree of need for nursing care, the presence or absence of dementia, and the like.
  • FIG. 7 is a flowchart showing a part of a process executed by CPU 1 of cloud server 150 according to an embodiment.
  • CPU 1 acquires image data from hard disk 5.
  • CPU 1 identifies a plurality of behavioral states of the resident by day based on the plurality of image data. For example, the CPU 1 performs an image analysis using the image data and classifies the resident state. In a certain situation, the CPU 1 determines whether the resident is sitting, lying down, or walking based on the result of the image analysis. This determination is made based on the location, orientation, presence / absence of movement, etc. of the person's head obtained from the result of the image analysis.
  • CPU 1 calculates each time of each identified action state. For example, the CPU 1 calculates the time during which each state continues, totals the calculation results, and calculates the walking time 630, the sitting time 640, and the lying time 650.
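The totaling step above can be sketched as follows. Representing the observations as a chronologically ordered list of (timestamp, state) samples is an assumption; the description only says that the CPU 1 totals the time during which each state continues.

```python
from collections import defaultdict

def total_state_times(observations):
    """Sum the duration of each behavior state from a chronologically
    ordered list of (timestamp_seconds, state) samples. Each sample's
    state is assumed to persist until the next sample's timestamp.
    """
    totals = defaultdict(float)
    for (t0, state), (t1, _) in zip(observations, observations[1:]):
        totals[state] += t1 - t0
    return dict(totals)

# One hour lying, five minutes sitting, ten minutes walking:
samples = [(0, "lying"), (3600, "sitting"), (3900, "walking"), (4500, "lying")]
totals = total_state_times(samples)
# totals -> {"lying": 3600.0, "sitting": 300.0, "walking": 600.0}
```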
  • In step S740, the CPU 1 calculates a product by multiplying the time of each action state by a coefficient defined in advance for each action state.
  • this coefficient may be the same or different among residents.
  • the coefficient may be defined according to the resident's age, gender, degree of care required, and the like.
  • In step S750, the CPU 1 calculates the daily indoor activity amount of the resident by adding the products.
  • the calculation unit of the indoor activity amount is not limited to one day.
  • the CPU 1 may calculate, for example, one week's indoor activity, one month's indoor activity, or other unit activity according to the setting of the manager or the like.
  • In step S760, the CPU 1 compares the respective activity amounts calculated for each of the plurality of days.
  • the CPU 1 can calculate the indoor activity amounts for the day designated as the day on which the indoor activity amount is to be calculated, the day three months before, and the day six months before, and compare the changes.
  • In step S770, the CPU 1 derives a change in the activity state of the resident based on the result of the comparison. For example, for each resident, the CPU 1 compares the current indoor activity amount with a past indoor activity amount and detects whether the indoor activity amount has changed. When detecting a resident whose indoor activity amount has decreased, the CPU 1 can output a result representing the resident's indoor activity amount, together with the walking time, the sitting time, and the lying time, to the monitor 8 or to a report form.
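The comparison in steps S760–S770 can be sketched as below. The 80% threshold for flagging a decrease is a hypothetical choice for illustration; the description only says that a change in the indoor activity amount is detected.

```python
def activity_change(current, past, threshold=0.8):
    """Compare a current indoor activity amount with a past one and flag
    a decrease. `threshold` (flag when current falls below 80% of the
    past value) is an assumed parameter, not from the description.
    """
    decreased = current < past * threshold
    ratio = current / past if past else float("inf")
    return {"ratio": ratio, "decreased": decreased}

# e.g. comparing yesterday's amount with the day three months before
result = activity_change(current=950.0, past=1400.0)
```

A resident flagged here could then have the underlying walking, sitting, and lying times output to the monitor 8 or a report form, as the text describes.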
  • FIG. 8 is a diagram showing the transition of the resident's activity-state results and the indoor activity amount displayed on the monitor 8 according to an embodiment.
  • the management server 200 sends the instruction to the cloud server 150.
  • the CPU 1 of the cloud server 150 performs the processing illustrated in FIG. 7 and transmits the result of the processing to the management server 200.
  • the monitor 8 of the management server 200 displays the result of the processing.
  • the monitor 8 displays, for a certain resident (Mr. A), a bar graph 810 showing the walking time for the current month (for example, yesterday), the day three months before, and the day six months before, a bar graph 820 showing the sitting time, and a bar graph 830 showing the lying time. Further, the monitor 8 displays a graph 840 indicating the indoor activity amount calculated as described above using the walking time corresponding to the bar graph 810, the sitting time corresponding to the bar graph 820, and the lying time corresponding to the bar graph 830. In this way, care staff, care managers, and other users can objectively grasp changes in the resident's behavioral state and thus objectively determine the required level of nursing care, and acceptance of and transparency in the determination result can be enhanced.
  • the coefficients of the indoor activity amount can be set, for example, as follows: indoor activity amount = walking time × 2.0 + sitting time × 1.4 + lying time × 1.0. In another aspect, the time may be classified into more states.
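With the example coefficients given in the text (walking 2.0, sitting 1.4, lying 1.0), the computation can be sketched as follows. The choice of minutes as the common time unit is an assumption for illustration.

```python
def indoor_activity(walking, sitting, lying):
    """Indoor activity amount with the example coefficients from the
    text: walking 2.0, sitting 1.4, lying 1.0. All three times must
    share one unit (minutes assumed here)."""
    return walking * 2.0 + sitting * 1.4 + lying * 1.0

# e.g. 1 h walking, 5 h sitting, 10 h lying, expressed in minutes
amount = indoor_activity(60, 300, 600)
# 60 * 2.0 + 300 * 1.4 + 600 * 1.0 = 1140.0
```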
  • the past days to be compared are not limited to three months and six months ago. Comparisons may be made based on weekly performance, such as one week ago and two weeks ago. Alternatively, the comparison may be made based on monthly results, such as one month ago and two months ago.
  • the indoor activities of a plurality of residents may be compared.
  • the comparison makes it easier to detect a change in the state of each resident.
  • the image data of the watching target is sequentially acquired.
  • the system classifies the state of the watching target person into a walking state, a sitting state, a lying state, a state of moving in a wheelchair, a standing state, an exercising state, and the like, using the movement track data and the image data.
  • the system totals the times of the classified states, multiplies the time of each state by the coefficient set for that state to calculate a product, and calculates the sum of the products as the indoor activity amount. In this way, the various actions of a single resident can be represented by a single value: the indoor activity amount.
  • since the indoor activity amount is calculated using data sent from the sensor box 119 (for example, image data from the camera 105 and output data from the Doppler sensor 106), the subjectivity of a human judge is eliminated. The activity amount of the resident is thereby indicated objectively, so that, for example, the required level of care for the resident can be determined objectively and the determination result is more convincing.
  • This technology is applicable to information processing of data obtained in hospitals, nursing homes, care facilities, and other facilities.

Abstract

In this invention, processing executed by a CPU of a computer system comprises: a step (S710) of acquiring a plurality of image data sets from a hard disk; a step (S720) of identifying a plurality of activity states of a resident over one day on the basis of the plurality of image data sets; a step (S730) of calculating the respective durations for the activity states that have been identified; a step (S740) of multiplying the duration of each of the activity states by a coefficient pre-defined for each of the activity states to calculate a product; a step (S750) of adding each of the products thereby calculating the amount of indoor activity over one day by the resident; a step (S760) of comparing each of the amounts of activity calculated for each of a plurality of days; and a step (S770) of determining, on the basis of the comparison result, that a change has occurred in the activity states of the resident.

Description

Computer-executed program, information processing apparatus, and computer-executed method
 The present disclosure relates to data processing and, more specifically, to data processing based on walking trajectories.
 Regarding grasping the activity level of a person requiring nursing care, for example, Japanese Patent Application Laid-Open No. 2006-012057 (Patent Literature 1) discloses "an independent-activity measurement system that grasps the living conditions of an elderly person or the like requiring care and measures the degree of independent activity of that person." The system disclosed in Patent Literature 1 "detects, with a bed sensor 12h, the position on a bed 94a of a measurement subject A, such as an elderly person requiring care; detects the subject's state, the start time of that state, and its duration based on the detected position; and stores them as detection results. It also detects, with an equipment (portable toilet) sensor 12i, that equipment located at least a predetermined distance from the bed, such as a portable toilet 94b, is being used, and stores the use start time and use duration as detection results. It then displays the detection results of the bed sensor 12h and of the portable toilet sensor 12i." (see [Abstract]).
 Japanese Patent Application Laid-Open No. 2015-215711 (Patent Literature 2) discloses "a watching system that, in order to watch relative changes in a resident's behavior and ecology in a manner close to the resident's life, can relatively evaluate the progression of 'aging,' which varies greatly among individuals, according to the resident's abilities and environment, and present watching specifications that match the evaluation" (see [Problem] in [Abstract]).
Patent Literature 1: JP 2006-012057 A
Patent Literature 2: JP 2015-215711 A
 People who need watching or nursing care often stay in facilities for long periods, during which their behavioral state may change. There is therefore a need for a technology that makes it easy to grasp changes in behavioral state over time.
 In addition, nursing-care certification is performed based on, for example, interviews with the subject and caregivers and the opinions of the attending physician; quantitative measurement is not possible, and determining the required level of care may take time. There is therefore a need for a technique that enables quantitative measurement and does not require much time for determination.
 The present disclosure has been made in view of the above background. An object in one aspect is to provide a technology that makes it easy to grasp changes in the behavioral state of a person who needs care. An object in another aspect is to provide a technique for deriving quantitative information usable for certification of the care level.
 According to one embodiment, a computer-executable program for determining a care level is provided. The program causes a computer to execute: a step of acquiring a plurality of trajectory data that represent a resident's walking trajectory and were acquired on different days; a step of identifying a plurality of daily behavior states of the resident based on the plurality of trajectory data and calculating the duration of each identified behavior state; and a step of multiplying the duration of each behavior state by a predetermined coefficient to calculate a product and adding the products to calculate the resident's daily activity amount.
 According to one embodiment, the program causes the computer to further execute a step of deriving a change in the resident's activity state by comparing the activity amounts calculated for each of a plurality of days.
 According to one embodiment, the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
 According to one embodiment, the plurality of behavior states include walking, sitting, and lying down. The magnitude relationship among the first coefficient for walking time, the second coefficient for sitting time, and the third coefficient for lying time is: first coefficient > second coefficient > third coefficient.
 According to one embodiment, the acquiring step includes aggregating the trajectory data in the resident's room.
 According to another embodiment, an information processing apparatus is provided. The information processing apparatus includes a memory and a processor coupled to the memory. The processor is configured to: acquire a plurality of trajectory data that represent a resident's walking trajectory and were acquired on different days; identify a plurality of daily behavior states of the resident based on the plurality of trajectory data; calculate the duration of each identified behavior state; multiply the duration of each behavior state by a predetermined coefficient to calculate a product; and add the products to calculate the resident's daily activity amount.
 According to one embodiment, the processor is further configured to derive a change in the resident's activity state by comparing the activity amounts calculated for each of a plurality of days.
 In the information processing apparatus according to an embodiment, the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
 In the information processing apparatus according to an embodiment, the plurality of behavior states include walking, sitting, and lying down. The magnitude relationship among the first coefficient for walking time, the second coefficient for sitting time, and the third coefficient for lying time is: first coefficient > second coefficient > third coefficient.
 In the information processing apparatus according to an embodiment, the acquiring includes aggregating the trajectory data in the resident's room.
 According to another embodiment, a computer-implemented method for determining a care level is provided. The method includes: acquiring a plurality of trajectory data that represent a resident's walking trajectory and were acquired on different days; identifying a plurality of daily behavior states of the resident based on the plurality of trajectory data and calculating the duration of each identified behavior state; and multiplying the duration of each behavior state by a predetermined coefficient to calculate a product and adding the products to calculate the resident's daily activity amount.
 According to another embodiment, the method further includes deriving a change in the resident's activity state by comparing the activity amounts calculated for each of a plurality of days.
 In the method according to another embodiment, the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
 In the method according to another embodiment, the plurality of behavior states include walking, sitting, and lying down. The magnitude relationship among the first coefficient for walking time, the second coefficient for sitting time, and the third coefficient for lying time is: first coefficient > second coefficient > third coefficient.
 In the method according to another embodiment, the acquiring step includes aggregating the trajectory data in the resident's room.
 In one aspect, changes in the behavioral state of a person who needs care can be grasped easily. In another aspect, quantitative information usable for certification of the care level can be derived.
 The above and other objects, features, aspects, and advantages of the present invention will become apparent from the following detailed description of the present invention, understood in connection with the accompanying drawings.
FIG. 1 is a diagram illustrating an example of the configuration of the watching system 100.
FIG. 2 is a block diagram illustrating an outline of the configuration of the watching system 100.
FIG. 3 is a block diagram illustrating the hardware configuration of a computer system 300 functioning as the cloud server 150.
FIG. 4 is a diagram illustrating an example of the outline of the device configuration of the watching system 100 using the sensor box 119.
FIG. 5 is a diagram illustrating one mode of data storage in the hard disk 5 included in the cloud server 150.
FIG. 6 is a diagram conceptually illustrating one mode of data storage in the hard disk 5 of the cloud server 150.
FIG. 7 is a flowchart illustrating a part of a process executed by the CPU 1 of the cloud server 150 according to an embodiment.
FIG. 8 is a diagram showing the transition of the resident's activity-state results and the indoor activity amount displayed on the monitor 8 according to an embodiment.
 Hereinafter, embodiments of the technical concept according to the present disclosure will be described with reference to the drawings. In the following description, the same components are denoted by the same reference numerals; their names and functions are also the same. Detailed description thereof will therefore not be repeated.
 [Technical Concept]
 First, the technical concept according to the present disclosure will be described. In one aspect, the system can grasp the behavioral state of a watching target (for example, a resident of a nursing care facility) by measuring the target's movement path and performing image analysis on image data sent from a camera placed in the living room. The system then measures the target's walking time, sitting time (time sitting on the edge of the bed or in a chair), and lying time (time lying in bed). Since the amount of exercise differs depending on the target's posture and activity, coefficients reflecting the exercise load are set in advance. The system multiplies each measured time by the coefficient set for the corresponding behavior state to calculate a product, and calculates the sum of the products as the activity amount.
 Furthermore, the system calculates the activity amount of the same resident on different days, and detects changes in that resident's activity amount by comparing those amounts.
 [Configuration of the Watching System]
 FIG. 1 is a diagram illustrating an example of the configuration of the watching system 100. The watching target is, for example, a resident in each living room provided in the living room area 180 of a facility. In the watching system 100 in FIG. 1, living rooms 110 and 120 are provided in the living room area 180. The living room 110 is assigned to a resident 111, and the living room 120 is assigned to a resident 121. In the example of FIG. 1, the number of living rooms included in the watching system 100 is two, but the number is not limited to this.
 In the watching system 100, the sensor boxes 119 installed in the living rooms 110 and 120, the management server 200 installed in the management center 130, and the access point 140 are connected via a network 190. The network 190 may include both an intranet and the Internet.
 In the watching system 100, a mobile terminal 143 carried by a caregiver 141 and a mobile terminal 144 carried by a caregiver 142 can connect to the network 190 via the access point 140. Furthermore, the sensor box 119, the management server 200, and the access point 140 can communicate with the cloud server 150 via the network 190.
 The living rooms 110 and 120 each include, as equipment, a chest of drawers 112, a bed 113, and a toilet 114. A door sensor 118 that detects opening and closing of the door is installed on the door of the living room 110. A toilet sensor 116 that detects opening and closing of the toilet 114 is installed on the door of the toilet 114. An odor sensor 117 that detects the odor of each resident 111, 121 is installed on the bed 113. Each resident 111, 121 wears a vital sensor 290 that detects that resident's vital information. The detected vital information includes the resident's body temperature, respiration, heart rate, and the like. In the living rooms 110 and 120, the residents 111 and 121 can each operate a care call slave unit 115.
 The sensor box 119 has a built-in sensor for detecting the behavior of objects in the living rooms 110 and 120. One example of such a sensor is a Doppler sensor for detecting the movement of an object; another example is a camera. The sensor box 119 may include both a Doppler sensor and a camera as sensors.
 The components of the watching system 100 will be described with reference to FIG. 2. FIG. 2 is a block diagram showing an outline of the configuration of the watching system 100.
 [Sensor Box 119]
 The sensor box 119 includes a control device 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, a communication interface 104, a camera 105, a Doppler sensor 106, a wireless communication device 107, and a storage device 108.
 The control device 101 controls the sensor box 119. The control device 101 is composed of, for example, at least one integrated circuit. The integrated circuit is composed of, for example, at least one CPU (Central Processing Unit), an MPU (Micro Processing Unit) or other processor, at least one ASIC (Application Specific Integrated Circuit), at least one FPGA (Field Programmable Gate Array), or a combination of these.
 An antenna (not shown) and the like are connected to the communication interface 104. The sensor box 119 exchanges data with external communication devices via the antenna. The external communication devices include, for example, the management server 200, the mobile terminals 143 and 144 and other terminals, the access point 140, the cloud server 150, and other communication terminals.
 In one implementation, the camera 105 is a near-infrared camera. The near-infrared camera includes an IR (Infrared) projector that emits near-infrared light. By using a near-infrared camera, images showing the inside of the living rooms 110 and 120 can be captured even at night. In another implementation, the camera 105 is a surveillance camera that receives only visible light. In still other implementations, a 3D sensor or a thermographic camera may be used as the camera 105. The sensor box 119 and the camera 105 may be configured integrally or separately.
 The Doppler sensor 106 is, for example, a microwave Doppler sensor, which emits and receives radio waves to detect the behavior (movement) of objects in the living rooms 110 and 120. The biological information of the residents 111 and 121 of the living rooms 110 and 120 can thereby be detected. In one example, the Doppler sensor 106 emits 24 GHz-band microwaves toward the bed 113 in each living room 110, 120 and receives reflected waves reflected by the residents 111 and 121. The reflected waves are Doppler-shifted by the movements of the residents 111 and 121. The Doppler sensor 106 can detect the respiratory state and heart rate of the residents 111 and 121 from the reflected waves.
 The wireless communication device 107 receives signals from the care call slave unit 240, the door sensor 118, the toilet sensor 116, the odor sensor 117, and the vital sensor 290, and transmits those signals to the control device 101. For example, the care call slave unit 240 includes a care call button 241. When the button is operated, the care call slave unit 240 transmits a signal indicating that the operation has occurred to the wireless communication device 107. The door sensor 118, the toilet sensor 116, the odor sensor 117, and the vital sensor 290 each transmit their detection results to the wireless communication device 107.
 The storage device 108 is, for example, a fixed storage device such as a flash memory or a hard disk, or a recording medium such as an external storage device. The storage device 108 stores programs executed by the control device 101 and various data used to execute those programs. The various data may include behavior information of the residents 111 and 121. Details of the behavior information will be described later.
　上記のプログラムおよびデータのうち少なくとも一方は、制御装置101がアクセス可能な記憶装置であれば、記憶装置108以外の記憶装置(たとえば、制御装置101の記憶領域(たとえば、キャッシュメモリーなど)、ROM102、RAM103、外部機器(たとえば、管理サーバー200や携帯端末143,144等)に格納されていてもよい。 At least one of the above program and data may be stored in a storage device other than the storage device 108, as long as the control device 101 can access it (for example, a storage area of the control device 101 such as a cache memory, the ROM 102, the RAM 103, or an external device such as the management server 200 or the portable terminals 143 and 144).
 [行動情報]
 上記の行動情報について、説明する。行動情報は、たとえば入居者111,121が所定の行動を実行したことを表わす情報である。一例では、所定の行動は、入居者111,121が起きたことを表わす「起床」、入居者111,121が寝具から離れたことを表わす「離床」、入居者111,121が寝具から落ちたことを表わす「転落」、および、入居者111,121が倒れたことを表わす「転倒」の4つの行動を含む。
[Behavior information]
The above behavior information will now be described. The behavior information is, for example, information indicating that the resident 111 or 121 has performed a predetermined action. In one example, the predetermined actions include four actions: "waking up," indicating that the resident 111 or 121 has gotten up; "getting out of bed," indicating that the resident 111 or 121 has left the bedding; "falling from bed," indicating that the resident 111 or 121 has fallen from the bedding; and "falling down," indicating that the resident 111 or 121 has fallen over.
　一実施の形態では、制御装置101が、各居室110,120に設置されたカメラ105が撮像した画像に基づいて、各居室110,120に関連付けられた入居者111,121の各行動情報を生成する。制御装置101は、たとえば、上記画像から入居者111,121の頭部を検出し、この検出した入居者111,121の頭部における大きさの時間変化に基づいて、入居者111,121の「起床」、「離床」、「転倒」および「転落」を検出する。以下、行動情報の生成の一具体例を、より詳細に説明する。 In one embodiment, the control device 101 generates each piece of behavior information of the residents 111 and 121 associated with the living rooms 110 and 120, based on images captured by the cameras 105 installed in the living rooms 110 and 120. For example, the control device 101 detects the heads of the residents 111 and 121 from the images and, based on temporal changes in the size of the detected heads, detects "waking up," "getting out of bed," "falling down," and "falling from bed." A specific example of the generation of the behavior information will be described below in more detail.
　まず、記憶装置108に、居室110,120における各ベッド113の所在領域、第1閾値Th1、第2閾値Th2、および、第3閾値Th3が格納される。第1閾値Th1は、ベッド113の所在領域内において、横臥姿勢にあるときと座位姿勢にあるときとの間で入居者の頭部の大きさを識別する。第2閾値Th2は、ベッド113の所在領域を除く居室110,120内において、入居者の頭部の大きさに基づいて、当該入居者が立位姿勢にあるか否かを識別する。第3閾値Th3は、ベッド113の所在領域を除く居室110,120内において、入居者の頭部の大きさに基づいて、当該入居者が横臥姿勢にあるか否かを識別する。 First, the location area of each bed 113 in the living rooms 110 and 120, the first threshold Th1, the second threshold Th2, and the third threshold Th3 are stored in the storage device 108. The first threshold Th1 distinguishes, within the area where the bed 113 is located, the size of the resident's head in the lying posture from its size in the sitting posture. The second threshold Th2 is used to determine, from the size of the resident's head, whether the resident is in the standing posture in the living room 110 or 120 outside the area where the bed 113 is located. The third threshold Th3 is used to determine, from the size of the resident's head, whether the resident is in the lying posture in the living room 110 or 120 outside the area where the bed 113 is located.
　制御装置101は、対象画像から、例えば背景差分法やフレーム差分法によって、入居者111,121の人物の領域として、動体領域を抽出する。制御装置101は、さらに、当該抽出した動体領域から、例えば円形や楕円形のハフ変換によって、予め用意された頭部のモデルを用いたパターンマッチングによって、頭部検出用に学習したニューラルネットワークによって導出された閾値を用いて、入居者111,121の頭部領域を抽出する。制御装置101は、当該抽出された頭部の位置および大きさから、「起床」、「離床」、「転倒」および「転落」を検知する。 The control device 101 extracts a moving-object region from the target image as the region of the person (the resident 111 or 121) by, for example, the background subtraction method or the frame difference method. The control device 101 then extracts the head region of the resident 111 or 121 from the extracted moving-object region by, for example, a circular or elliptical Hough transform, by pattern matching using a prepared head model, or by using a threshold derived by a neural network trained for head detection. From the position and size of the extracted head, the control device 101 detects "waking up," "getting out of bed," "falling down," and "falling from bed."
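The frame difference step mentioned above can be sketched as follows. The frames and the threshold are toy values for illustration; the patent does not specify them.

```python
def frame_difference_mask(prev_frame, curr_frame, threshold):
    """Return a binary mask marking pixels whose intensity changed
    by more than `threshold` between two grayscale frames."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev_frame, curr_frame)
    ]

# Toy 4x4 grayscale frames: the bright region (the person) shifts upward.
prev_frame = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 90, 90],
    [10, 10, 90, 90],
]
curr_frame = [
    [10, 10, 10, 10],
    [10, 10, 90, 90],
    [10, 10, 90, 90],
    [10, 10, 10, 10],
]
mask = frame_difference_mask(prev_frame, curr_frame, threshold=30)
moving_pixels = sum(sum(row) for row in mask)  # pixels in the moving region
```

The head region would then be extracted from this moving-object mask, for example by the Hough transform or pattern matching described above.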
　制御装置101は、上記のように抽出された頭部の位置がベッド113の所在領域内にあり、かつ、上記のように抽出された頭部の大きさが第1閾値Th1を用いることによって横臥姿勢の大きさから座位姿勢の大きさへと変化したことを検出した場合に、行動「起床」が発生したことを決定してもよい。 When the control device 101 detects that the position of the head extracted as described above is within the area where the bed 113 is located, and that the size of the extracted head has changed, as determined using the first threshold Th1, from the size of the lying posture to the size of the sitting posture, the control device 101 may determine that the action "waking up" has occurred.
　制御装置101は、上記のように抽出された頭部の位置がベッド113の所在領域内からベッド113の所在領域外へ移動した場合において、上記のように抽出された頭部の大きさに対して第2閾値Th2を適用することにより、頭部がある大きさから立位姿勢の大きさへと変化したことを検出したときには、行動「離床」が発生したと判定してもよい。 When the position of the head extracted as described above has moved from inside the area where the bed 113 is located to outside that area, and the control device 101 detects, by applying the second threshold Th2 to the size of the extracted head, that the head has changed from a certain size to the size of the standing posture, the control device 101 may determine that the action "getting out of bed" has occurred.
　制御装置101は、上記のように抽出された頭部の位置がベッド113の所在領域内からベッド113の所在領域外へ移動した場合において、上記のように抽出された頭部の大きさに対して第3閾値Th3を適用することにより、頭部がある大きさから横臥姿勢の大きさへと変化したことを検出したときには、行動「転落」が発生したと判定してもよい。 When the position of the head extracted as described above has moved from inside the area where the bed 113 is located to outside that area, and the control device 101 detects, by applying the third threshold Th3 to the size of the extracted head, that the head has changed from a certain size to the size of the lying posture, the control device 101 may determine that the action "falling from bed" has occurred.
　制御装置101は、上記のように抽出された頭部の位置がベッド113の所在領域を除く居室110,120内に位置し、かつ、抽出された頭部の大きさが第3閾値Th3を用いることによって或る大きさから横臥姿勢の大きさへと変化したことを検出した場合には、行動「転倒」が発生したと決定してもよい。 When the control device 101 detects that the position of the head extracted as described above is located in the living room 110 or 120 outside the area where the bed 113 is located, and that the size of the extracted head has changed, as determined using the third threshold Th3, from a certain size to the size of the lying posture, the control device 101 may determine that the action "falling down" has occurred.
　以上説明されたように、一具体例では、センサーボックス119の制御装置101が、入居者111,121の各行動情報を生成する。なお、他の局面に従う見守りシステム100では、居室110,120内の画像を用いて、制御装置101以外の他の要素(例えば、クラウドサーバー150)が入居者111,121の行動情報を生成してもよい。 As described above, in one specific example, the control device 101 of the sensor box 119 generates the behavior information of the residents 111 and 121. In the watching system 100 according to another aspect, an element other than the control device 101 (for example, the cloud server 150) may generate the behavior information of the residents 111 and 121 using the images of the living rooms 110 and 120.
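The four determinations described above can be condensed into a small classifier. This is a sketch under assumed conditions: the threshold values, and the observation format of whether the head lies inside the bed area plus its apparent size as seen from the ceiling camera, are illustrative and not taken from the patent.

```python
def classify_behavior(prev, curr, th1=50, th2=70, th3=30):
    """Classify a behavior change from two successive head observations.

    Each observation is (on_bed, head_size): whether the head lies inside
    the bed area, and its apparent size seen from the ceiling camera
    (larger = higher, i.e. closer to the camera). th1-th3 play the roles
    of Th1-Th3 above; their numeric values here are made up.
    """
    prev_on_bed, prev_size = prev
    curr_on_bed, curr_size = curr
    if prev_on_bed and curr_on_bed and prev_size < th1 <= curr_size:
        return "waking up"           # lying -> sitting inside the bed area
    if prev_on_bed and not curr_on_bed and curr_size >= th2:
        return "getting out of bed"  # left the bed area, standing posture
    if prev_on_bed and not curr_on_bed and curr_size <= th3:
        return "falling from bed"    # left the bed area, lying posture
    if not prev_on_bed and not curr_on_bed and prev_size > th3 >= curr_size:
        return "falling down"        # collapsed to lying outside the bed area
    return None                      # no predetermined action detected
```

For example, a head seen on the bed growing past th1 is classified as waking up, while a head already outside the bed area shrinking to the lying-posture size is classified as falling down.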
 [携帯端末143,144]
 携帯端末143,144は、制御装置221と、ROM222と、RAM223と、通信インターフェイス224と、ディスプレイ226と、記憶装置228と、入力デバイス229とを含む。ある局面において、携帯端末143,144は、例えば、スマートフォン、タブレット端末、腕時計型端末その他のウェアラブル装置等として実現される。
[Mobile terminals 143, 144]
The mobile terminals 143 and 144 include a control device 221, a ROM 222, a RAM 223, a communication interface 224, a display 226, a storage device 228, and an input device 229. In one aspect, the mobile terminals 143 and 144 are realized as, for example, a smartphone, a tablet terminal, a wristwatch-type terminal, or other wearable devices.
 制御装置221は、携帯端末143,144を制御する。制御装置221は、たとえば、少なくとも1つの集積回路によって構成される。集積回路は、たとえば、少なくとも1つのCPU、少なくとも1つのASIC、少なくとも1つのFPGA、またはそれらの組み合わせなどによって構成される。 The control device 221 controls the mobile terminals 143 and 144. The control device 221 is configured by, for example, at least one integrated circuit. The integrated circuit includes, for example, at least one CPU, at least one ASIC, at least one FPGA, or a combination thereof.
　通信インターフェイス224には、アンテナ(図示しない)などが接続される。携帯端末143,144は、当該アンテナおよびアクセスポイント140を介して、外部の通信機器との間でデータをやり取りする。外部の通信機器は、たとえば、センサーボックス119、管理サーバー200などを含む。 An antenna (not shown) and the like are connected to the communication interface 224. The mobile terminals 143 and 144 exchange data with external communication devices via the antenna and the access point 140. The external communication devices include, for example, the sensor box 119 and the management server 200.
 ディスプレイ226は、たとえば液晶ディスプレイ、有機EL(Electro Luminescence)ディスプレイ等によって実現される。入力デバイス229は、たとえばディスプレイ226に設けられたタッチセンサーによって実現される。当該タッチセンサーは、携帯端末143,144に対するタッチ操作を受け付け、当該タッチ操作に応じた信号を制御装置221へ出力する。 The display 226 is realized by, for example, a liquid crystal display, an organic EL (Electro Luminescence) display, or the like. The input device 229 is realized by, for example, a touch sensor provided on the display 226. The touch sensor receives a touch operation on the mobile terminals 143 and 144, and outputs a signal corresponding to the touch operation to the control device 221.
 記憶装置228は、たとえば、フラッシュメモリー、ハードディスクその他の固定記憶装置、あるいは、着脱可能なデータ記録媒体等により実現される。 The storage device 228 is realized by, for example, a flash memory, a hard disk or another fixed storage device, or a removable data recording medium.
 [クラウドサーバーの構成]
 図3を参照して、コンピュータの一種としてのクラウドサーバー150の構成について説明する。図3は、クラウドサーバー150として機能するコンピューターシステム300のハードウェア構成を表わすブロック図である。
[Cloud Server Configuration]
The configuration of the cloud server 150 as a type of computer will be described with reference to FIG. FIG. 3 is a block diagram illustrating a hardware configuration of computer system 300 functioning as cloud server 150.
　コンピューターシステム300は、主たる構成要素として、プログラムを実行するCPU1と、コンピューターシステム300の使用者による指示の入力を受けるマウス2およびキーボード3と、CPU1によるプログラムの実行により生成されたデータ、又はマウス2若しくはキーボード3を介して入力されたデータを揮発的に格納するRAM4と、データを不揮発的に格納するハードディスク5と、光ディスク駆動装置6と、通信インターフェイス(I/F)7と、モニター8とを含む。各構成要素は、相互にデータバスによって接続されている。光ディスク駆動装置6には、CD-ROM9その他の光ディスクが装着される。 The computer system 300 includes, as main components, a CPU 1 that executes programs; a mouse 2 and a keyboard 3 that receive input of instructions from a user of the computer system 300; a RAM 4 that volatilely stores data generated by the CPU 1 executing a program, or data input via the mouse 2 or the keyboard 3; a hard disk 5 that stores data in a nonvolatile manner; an optical disk drive 6; a communication interface (I/F) 7; and a monitor 8. The components are connected to one another by a data bus. A CD-ROM 9 or another optical disk is loaded into the optical disk drive 6.
 コンピューターシステム300における処理は、各ハードウェアおよびCPU1により実行されるソフトウェアによって実現される。このようなソフトウェアは、ハードディスク5に予め記憶されている場合がある。また、ソフトウェアは、CD-ROM9その他の記録媒体に格納されて、コンピュータープログラムとして流通している場合もある。あるいは、ソフトウェアは、いわゆるインターネットに接続されている情報提供事業者によってダウンロード可能なアプリケーションプログラムとして提供される場合もある。このようなソフトウェアは、光ディスク駆動装置6その他の読取装置によりその記録媒体から読み取られて、あるいは、通信インターフェイス7を介してダウンロードされた後、ハードディスク5に一旦格納される。そのソフトウェアは、CPU1によってハードディスク5から読み出され、RAM4に実行可能なプログラムの形式で格納される。CPU1は、そのプログラムを実行する。 The processing in the computer system 300 is realized by each hardware and software executed by the CPU 1. Such software may be stored in the hard disk 5 in advance. The software may be stored on the CD-ROM 9 or another recording medium and distributed as a computer program. Alternatively, the software may be provided as a downloadable application program by an information provider connected to the so-called Internet. Such software is temporarily stored in the hard disk 5 after being read from the recording medium by the optical disk drive 6 or another reading device, or downloaded via the communication interface 7. The software is read from the hard disk 5 by the CPU 1 and stored in the RAM 4 in the form of an executable program. CPU 1 executes the program.
　図3に示されるコンピューターシステム300を構成する各構成要素は、一般的なものである。したがって、本開示に係る技術思想の本質的な部分の一つは、RAM4、ハードディスク5、CD-ROM9その他の記録媒体に格納されたソフトウェア、あるいはネットワークを介してダウンロード可能なソフトウェアであるともいえる。記録媒体は、一時的でない、コンピューター読取可能なデータ記録媒体を含み得る。なお、コンピューターシステム300の各ハードウェアの動作は周知であるので、詳細な説明は繰り返さない。 Each component of the computer system 300 shown in FIG. 3 is a general component. Therefore, it can be said that one of the essential parts of the technical idea according to the present disclosure is the software stored in the RAM 4, the hard disk 5, the CD-ROM 9, or another recording medium, or the software downloadable via a network. The recording medium may include a non-transitory, computer-readable data recording medium. Since the operation of each piece of hardware of the computer system 300 is well known, the detailed description will not be repeated.
　なお、記録媒体としては、CD-ROM、FD(Flexible Disk)、ハードディスクに限られず、磁気テープ、カセットテープ、光ディスク(MO(Magnetic Optical Disc)/MD(Mini Disc)/DVD(Digital Versatile Disc))、IC(Integrated Circuit)カード(メモリーカードを含む)、光カード、マスクROM、EPROM(Electronically Programmable Read-Only Memory)、EEPROM(Electronically Erasable Programmable Read-Only Memory)、フラッシュROMなどの半導体メモリー等の固定的にプログラムを担持する媒体でもよい。 The recording medium is not limited to a CD-ROM, an FD (Flexible Disk), or a hard disk; it may be any medium that fixedly carries a program, such as a magnetic tape, a cassette tape, an optical disk (MO (Magnetic Optical Disc)/MD (Mini Disc)/DVD (Digital Versatile Disc)), an IC (Integrated Circuit) card (including a memory card), an optical card, or a semiconductor memory such as a mask ROM, an EPROM (Electronically Programmable Read-Only Memory), an EEPROM (Electronically Erasable Programmable Read-Only Memory), or a flash ROM.
　ここでいうプログラムとは、CPUにより直接実行可能なプログラムだけでなく、ソースプログラム形式のプログラム、圧縮処理されたプログラム、暗号化されたプログラム等を含む。 The program referred to here includes not only a program directly executable by the CPU but also a program in source form, a compressed program, an encrypted program, and the like.
 [見守りシステム100の装置構成]
 図4を参照して、見守りシステム100を用いた見守りについて説明する。図4は、センサーボックス119を用いた見守りシステム100の装置構成の概略の一例を示す図である。
[Device Configuration of Watching System 100]
With reference to FIG. 4, watching using the watching system 100 will be described. FIG. 4 is a diagram illustrating an example of a schematic device configuration of the watching system 100 using the sensor box 119.
 見守りシステム100は、見守り対象者(監視対象者)である入居者111,121その他の入居者を見守るために利用される。図4に示されるように、居室110の天井には、センサーボックス119が取り付けられている。他の居室にも同様にセンサーボックス119が取り付けられている。 The watching system 100 is used for watching the residents 111 and 121 who are the monitoring target (monitoring target) and other residents. As shown in FIG. 4, a sensor box 119 is attached to the ceiling of the living room 110. Sensor boxes 119 are similarly attached to other rooms.
 範囲410は、センサーボックス119による検出範囲を表わす。センサーボックス119が前述のドップラーセンサーを有する場合、当該ドップラーセンサーは、範囲410内で生じた人の挙動を検出する。センサーボックス119がセンサーとしてカメラを有する場合、当該カメラは、範囲410内の画像を撮影する。 A range 410 represents a detection range of the sensor box 119. When the sensor box 119 has the above-mentioned Doppler sensor, the Doppler sensor detects a person's behavior that has occurred within the range 410. When the sensor box 119 has a camera as a sensor, the camera captures an image in the range 410.
 センサーボックス119は、たとえば、介護施設、医療施設、宅内などに設置される。図4の例では、センサーボックス119は、天井に取り付けられており、入居者111およびベッド113を天井から撮影している。センサーボックス119の取り付け場所は天井に限られず、居室110の側壁に取り付けられてもよい。 The sensor box 119 is installed in, for example, a nursing care facility, a medical facility, or a home. In the example of FIG. 4, the sensor box 119 is attached to the ceiling, and the resident 111 and the bed 113 are imaged from the ceiling. The place where the sensor box 119 is mounted is not limited to the ceiling, and may be mounted on the side wall of the living room 110.
 見守りシステム100は、カメラ105から得られた一連の画像(すなわち、映像)に基づいて入居者111に生じている危険を検知する。一例として、検知可能な危険は、入居者111の転倒や、危険個所(たとえば、ベッドの柵など)に入居者111がいる状態などを含む。 The watching system 100 detects a danger occurring to the resident 111 based on a series of images (that is, videos) obtained from the camera 105. As an example, the detectable danger includes a fall of the resident 111 and a state where the resident 111 is at a danger location (for example, a bed fence).
　見守りシステム100は、入居者111に危険が生じていることを検知した場合に、そのことを介護者141,143等に報知する。報知方法の一例として、見守りシステム100は、入居者111の危険を介護者141,142の携帯端末143,144に通知する。携帯端末143,144は、当該通知を受信すると、入居者111の危険をメッセージ、音声、振動等で介護者141,142に報知する。これにより、介護者141,142は、入居者111に危険が生じていることを即座に把握でき、入居者111の元に素早く駆け付けることができる。 When the watching system 100 detects that danger has occurred to the resident 111, the watching system 100 notifies the caregivers 141 and 142, etc. of that fact. As an example of the notification method, the watching system 100 notifies the portable terminals 143 and 144 of the caregivers 141 and 142 of the danger to the resident 111. Upon receiving the notification, the portable terminals 143 and 144 notify the caregivers 141 and 142 of the danger to the resident 111 by a message, voice, vibration, or the like. Accordingly, the caregivers 141 and 142 can immediately recognize that danger has occurred to the resident 111 and can quickly rush to the resident 111.
 なお、図4には、見守りシステム100が1つのセンサーボックス119を備えている例が示されているが、見守りシステム100は、複数のセンサーボックス119を備えてもよい。また、図4には、見守りシステム100が複数の携帯端末143,144を備えている例が示されているが、見守りシステム100は、一つの携帯端末でも実現され得る。 Note that FIG. 4 illustrates an example in which the watching system 100 includes one sensor box 119, but the watching system 100 may include a plurality of sensor boxes 119. FIG. 4 shows an example in which the watching system 100 includes a plurality of mobile terminals 143 and 144. However, the watching system 100 can be realized by one mobile terminal.
 [データ構造]
 図5を参照して、クラウドサーバー150のデータ構造について説明する。図5は、クラウドサーバー150が備えるハードディスク5におけるデータの格納の一態様を表わす図である。
[data structure]
The data structure of the cloud server 150 will be described with reference to FIG. FIG. 5 is a diagram illustrating one mode of data storage in the hard disk 5 included in the cloud server 150.
 ハードディスク5は、テーブル60を保持している。テーブル60は、各居室に設けられた各センサーから送信されるデータを逐次保存している。より具体的には、テーブル60は、部屋ID61と、日時62と、X座標値63と、Y座標値64とを含む。部屋ID61は、入居者の居室を識別する。日時62は、センサーから送られた信号が取得された日時を識別する。X座標値63は、当該日時において検出された点、すなわち、入居者の位置のX座標値を表わす。Y座標値64は、当該日時において検出された点、すなわち、入居者の位置のY座標値を表わす。ある局面において、X座標値およびY座標値の元となる座標軸は、例えば、当該居室の端点(例えば、部屋の片隅)を基準として規定される。別の局面において、当該座標軸は、各居室が設けられた施設におけるある一点を基準として規定されてもよい。 The hard disk 5 holds the table 60. The table 60 sequentially stores data transmitted from each sensor provided in each living room. More specifically, the table 60 includes a room ID 61, a date and time 62, an X coordinate value 63, and a Y coordinate value 64. The room ID 61 identifies the room of the resident. The date and time 62 identifies the date and time when the signal sent from the sensor was acquired. The X coordinate value 63 indicates the point detected at the date and time, that is, the X coordinate value of the position of the resident. The Y coordinate value 64 represents the point detected at the date and time, that is, the Y coordinate value of the position of the resident. In one aspect, the coordinate axes that are the basis of the X coordinate value and the Y coordinate value are defined, for example, with reference to the end point of the living room (for example, one corner of the room). In another aspect, the coordinate axis may be defined based on a certain point in the facility where each living room is provided.
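A minimal sketch of how rows shaped like table 60 (room ID 61, date and time 62, X coordinate value 63, Y coordinate value 64) can be grouped into per-room trajectories; the row values are illustrative, not from the patent.

```python
from collections import defaultdict

# Rows shaped like table 60: (room ID, date and time, X, Y).
rows = [
    ("room110", "2019-06-01T10:00:00", 1.0, 2.0),
    ("room110", "2019-06-01T10:00:05", 1.2, 2.1),
    ("room120", "2019-06-01T10:00:00", 0.5, 0.5),
]

# Group the detected positions by room, in time order per room.
trajectories = defaultdict(list)
for room_id, timestamp, x, y in rows:
    trajectories[room_id].append((timestamp, (x, y)))
```

Each per-room list then represents the walking trajectory of the resident of that room, which later steps analyze.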
　さらに、ハードディスク5は、センサーボックス119から送られた画像データを保持している。画像データは予め定められた時間間隔で取得される。CPU1は、各画像データを用いて画像解析を行なうことにより、入居者の状態を識別することができる。例えば、CPU1は、各画像データから頭を抽出し、入居者が横になっている時間、入居者がベッド113に座っている時間、および、入居者が歩いている時間を抽出し得る。 Further, the hard disk 5 holds the image data sent from the sensor box 119. The image data is acquired at predetermined time intervals. The CPU 1 can identify the state of the resident by performing image analysis using each piece of image data. For example, the CPU 1 can extract a head from each piece of image data and extract the time during which the resident is lying down, the time during which the resident is sitting on the bed 113, and the time during which the resident is walking.
　なお、居室で歩行する人は、入居者以外に介護者や家族などの可能性もある。したがって、CPU1は、入居者以外の人の歩行軌跡を除外し得る。例えば、CPU1は、各歩行軌跡について歩行速度を算出し、入居者以外の健常者の歩行速度と推定される一定速度以上の速度が算出された歩行軌跡を、画像解析の対象から除外し得る。この場合、健常者の歩行速度として、予め測定された歩行速度や、画像解析の際に測定された歩行速度が使用できる。例えば、CPU1は、歩行軌跡から複数の時刻で検出された各点について、x座標値およびy座標値と、各点のうちのある点から次の点に至るまでの時間とに基づいて、歩行速度を算出し得る。 A person walking in the living room may be a caregiver, a family member, or another person other than the resident. Therefore, the CPU 1 can exclude the walking trajectories of persons other than the resident. For example, the CPU 1 may calculate a walking speed for each walking trajectory and exclude, from the targets of image analysis, any walking trajectory whose calculated speed is equal to or higher than a certain speed estimated to be the walking speed of a healthy person other than the resident. In this case, a walking speed measured in advance, or a walking speed measured during image analysis, can be used as the walking speed of a healthy person. For example, the CPU 1 can calculate the walking speed for each point detected at a plurality of times along the walking trajectory, based on the x- and y-coordinate values of the points and the time taken to go from one point to the next.
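The speed-based filtering just described can be sketched as follows. The trajectories and the 1.2 m/s cutoff are illustrative assumptions; the patent leaves the actual cutoff to a measured healthy-person walking speed.

```python
import math

def walking_speed(points):
    """Average speed (distance units per second) over a trajectory of
    (t_seconds, x, y) samples."""
    distance = sum(
        math.hypot(x1 - x0, y1 - y0)
        for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:])
    )
    elapsed = points[-1][0] - points[0][0]
    return distance / elapsed

# Assumed cutoff: trajectories at or above this speed are treated as a
# passing healthy person (e.g. a caregiver), not the resident.
CUTOFF_SPEED = 1.2  # m/s

resident_track = [(0, 0.0, 0.0), (2, 0.8, 0.0), (4, 1.6, 0.0)]   # 0.4 m/s
caregiver_track = [(0, 0.0, 0.0), (1, 1.5, 0.0), (2, 3.0, 0.0)]  # 1.5 m/s

kept = [t for t in (resident_track, caregiver_track)
        if walking_speed(t) < CUTOFF_SPEED]
```

Only the slower trajectory survives the filter and is passed on to the image-analysis stage.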
 図6を参照して、クラウドサーバー150のデータ構造についてさらに説明する。図6は、クラウドサーバー150のハードディスク5におけるデータの格納の一態様を概念的に表す図である。ハードディスク5は、テーブル600を格納している。テーブル600は、入居者毎に観測された、日々のデータを保持している。 With reference to FIG. 6, the data structure of the cloud server 150 will be further described. FIG. 6 is a diagram conceptually illustrating an aspect of data storage in the hard disk 5 of the cloud server 150. The hard disk 5 stores a table 600. The table 600 holds daily data observed for each resident.
 テーブル600は、入居者ID610と、年月日620と、歩行時間630と、座位時間640と、横臥時間650と、室内行動量660とを含む。入居者ID610は、観測の対象となった入居者を識別する。年月日620は、移動が観測された年月日を表わす。歩行時間630は、その年月日において入居者が歩行している時間として算出された時間を表わす。座位時間640は、その年月日において入居者が座っている時間として算出された時間を表わす。 Table 600 includes resident ID 610, date 620, walking time 630, sitting time 640, lying time 650, and indoor activity 660. The resident ID 610 identifies the resident who has been the target of observation. The date 620 indicates the date when the movement was observed. The walking time 630 indicates a time calculated as a time during which the resident is walking on that date. The sitting time 640 indicates the time calculated as the time in which the resident is sitting at that date.
 横臥時間650は、その年月日において入居者が横になっている時間として算出された時間を表わす。室内行動量660は、当該入居者の室内における行動量を表わす。室内行動量660は、ある局面において、歩行時間630と座位時間640と横臥時間650とを用いて算出される。例えば、CPU1は、歩行時間630、座位時間640および横臥時間650のそれぞれに対して予め設定された係数を乗じ、各積の和を室内行動量660として算出する。なお、各係数は、例えば以下の大小関係を満たすように設定され得る。横臥時間の係数<座位時間の係数<歩行時間の係数
 入居者の状態は上記の3つに限られず、さらに多くの状態および当該状態に応じた係数が用いられてもよい。例えば、車椅子で移動している状態、立ったままでいる状態、運動している状態等について、それぞれ係数が設定されて、室内行動量の算出に使用されてもよい。ここで、運動は、例えば、かかと上げ、屈伸等、施設の入居者の運動能力に応じた運動を含む。このような場合、各係数の大小関係は、以下のようになる。
横臥時間の係数<座位時間の係数<立っている時間の係数<車椅子で移動している時間の係数<歩行時間の係数<運動している時間の係数
 [CPU1の動作概要]
 (1)ある局面において、CPU1は、入居者の歩行軌跡を表わし、異なる日に取得された複数の軌跡データをハードディスク5から取得する。CPU1は、複数の軌跡データに基づいて、入居者による一日の複数の行動状態を識別する。行動状態は、例えば、歩行状態、座位状態、横臥状態等を含む。別の局面において、歩行状態は、自立歩行、補助装置を用いた歩行、車椅子での移動を含み得る。CPU1は、識別された各行動状態の各々の時間を算出する。CPU1は、各行動状態の時間と予め規定された係数とを乗算して積を算出し、各積を加算することにより、入居者の一日の行動量を算出する。
The lying time 650 indicates the time calculated as the time during which the resident was lying down on that date. The indoor activity amount 660 represents the amount of the resident's activity in the living room. In one aspect, the indoor activity amount 660 is calculated using the walking time 630, the sitting time 640, and the lying time 650. For example, the CPU 1 multiplies each of the walking time 630, the sitting time 640, and the lying time 650 by a preset coefficient, and calculates the sum of the products as the indoor activity amount 660. The coefficients may be set, for example, so as to satisfy the following magnitude relationship: lying time coefficient < sitting time coefficient < walking time coefficient.
The states of the resident are not limited to the above three; more states, with coefficients corresponding to those states, may be used. For example, coefficients may be set for a state of moving in a wheelchair, a state of standing, a state of exercising, and the like, and used to calculate the indoor activity amount. Here, the exercise includes, for example, exercises suited to the physical ability of the residents of the facility, such as heel raises and knee bends. In such a case, the magnitude relationship between the coefficients is as follows:
Lying time coefficient < sitting time coefficient < standing time coefficient < wheelchair moving time coefficient < walking time coefficient < exercise time coefficient
[Operation outline of CPU1]
(1) In one aspect, the CPU 1 acquires, from the hard disk 5, a plurality of pieces of trajectory data that represent the walking trajectory of the resident and were acquired on different days. Based on the plurality of pieces of trajectory data, the CPU 1 identifies a plurality of behavior states of the resident during a day. The behavior states include, for example, a walking state, a sitting state, a lying state, and the like. In another aspect, the walking state may include independent walking, walking with an assistive device, and moving in a wheelchair. The CPU 1 calculates the duration of each of the identified behavior states. The CPU 1 multiplies the duration of each behavior state by a predetermined coefficient to calculate a product, and adds the products to calculate the daily activity amount of the resident.
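The weighted-sum computation in (1) can be sketched as follows. The state durations are illustrative, and the coefficient values are example numbers that merely satisfy the ordering described above (lying < sitting < walking).

```python
# Example coefficients satisfying lying < sitting < walking.
COEFFICIENTS = {"lying": 1.0, "sitting": 1.4, "walking": 2.0}

def daily_activity_amount(hours_by_state, coefficients=COEFFICIENTS):
    """Weighted sum of the time spent in each behavior state."""
    return sum(hours * coefficients[state]
               for state, hours in hours_by_state.items())

# One day of observed state durations in hours (illustrative).
day = {"walking": 1.0, "sitting": 5.0, "lying": 18.0}
amount = daily_activity_amount(day)  # 1.0*2.0 + 5.0*1.4 + 18.0*1.0 -> 27.0
```

Extending the dictionary with further states (standing, wheelchair, exercise) and their coefficients covers the larger orderings mentioned above without changing the function.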
　(2)ある局面において、CPU1は、複数の日の各々について算出された各室内行動量を比較することにより、入居者の活動状態の変化を導出する。例えば、管理サーバー200のユーザーが、室内行動量の比較対象として、一つ以上の過去の日付を指定し得る。管理サーバー200は、その指定された日付をクラウドサーバー150に送信する。CPU1は、指定された日付の歩行軌跡データおよび画像データをハードディスク5から読み出す。CPU1は、画像データを用いて画像解析を行ない、人の状態を分類する。さらに別の局面において、CPU1が、人の画像と共に移動する物体を検出した場合には、入居者が歩行補助装置あるいは車椅子で移動している状態にあると判断する。 (2) In one aspect, the CPU 1 derives a change in the activity state of the resident by comparing the indoor activity amounts calculated for each of a plurality of days. For example, the user of the management server 200 may designate one or more past dates as comparison targets for the indoor activity amount. The management server 200 transmits the designated dates to the cloud server 150. The CPU 1 reads the walking trajectory data and the image data for the designated dates from the hard disk 5. The CPU 1 performs image analysis using the image data and classifies the state of the person. In still another aspect, when the CPU 1 detects an object that moves together with the image of a person, the CPU 1 determines that the resident is moving using a walking assist device or a wheelchair.
 (3)ある局面において、複数の行動状態は、歩行、座位、横臥を含む。歩行の時間に対する第1係数と、座位の時間に対する第2係数と、横臥の時間に対する第3係数との大小関係は、第1係数>第2係数>第3係数である。第1係数、第2係数および第3係数は、例えば、各行動状態と消費エネルギーとが正の相関となる関係で規定され得る。各係数は、その大小関係を維持する範囲で、判断の目的に応じて、変更可能である。例えば、各係数は、各入居者の年齢、要介護度のレベル、認知症の有無等に応じて設定され得る。 (3) In one aspect, the plurality of behavior states include walking, sitting, and lying down. The magnitude relationship among the first coefficient for the walking time, the second coefficient for the sitting time, and the third coefficient for the lying time is: first coefficient> second coefficient> third coefficient. The first coefficient, the second coefficient, and the third coefficient can be defined, for example, in a relationship in which each behavior state and the consumed energy have a positive correlation. Each coefficient can be changed according to the purpose of the judgment within a range that maintains the magnitude relation. For example, each coefficient can be set according to the age of each resident, the level of the degree of need for nursing care, the presence or absence of dementia, and the like.
 [制御構造]
 図7を参照して、クラウドサーバー150の制御構造について説明する。図7は、ある実施の形態に従うクラウドサーバー150のCPU1が実行する処理の一部を表わすフローチャートである。
[Control structure]
The control structure of the cloud server 150 will be described with reference to FIG. FIG. 7 is a flowchart showing a part of a process executed by CPU 1 of cloud server 150 according to an embodiment.
 ステップS710にて、CPU1は、ハードディスク5から、画像データを取得する。
 ステップS720にて、CPU1は、複数の画像データに基づいて、入居者による一日の複数の行動状態を識別する。例えば、CPU1は、画像データを用いて画像解析を行ない、入居者の状態を分類する。ある局面において、CPU1は、画像解析の結果に基づいて、入居者が座っているか、横になっているか、歩いているかを判断する。この判断は、画像解析の結果から得られる人の頭の場所、向き、移動の有無等に基づいて行なわれる。
In step S710, CPU 1 acquires image data from hard disk 5.
At step S720, the CPU 1 identifies a plurality of behavior states of the resident during a day based on the plurality of pieces of image data. For example, the CPU 1 performs image analysis using the image data and classifies the state of the resident. In one aspect, the CPU 1 determines whether the resident is sitting, lying down, or walking based on the result of the image analysis. This determination is made based on the location and orientation of the person's head, the presence or absence of its movement, and the like, obtained from the result of the image analysis.
　ステップS730にて、CPU1は、識別された各行動状態の各々の時間を算出する。例えば、CPU1は、各状態が継続している時間を算出し、算出結果を集計し、歩行時間630、座位時間640および横臥時間650を算出する。 At step S730, the CPU 1 calculates the duration of each identified behavior state. For example, the CPU 1 calculates the time during which each state continues, totals the calculation results, and calculates the walking time 630, the sitting time 640, and the lying time 650.
　ステップS740にて、CPU1は、各行動状態の時間と、各行動状態について予め規定された係数とを乗算して積を算出する。なお、この係数は、入居者間で同じ係数でも異なる係数でもよい。また、ある局面において、係数は、入居者の年齢、性別、要介護度の程度等に応じて規定されてもよい。 At step S740, the CPU 1 calculates a product by multiplying the duration of each behavior state by a coefficient defined in advance for that behavior state. The coefficients may be the same or different among residents. In one aspect, the coefficients may be defined according to the resident's age, sex, degree of required nursing care, and the like.
 ステップS750にて、CPU1は、各積を加算することにより、入居者の一日の室内行動量を算出する。なお、室内行動量の算出単位は、一日に限られない。別の局面において、CPU1は、管理者等の設定に応じて、例えば、一週間の室内行動量、一カ月の室内行動量その他の単位の行動量を算出してもよい。 In step S750, CPU 1 calculates the daily indoor activity amount of the resident by adding the products. Note that the calculation unit of the indoor activity amount is not limited to one day. In another aspect, the CPU 1 may calculate, for example, one week's indoor activity, one month's indoor activity, or other unit activity according to the setting of the manager or the like.
　ステップS760にて、CPU1は、複数の日の各々について算出された各行動量を比較する。例えば、CPU1は、室内行動量を算出するべき日として指定された日、その3カ月前の日およびその6ヵ月前の日について、室内行動量をそれぞれ算出し、推移を比較し得る。 At step S760, the CPU 1 compares the activity amounts calculated for each of the plurality of days. For example, the CPU 1 can calculate the indoor activity amounts for the day designated as the day for which the indoor activity amount is to be calculated, the day three months before it, and the day six months before it, and compare the transition.
　ステップS770にて、CPU1は、比較の結果に基づいて、入居者の活動状態の変化を導出する。例えば、CPU1は、入居者毎に、現時点での室内行動量と過去の室内行動量とを比較し、室内行動量の変化の有無を検出する。CPU1は、室内行動量の低下がみられる入居者を検出すると、当該入居者について、歩行時間、座位時間および横臥時間と共に室内行動量を表わす結果をモニター8あるいは帳票に出力し得る。 At step S770, the CPU 1 derives a change in the activity state of the resident based on the result of the comparison. For example, for each resident, the CPU 1 compares the current indoor activity amount with the past indoor activity amounts and detects whether the indoor activity amount has changed. When the CPU 1 detects a resident whose indoor activity amount has decreased, the CPU 1 can output, to the monitor 8 or a report, a result showing the indoor activity amount of that resident together with the walking time, the sitting time, and the lying time.
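Steps S760 and S770 can be sketched as follows: activity amounts are compared across a designated day and earlier days, and a resident whose amount keeps decreasing is flagged. The dates and values are illustrative.

```python
def activity_decreasing(amounts_by_date):
    """True if the indoor activity amount strictly decreases from the
    oldest compared day to the newest (ISO-format date keys sort
    chronologically)."""
    amounts = [amounts_by_date[d] for d in sorted(amounts_by_date)]
    return all(later < earlier
               for earlier, later in zip(amounts, amounts[1:]))

# Activity amounts for the designated day and the days 3 and 6 months
# before it (illustrative values).
history = {
    "2018-12-01": 27.0,  # six months before the designated day
    "2019-03-01": 24.5,  # three months before
    "2019-06-01": 21.0,  # the designated day
}
flagged = activity_decreasing(history)
```

A flagged resident would then have the walking, sitting, and lying times output together with the activity amounts, as described above.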
 [表示態様]
 図8を参照して、入居者の活動状態の変化の表示態様の一例について説明する。図8は、ある実施の形態に従ってモニター8に表示される入居者の活動状態の結果と室内行動量との推移を表わす図である。
[Display mode]
An example of a display mode of a change in the activity state of the resident will be described with reference to FIG. FIG. 8 is a diagram illustrating a transition between the result of the activity of the resident and the amount of indoor activity displayed on the monitor 8 according to an embodiment.
 管理サーバー200のユーザーが室内行動量の算出を指示すると、管理サーバー200は、当該指示をクラウドサーバー150に送る。クラウドサーバー150のCPU1は、図7に示される処理を行ない、処理の結果を管理サーバー200に送信する。管理サーバー200のモニター8は、処理の結果を表示する。 When the user of the management server 200 instructs calculation of the indoor activity amount, the management server 200 sends the instruction to the cloud server 150. The CPU 1 of the cloud server 150 performs the processing illustrated in FIG. 7 and transmits the result of the processing to the management server 200. The monitor 8 of the management server 200 displays the result of the processing.
 より具体的には、モニター8は、ある入居者（Aさん）について、今月（例えば、昨日）、その3カ月前の日、およびその6ヵ月前の日における歩行時間を示す棒グラフ810、座位時間を示す棒グラフ820および横臥時間を示す棒グラフ830をそれぞれ表示する。さらに、モニター8は、棒グラフ810に対応する歩行時間と棒グラフ820に対応する座位時間と棒グラフ830に対応する横臥時間とを用いて前述のように算出した室内行動量を示すグラフ840を表示する。このようにすると、当該入居者の介護スタッフ、ケアマネジャーその他のユーザーは、入居者の行動状態の変化を客観的に把握することができるので、要介護度の判定等も客観的に行なうことが可能になり、判定結果の納得性および透明性が高まり得る。 More specifically, for a certain resident (Mr. A), the monitor 8 displays a bar graph 810 showing walking time, a bar graph 820 showing sitting time, and a bar graph 830 showing lying time for the current month (for example, yesterday), for the day three months earlier, and for the day six months earlier. Further, the monitor 8 displays a graph 840 showing the indoor activity amount calculated, as described above, from the walking time of bar graph 810, the sitting time of bar graph 820, and the lying time of bar graph 830. In this way, the resident's care staff, care manager, and other users can objectively grasp changes in the resident's behavioral state, so that, for example, the required level of nursing care can also be determined objectively, which can make the determination more convincing and transparent.
 図8の例では、例えば、歩行時間の係数=2.0、座位時間の係数=1.4、横臥時間の係数=1.0として室内行動量は、以下の式で導出される。 In the example of FIG. 8, for example, the indoor activity amount is derived by the following equation, assuming that the coefficient of walking time = 2.0, the coefficient of sitting time = 1.4, and the coefficient of lying time = 1.0.
 室内行動量=歩行時間×2.0+座位時間×1.4+横臥時間×1.0
 別の局面でさらに多くの時間に分類される場合もあり得る。この場合、以下のように室内行動量の係数は設定され得る。
Indoor activity = walking time x 2.0 + sitting time x 1.4 + lying down time x 1.0
In another aspect, the behavior may be classified into a larger number of states. In this case, the coefficients for the indoor activity amount can be set as follows.
 室内行動量=横臥時間×0.5+座位時間×0.8+立っている時間(移動なし)×1.0+車椅子で自走している時間×1.3+歩行時間×1.5+運動時間×2.0
 なお、比較対象となる過去の日は、3カ月前および6ヵ月前に限られない。1週間前および2週間前のように比較が毎週の実績に基づいて行なわれてもよい。あるいは1カ月前および2カ月前のように比較が毎月の実績に基づいて行なわれてもよい。
Indoor activity = lying time x 0.5 + sitting time x 0.8 + standing time (no movement) x 1.0 + time self-propelling in a wheelchair x 1.3 + walking time x 1.5 + exercise time x 2.0
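The weighted sums above can be written directly as code. The following is a minimal sketch: the state names and coefficient values are taken from the six-state example in the text, while the sample durations are assumptions introduced for illustration.

```python
# Coefficients per behavior state, matching the six-state example above.
COEFFICIENTS = {
    "lying": 0.5,
    "sitting": 0.8,
    "standing": 1.0,     # standing, no movement
    "wheelchair": 1.3,   # self-propelling in a wheelchair
    "walking": 1.5,
    "exercise": 2.0,
}

def indoor_activity(state_hours):
    """Weighted sum of hours spent in each behavior state."""
    return sum(COEFFICIENTS[state] * hours for state, hours in state_hours.items())

# Hypothetical day: 8 h lying, 6 h sitting, 1.5 h walking, 0.5 h exercise.
day = {"lying": 8.0, "sitting": 6.0, "walking": 1.5, "exercise": 0.5}
print(round(indoor_activity(day), 2))  # 4.0 + 4.8 + 2.25 + 1.0 = 12.05
```

States absent on a given day simply contribute nothing to the sum, so the same formula covers both the three-state and six-state classifications.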
The past days to be compared are not limited to three months and six months ago. Comparisons may be made based on weekly performance, such as one week ago and two weeks ago. Alternatively, the comparison may be made based on monthly results, such as one month ago and two months ago.
 また、別の局面において、複数の入居者の各室内行動量が比較されてもよい。比較することにより、各入居者の状態の変化を検出しやすくなる。 In another aspect, the indoor activity amounts of a plurality of residents may be compared. Comparing them makes it easier to detect a change in each resident's state.
 [実施の形態のまとめ]
 以上のようにして、本実施の形態によれば、見守り対象者の画像データが逐次取得される。システムは、移動軌跡データと画像データとを用いて見守り対象者の状態を、歩行状態、座位状態、横臥状態、車椅子で移動している状態、立っている状態、運動している状態等に分類する。システムは、分類した状態の時間を集計し、各状態について設定された係数を当該状態の時間に乗じて積を算出し、各積の和を室内行動量として算出する。このようにすると、一人の入居者の様々な行動を室内行動量という一つの値で表現することができるので、各状態の時間が変わった場合でも、当該入居者の全体としての行動量を把握することができる。室内行動量は、センサーボックス119から送られるデータ(例えば、カメラ105からの画像データ、ドップラーセンサー106からの出力データ)を用いて算出されるので、判断者による主観が排除される。これにより、入居者の行動量が客観的に示されるので、例えば、入居者の要介護度の判定も客観的に行なうことが可能となり、判定結果への納得性も高め得る。
[Summary of Embodiment]
As described above, according to the present embodiment, image data of the person being watched over is acquired sequentially. Using the movement trajectory data and the image data, the system classifies the person's state into a walking state, a sitting state, a lying state, a state of moving in a wheelchair, a standing state, an exercising state, and the like. The system totals the time spent in each classified state, multiplies each state's time by the coefficient set for that state, and calculates the sum of the products as the indoor activity amount. In this way, a resident's various behaviors can be expressed as a single value, the indoor activity amount, so that even when the time spent in each state changes, the resident's overall amount of activity can be grasped. Since the indoor activity amount is calculated from data sent from the sensor box 119 (for example, image data from the camera 105 and output data from the Doppler sensor 106), subjective judgment by an observer is eliminated. Because the resident's amount of activity is thus shown objectively, the required level of nursing care, for example, can also be determined objectively, which can make the determination more convincing.
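The summarized pipeline (classify each observation, total the time per state, then form the weighted sum) might look like the following sketch. The one-minute sampling interval, the sequence of labels, and the coefficient values are illustrative assumptions, not details of the disclosed system.

```python
from collections import defaultdict

# Illustrative coefficients for the states used in this example.
COEFFICIENTS = {"lying": 0.5, "sitting": 0.8, "walking": 1.5, "exercise": 2.0}

def activity_from_samples(states, minutes_per_sample=1.0):
    """Total the minutes spent in each classified state, convert to
    hours, and return the coefficient-weighted sum."""
    minutes = defaultdict(float)
    for state in states:
        minutes[state] += minutes_per_sample
    return sum(COEFFICIENTS[s] * (m / 60.0) for s, m in minutes.items())

# One classified label per one-minute sample over part of a day:
# 480 min lying, 360 min sitting, 90 min walking, 30 min exercise.
samples = ["lying"] * 480 + ["sitting"] * 360 + ["walking"] * 90 + ["exercise"] * 30
print(round(activity_from_samples(samples), 2))  # 12.05
```

In the disclosed system the classification step itself would come from the image and movement-trajectory data; here it is stubbed out as a ready-made label sequence.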
 今回開示された実施の形態はすべての点で例示であって制限的なものではないと考えられるべきである。本発明の範囲は上記した説明ではなくて請求の範囲によって示され、請求の範囲と均等の意味および範囲内でのすべての変更が含まれることが意図される。 The embodiments disclosed this time are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the terms of the claims, rather than the description above, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.
 本技術は、病院、老人ホーム、養護施設その他の施設で得られるデータの情報処理に適用可能である。 This technology is applicable to information processing of data obtained in hospitals, homes for the elderly, nursing facilities, and other facilities.
 51 洗面台、52 机、60 テーブル、100 システム、101,221 制御装置、105 カメラ、106 ドップラーセンサー、107 無線通信装置、108,228 記憶装置、110,120 居室、111,121 入居者、112 タンス、113 ベッド、114 トイレ、115 ケアコール子機、116 トイレセンサー、117 センサー、118 ドアセンサー、119 センサーボックス、130 管理センター、140 アクセスポイント、141,142 介護者、143,144 携帯端末、150 クラウドサーバー、180 居室領域、190 ネットワーク、200 管理サーバー。 51 wash basin, 52 desk, 60 table, 100 system, 101,221 control device, 105 camera, 106 Doppler sensor, 107 wireless communication device, 108,228 storage device, 110, 120 living room, 111, 121 resident, 112 closet , 113 beds, 114 toilets, 115 care call handsets, 116 toilet sensors, 117 sensors, 118 door sensors, 119 sensor boxes, 130 administration centers, 140 access points, 141, 142 caregivers, 143, 144 mobile terminals, 150 cloud servers , 180 living room area, 190 network, 200 management server.

Claims (15)

  1.  コンピューターで実行されるプログラムであって、前記プログラムは前記コンピューターに、
     入居者の行動に関するデータを取得するステップと、
     前記入居者の行動に関するデータに基づいて、前記入居者による複数の行動状態を識別し、識別された各前記行動状態の各々の時間を算出するステップと、
     各前記行動状態の時間と、各前記行動状態について予め規定された係数とに基づいて、前記入居者の行動量を算出するステップとを実行させる、プログラム。
    A program executed by a computer, the program causing the computer to execute:
    a step of acquiring data on a resident's behavior;
    a step of identifying, based on the data on the resident's behavior, a plurality of behavior states of the resident and calculating the time of each identified behavior state; and
    a step of calculating an activity amount of the resident based on the time of each behavior state and a coefficient predefined for each behavior state.
  2.  前記プログラムは前記コンピューターに、複数の期間について算出された各前記行動量を比較することにより、前記入居者の活動状態の変化を導出するステップをさらに実行させる、請求項1に記載のプログラム。 The program according to claim 1, wherein the program causes the computer to further execute a step of deriving a change in the activity state of the resident by comparing each of the activity amounts calculated for a plurality of periods.
  3.  前記複数の行動状態は、歩行、座位、横臥、車椅子での移動の少なくとも二つ以上を含む、請求項1または2に記載のプログラム。 The program according to claim 1 or 2, wherein the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
  4.  前記複数の行動状態は、歩行、座位、横臥を含み、
     前記歩行の時間に対する第1係数と、前記座位の時間に対する第2係数と、前記横臥の時間に対する第3係数との大小関係は、前記第1係数>前記第2係数>前記第3係数である、請求項1または2に記載のプログラム。
    The plurality of behavior states include walking, sitting, and lying down, and
    the magnitude relationship among a first coefficient for the walking time, a second coefficient for the sitting time, and a third coefficient for the lying time is: the first coefficient > the second coefficient > the third coefficient. The program according to claim 1 or 2.
  5.  前記取得するステップは、前記入居者の居室における移動軌跡データを集計することを含む、請求項1~4のいずれかに記載のプログラム。 The program according to any one of claims 1 to 4, wherein the acquiring step includes aggregating movement trajectory data of the resident in the resident's room.
  6.  メモリーと、
     前記メモリーに結合されたプロセッサーとを備え、
     前記プロセッサーは、
     入居者の行動に関するデータを取得し、
     前記入居者の行動に関するデータに基づいて、前記入居者による複数の行動状態を識別し、識別された各前記行動状態の各々の時間を算出し、
     各前記行動状態の時間と、各前記行動状態について予め規定された係数とに基づいて、前記入居者の一日の行動量を算出するように構成されている、情報処理装置。
    An information processing apparatus comprising:
    a memory; and
    a processor coupled to the memory,
    wherein the processor is configured to:
    acquire data on a resident's behavior;
    identify, based on the data on the resident's behavior, a plurality of behavior states of the resident and calculate the time of each identified behavior state; and
    calculate a daily activity amount of the resident based on the time of each behavior state and a coefficient predefined for each behavior state.
  7.  前記プロセッサーは、複数の期間について算出された各前記行動量を比較することにより、前記入居者の活動状態の変化を導出するようにさらに構成されている、請求項6に記載の情報処理装置。 The information processing apparatus according to claim 6, wherein the processor is further configured to derive a change in the activity state of the resident by comparing each of the activity amounts calculated for a plurality of periods.
  8.  前記複数の行動状態は、歩行、座位、横臥、車椅子での移動の少なくとも二つ以上を含む、請求項6または7に記載の情報処理装置。 The information processing apparatus according to claim 6 or 7, wherein the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
  9.  前記複数の行動状態は、歩行、座位、横臥を含み、
     前記歩行の時間に対する第1係数と、前記座位の時間に対する第2係数と、前記横臥の時間に対する第3係数との大小関係は、前記第1係数>前記第2係数>前記第3係数である、請求項6または7に記載の情報処理装置。
    The plurality of behavior states include walking, sitting, and lying down, and
    the magnitude relationship among a first coefficient for the walking time, a second coefficient for the sitting time, and a third coefficient for the lying time is: the first coefficient > the second coefficient > the third coefficient. The information processing apparatus according to claim 6 or 7.
  10.  前記取得することは、前記入居者の居室における移動軌跡データを集計することを含む、請求項6~9のいずれかに記載の情報処理装置。 The information processing apparatus according to any one of claims 6 to 9, wherein the acquiring includes aggregating movement trajectory data of the resident in the resident's room.
  11.  コンピューターで実行される方法であって、
     入居者の行動に関するデータを取得するステップと、
     前記入居者の行動に関するデータに基づいて、前記入居者による複数の行動状態を識別し、識別された各前記行動状態の各々の時間を算出するステップと、
     各前記行動状態の時間と、各前記行動状態について予め規定された係数とに基づいて、前記入居者の一日の行動量を算出するステップとを含む、方法。
    A computer-implemented method comprising:
    acquiring data on a resident's behavior;
    identifying, based on the data on the resident's behavior, a plurality of behavior states of the resident and calculating the time of each identified behavior state; and
    calculating a daily activity amount of the resident based on the time of each behavior state and a coefficient predefined for each behavior state.
  12.  複数の期間について算出された各前記行動量を比較することにより、前記入居者の活動状態の変化を導出するステップをさらに含む、請求項11に記載の方法。 12. The method of claim 11, further comprising: deriving a change in the resident's activity by comparing each of the calculated amounts of activity for a plurality of time periods.
  13.  前記複数の行動状態は、歩行、座位、横臥、車椅子での移動の少なくとも二つ以上を含む、請求項11または12に記載の方法。 The method according to claim 11 or 12, wherein the plurality of behavior states include at least two of walking, sitting, lying down, and moving in a wheelchair.
  14.  前記複数の行動状態は、歩行、座位、横臥を含み、
     前記歩行の時間に対する第1係数と、前記座位の時間に対する第2係数と、前記横臥の時間に対する第3係数との大小関係は、前記第1係数>前記第2係数>前記第3係数である、請求項11または12に記載の方法。
    The plurality of behavior states include walking, sitting, and lying down, and
    the magnitude relationship among a first coefficient for the walking time, a second coefficient for the sitting time, and a third coefficient for the lying time is: the first coefficient > the second coefficient > the third coefficient. The method according to claim 11 or 12.
  15.  前記取得するステップは、前記入居者の居室における移動軌跡データを集計することを含む、請求項11~14のいずれかに記載の方法。 The method according to any one of claims 11 to 14, wherein the acquiring step includes aggregating movement trajectory data of the resident in the resident's room.
PCT/JP2019/022468 2018-06-26 2019-06-06 Computer executable program, information processing device, and computer execution method WO2020003952A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020527343A JP7327397B2 (en) 2018-06-26 2019-06-06 Computer-implemented programs, information processing systems, and computer-implemented methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018121239 2018-06-26
JP2018-121239 2018-06-26

Publications (1)

Publication Number Publication Date
WO2020003952A1 true WO2020003952A1 (en) 2020-01-02

Family

ID=68986507

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/022468 WO2020003952A1 (en) 2018-06-26 2019-06-06 Computer executable program, information processing device, and computer execution method

Country Status (2)

Country Link
JP (1) JP7327397B2 (en)
WO (1) WO2020003952A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022059438A1 (en) * 2020-09-17 2022-03-24 コニカミノルタ株式会社 Information processing device, information processing system, program, and recording medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08131425A (en) * 1994-09-12 1996-05-28 Omron Corp Exercising quantity measuring instrument
JP2003288656A (en) * 2002-03-28 2003-10-10 Ntt Comware Corp Safety reporting device, safety reporting method, program for safety reporting device, and storage medium for safety reporting device
JP2007093433A (en) * 2005-09-29 2007-04-12 Hitachi Ltd Detector for motion of pedestrian

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016185738A1 (en) * 2015-05-20 2016-11-24 ノ-リツプレシジョン株式会社 Image analysis device, image analysis method, and image analysis program
JP6874679B2 (en) * 2015-05-27 2021-05-19 コニカミノルタ株式会社 Monitoring device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08131425A (en) * 1994-09-12 1996-05-28 Omron Corp Exercising quantity measuring instrument
JP2003288656A (en) * 2002-03-28 2003-10-10 Ntt Comware Corp Safety reporting device, safety reporting method, program for safety reporting device, and storage medium for safety reporting device
JP2007093433A (en) * 2005-09-29 2007-04-12 Hitachi Ltd Detector for motion of pedestrian

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022059438A1 (en) * 2020-09-17 2022-03-24 コニカミノルタ株式会社 Information processing device, information processing system, program, and recording medium

Also Published As

Publication number Publication date
JPWO2020003952A1 (en) 2021-08-02
JP7327397B2 (en) 2023-08-16

Similar Documents

Publication Publication Date Title
US9940822B2 (en) Systems and methods for analysis of subject activity
AU2013296153A1 (en) A system, method, software application and data signal for determining movement
US10813593B2 (en) Using visual context to timely trigger measuring physiological parameters
Miller et al. Smart homes that detect sneeze, cough, and face touching
JP7468350B2 (en) Condition monitoring device and control method for condition monitoring device
JP7435459B2 (en) Condition monitoring device and condition monitoring method
WO2020003952A1 (en) Computer executable program, information processing device, and computer execution method
WO2020075675A1 (en) Care system management method, management device and program
JP7342863B2 (en) Computer-executed programs, information processing systems, and computer-executed methods
JP7255359B2 (en) Program, Information Notification Apparatus, and Computer Implemented Method for Posting Information
JP7276336B2 (en) Computer-implemented programs, information processing systems, and computer-implemented methods
JP7205540B2 (en) Computer Executed Programs, Information Processing Devices, and Computer Executed Methods
JP7215481B2 (en) Computer Executed Programs, Information Processing Devices, and Computer Executed Methods
JP2021174189A (en) Method of assisting in creating menu of service, method of assisting in evaluating user of service, program causing computer to execute the method, and information providing device
JP7371624B2 (en) Programs that run on computers, information processing devices, and methods that run on computers
JP7310327B2 (en) Behavior detection device, system provided with same, behavior detection method, and program
WO2021215207A1 (en) Method performed by computer to provide information about care recipient, program, and information providing device
WO2022059438A1 (en) Information processing device, information processing system, program, and recording medium
Liao et al. An empirical study on engineering a real-world smart ward using pervasive technologies
WO2020137061A1 (en) Information display method, program, and information display device
JP2023105966A (en) Method and program executed by computer to detect change in state of resident, and resident state change detection device
JP2023108852A (en) Display device, display system, display method, and display program
JP2021196739A (en) Assistance degree estimation method, program, and information processing device
JP2021064327A (en) Method executed by computer for proposing health care content, program for making computer execute the method, and information processing device
JP2020057222A (en) Status monitoring device and status monitoring method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19826474

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020527343

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19826474

Country of ref document: EP

Kind code of ref document: A1