WO2020003704A1 - Control program, report output method, and report output device - Google Patents

Control program, report output method, and report output device

Info

Publication number
WO2020003704A1
WO2020003704A1 (PCT/JP2019/016684)
Authority
WO
WIPO (PCT)
Prior art keywords
event
information
subject
report
image data
Prior art date
Application number
PCT/JP2019/016684
Other languages
French (fr)
Japanese (ja)
Inventor
寛 古川
武士 阪口
海里 姫野
恵美子 寄崎
遠山 修
藤原 浩一
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2020527229A (patent JP7327396B2)
Publication of WO2020003704A1

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G12/00 Accommodation for nursing, e.g. in hospitals, not covered by groups A61G1/00 - A61G11/00, e.g. trolleys for transport of medicaments or food; Prescription lists
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/22 Social work
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00 ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 ICT specially adapted for calculating health indices; for individual health risk assessment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M9/00 Arrangements for interconnection not involving centralised switching

Definitions

  • the present invention relates to a control program, a report output method, and a report output device, and more particularly, to a control program, a report output method, and a report output device for outputting a report on a target person to be handled by staff.
  • Japan's life expectancy has been prolonged remarkably thanks to the higher living standards, better sanitary conditions, and improved medical care that followed postwar economic growth. Combined with a declining birth rate, this has produced an aging society with a high proportion of elderly people. In such a society, the number of people who need nursing care because of illness, injury, aging, and the like is expected to increase.
  • In the nurse call system disclosed in Patent Document 1, whether a patient has gotten up or left the bed is determined from images captured by a camera installed above the patient's bed.
  • the image captured by the camera is displayed on the nurse call master provided in the nurse station.
  • Patent Document 2 discloses a medical/nursing-care support device that determines whether or not a near-miss incident requiring a countermeasure has occurred.
  • This support device calculates the similarity (the "correction notification hit rate") between a case record input by a caregiver and past near-miss incidents recorded in a database, and identifies cases whose hit rate is equal to or greater than a predetermined value as countermeasure information that needs improvement.
  • The nurse call system of Patent Document 1 can detect a patient getting up or leaving the bed, but it cannot prevent an accident or a near miss. Likewise, the support device of Patent Document 2 identifies flaws in past countermeasures by comparing a near miss that has occurred with past cases; it does not prevent the occurrence of near misses that could lead to accidents.
  • The present invention has been made in view of the above circumstances, and its object is to provide a control program, a report output method, and a report output device capable of outputting, by analyzing events that may lead to accidents, a report useful for preventing the occurrence of such events.
  • The control program according to any one of (1) to (3) above, wherein the staff responding to the target person carries a mobile terminal, and wherein information indicating that the staff has confirmed the event, input by the staff through the mobile terminal in response to the occurrence of the event, is included in the event information.
  • The control program according to any one of (1) to (4), wherein the procedure (b) includes a procedure (b1) of displaying the image data of the target person from before and after a time at which the number of occurrences of the event per unit period increased, and a procedure (b2) of receiving a comment input by the user, and wherein in the procedure (c), the report including the comment received in the procedure (b2) is created.
  • The control program further includes a step (d) of creating a comment that calls attention to the event of the target person.
  • the report created in the step (b) includes the image data of the target person before and after the period when the number of occurrences of the event per unit period increases.
  • the control program according to any one of (1) to (6).
  • Event information of a predetermined event that may lead to an accident, and image data associated with the event information, are acquired, and the occurrence status of the event is analyzed.
  • A report in which the analysis result is visualized is then created and output. This makes it possible to output a report useful for preventing the occurrence of an event that may lead to an accident.
  • FIG. 3 is a block diagram illustrating a schematic configuration of a detection unit.
  • FIG. 4 is a block diagram illustrating a schematic configuration of a server. FIG. 5 is a block diagram showing a schematic configuration of a staff terminal. FIG. 6 is a sequence chart showing a processing procedure of the watching system.
  • FIG. 7 is an example of an event list stored in a storage unit. FIG. 8 is an example of an operation screen for confirming an event displayed on a staff terminal. FIG. 9 is a flowchart illustrating a procedure of a report output process according to the first embodiment.
  • FIG. 13 is a flowchart illustrating a procedure of a report output process in a modified example.
  • FIG. 1 is a diagram illustrating an entire configuration of a watching system according to the present embodiment
  • FIG. 2 is a diagram illustrating an example of a detection unit installed around a bed in a room of a subject.
  • the watching system 1 includes a plurality of detection units 10, a server 20, a fixed terminal 30, and one or more staff terminals 40.
  • the server 20 functions as a report output device. These are communicably connected to each other by wire or wireless via a network 50 such as a LAN (Local Area Network), a telephone network or a data communication network.
  • The network 50 may include relay equipment that relays communication signals, such as a repeater, a bridge, a router, or a cross-connect.
  • the staff terminal 40, the detection unit 10, the server 20, and the fixed terminal 30 are mutually connected by a network 50 such as a wireless LAN (for example, a LAN according to the IEEE 802.11 standard) including an access point 51. Connected for communication.
  • the watching system 1 is provided at an appropriate place according to the target person 70.
  • The target person 70 (also referred to as a watching target person or a care target person) may be, for example, a patient who needs nursing due to illness or injury, a care receiver whose physical abilities have declined with age, or a person living alone.
  • the target person 70 may be a person who needs to be detected when a predetermined inconvenient event such as an abnormal state occurs in the person.
  • the watching system 1 is suitably installed in buildings such as hospitals, welfare facilities for the elderly, and dwelling units, depending on the type of the subject 70.
  • The watching system 1 is arranged in a facility that includes a plurality of living rooms occupied by a plurality of subjects 70 as well as other rooms such as a nurse station.
  • the detection unit 10 is arranged in each living room, which is the observation area of the subject 70.
  • Four detection units 10 are arranged in the respective rooms of the subjects 70: Mr. A, Mr. B, Mr. C, and Mr. D.
  • the bed 60 is included in the observation area of the detection unit 10.
  • Staff 80 (also referred to as nursing staff or care staff) who provide nursing or care for the subjects 70 each carry a staff terminal 40, which is a portable terminal.
  • the server 20 may not be located at the nurse station, and may be an external server unit connected to the network 50.
  • the fixed terminal 30 may be omitted, and the server 20 or the staff terminal 40 may perform the function.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the detection unit.
  • the detection unit 10 includes a control unit 11, a communication unit 12, a camera 13, a nurse call unit 14, and a voice input / output unit 15, which are interconnected by a bus.
  • The control unit 11 includes a CPU (Central Processing Unit) and memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and controls each unit of the detection unit 10 and performs arithmetic processing according to a program.
  • The control unit 11 may further include an HDD (Hard Disk Drive) as storage.
  • the communication unit 12 is an interface circuit (for example, a LAN card) for communicating with another device such as the server 20, the fixed terminal 30, or the staff terminal 40 via the network 50.
  • the camera 13 is arranged, for example, on the ceiling of a living room or on an upper part of a wall, and captures an area including the bed 60 of the subject 70 directly below as an observation area, and outputs a captured image (image data).
  • This photographed image includes a still image and a moving image.
  • Although the camera 13 is a near-infrared camera, a visible-light camera may be used instead, or the two may be used in combination.
  • the control unit 11 determines (recognizes) the occurrence of a predetermined action of the target person 70 from the image captured by the camera 13.
  • The predetermined actions to be determined include "getting up" (rising on the bed 60), "leaving the bed" (moving away from the bed 60), "falling" (falling from the bed 60), and "falling down" (falling over on the floor or the like).
  • The control unit 11 detects a person's silhouette (hereinafter, a "human silhouette") from a plurality of captured images (a moving image).
  • The human silhouette can be detected, for example, by a time-difference method that extracts the differences between images captured at successive times and takes the range of pixels whose difference is relatively large.
  • Alternatively, the human silhouette may be detected by a background-difference method that extracts the difference between the captured image and a background image. Getting up, leaving the bed, falling, and falling down are then recognized from the posture of the subject 70 estimated from the detected human silhouette (for example, standing, sitting, or lying down) and from the silhouette's position relative to fixtures such as the bed 60 in the living room.
  • Such recognition may be performed by a program processed by the CPU of the control unit 11, or may be performed by a built-in processing circuit.
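The time-difference method described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the frame format (2-D lists of 0 to 255 grayscale values) and the threshold value are assumptions.

```python
def silhouette_mask(prev_frame, curr_frame, threshold=30):
    """Time-difference method: mark pixels whose grayscale value changed by
    more than `threshold` between two consecutive frames as moving, i.e. as
    belonging to the human silhouette."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_frame, curr_frame)]

# Toy frames: a bright 2x2 block appears between two consecutive frames.
prev = [[0] * 8 for _ in range(8)]
curr = [row[:] for row in prev]
curr[2][2] = curr[2][3] = curr[3][2] = curr[3][3] = 200  # subject moved here
mask = silhouette_mask(prev, curr)
print(sum(v for row in mask for v in row))  # 4 moving pixels
```

The background-difference variant is the same computation with a fixed background image in place of the previous frame.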
  • the control unit 11 determines the type of event and whether or not the event has occurred.
  • the server 20 may perform all or most of these recognitions, and the control unit 11 may transmit only the captured image to the server 20.
  • the control unit 11 transmits a notification to the effect that the action (an event to be described later) has occurred to the server 20 or the like.
  • the staff 80 is a person who performs various responses to the target person 70 according to the business. Services can include medical services and nursing care services.
  • In the present embodiment, the work of the staff 80 is care work for the target person 70.
  • The response to each event is as follows. When "getting up" is determined as an event and the determination falls within a predetermined time window (the wake-up time set at the facility, for example, 7 to 8 a.m.), morning care is performed. Morning care includes face washing, tooth-brushing assistance, denture fitting, assistance with changing clothes, and the like.
  • a wheelchair transfer or walking assistance may be required.
  • an alert may be generated at a fixed time by the nurse call unit 14 or the like.
  • the nurse call unit 14 includes a push-button switch, and detects a nurse call (also called a care call) when the switch is pressed by the subject 70.
  • the nurse call may be detected by a voice microphone instead of the push button type switch.
  • When a nurse call is detected, the control unit 11 transmits a notification (nurse call notification) indicating that there is a nurse call to the server 20 or the like via the communication unit 12 and the network 50.
  • the voice input / output unit 15 is, for example, a speaker and a microphone, and enables voice communication by transmitting and receiving a voice signal to / from the staff terminal 40 or the like via the communication unit 12.
  • the voice input / output unit 15 may be connected to the detection unit 10 via the communication unit 12 as an external device of the detection unit 10.
  • The detection unit 10 may further include a Doppler-shift body motion sensor that transmits microwaves toward the bed 60 and detects the Doppler shift of the waves reflected by body movements (for example, respiratory movement) of the subject 70.
  • The body motion sensor detects the chest movement (up-and-down movement of the chest) accompanying the breathing of the subject 70, and recognizes a body motion abnormality when it detects a disturbance in the cycle of the chest movement or a chest movement amplitude equal to or less than a preset threshold value.
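The amplitude and cycle thresholds described for the body motion sensor might look like the following sketch. All parameter names and numeric values are assumptions for illustration; the patent does not give concrete thresholds.

```python
def body_motion_abnormal(chest_amplitudes, breath_periods,
                         min_amplitude=0.5, max_period_jitter=0.3):
    """Flag a body motion abnormality from measured chest-movement amplitudes
    and breathing periods: abnormal when the amplitude falls to or below a
    preset threshold, or when the breathing cycle is disordered (here: the
    period deviates too far from its mean, a hypothetical criterion)."""
    if min(chest_amplitudes) <= min_amplitude:
        return True  # chest movement too small
    mean = sum(breath_periods) / len(breath_periods)
    jitter = max(abs(p - mean) for p in breath_periods) / mean
    return jitter > max_period_jitter  # cycle disorder

print(body_motion_abnormal([1.2, 1.1, 1.3], [4.0, 4.1, 3.9]))  # False: regular
print(body_motion_abnormal([1.2, 0.3, 1.1], [4.0, 4.1, 3.9]))  # True: weak movement
```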
  • An occurrence about which the staff 80 should be notified, such as a nurse call from the nurse call unit 14 or a change in the state of the subject 70 recognized by the detection unit 10 (getting up, leaving the bed, falling, falling down, abnormal body movement, or the like), is called an event.
  • the detection unit 10 transmits (outputs) information on the event that has occurred and the captured image to the server 20.
  • FIG. 4 is a block diagram showing a schematic configuration of the server.
  • the server 20 includes a control unit 21, a communication unit 22, and a storage unit 23.
  • the server 20 may be provided in the same building as the living room for the subject 70, or may be provided in a remote location and connectable via a network.
  • the server 20 may be a cloud server virtually constructed by a plurality of servers arranged on a network such as the Internet.
  • the components are communicably connected to each other by a bus.
  • the storage unit 23 functions as a database, and stores various information regarding the event list, the target person 70, and the staff 80.
  • The control unit 21, in cooperation with the communication unit 22, functions as an acquisition unit that acquires event information of a predetermined event that may lead to an accident, together with image data of a predetermined period including the time at which the event occurred. The control unit 21 also functions as a creation unit that creates a report by analyzing the event occurrence status from the acquired data, and as an output unit that outputs the created report in cooperation with the communication unit 22.
  • Otherwise, the control unit 21 and the communication unit 22 have the same functions as the corresponding components of the detection unit 10, so a detailed description is omitted.
  • the event list includes information on various events including a predetermined event (for example, a falling event or a falling event) that may lead to an accident.
  • the predetermined event that may lead to the accident is also called a near-miss event, and if this situation continues, an accident may occur.
  • The server 20 determines (identifies) which subject 70 an event detected by the detection unit 10, such as a nurse call, getting up, leaving the bed, falling, or falling down, relates to.
  • The determined event type is associated with that target person 70, that is, the resident of the room, and added to the event list in the storage unit 23.
  • the subject 70 may be determined by reading the IC tag with an RFID reader provided in each room.
  • the subject 70 may be determined by arranging the detection unit 10 for each bed 60.
  • the entry of each staff 80 into the room may be detected by bringing the staff terminal 40 carried by the staff 80 close to the RFID reader.
  • the fixed terminal 30 is a so-called PC (Personal Computer), and includes a control unit including a CPU, a RAM, and the like, a communication unit, a display unit, an input unit, and a voice input / output unit.
  • the fixed terminal 30 is arranged, for example, in a nurse station.
  • A user (a staff member 80, an administrator who manages the staff 80, or the like) can have a report output from the fixed terminal 30, and a printer (not shown) outputs the report on paper.
  • the report may include a comment input from the input unit while referring to the captured image displayed on the display unit (display) of the fixed terminal 30 as described later.
  • Through the fixed terminal 30, the staff 80 or technical staff associates a room number with each detection unit 10 when installing the unit in a room (living room), and calibrates and designates the position information of fixtures such as the bed 60, that is, their outlines as seen from the ceiling-mounted camera 13.
  • the identification information such as the name and ID number of the subject 70 who is hospitalized or resident is associated with each room number.
  • FIG. 5 is a block diagram illustrating a schematic configuration of the staff terminal 40.
  • the staff terminal 40 includes a control unit 41, a wireless communication unit 42, a display unit 43, an input unit 44, and an audio input / output unit 45, which are interconnected by a bus.
  • The control unit 41 includes a CPU, a RAM, a ROM, and the like, similarly to the control unit 11 of the detection unit 10.
  • the wireless communication unit 42 enables wireless communication using a standard such as Wi-Fi or Bluetooth (registered trademark), and wirelessly communicates with each device via the access point 51 or directly.
  • the display unit 43 and the input unit 44 are touch panels, in which a touch sensor as the input unit 44 is superimposed on the display surface of the display unit 43 formed of liquid crystal or the like.
  • the display unit 43 and the input unit 44 display various operation screens displaying a list of a plurality of events included in the event list to the staff 80, and receive various operations through the operation screen.
  • the voice input / output unit 45 is, for example, a speaker and a microphone, and enables voice communication by the staff 80 with another staff terminal 40 via the wireless communication unit 42.
  • the staff terminal 40 is a device that functions as a user interface of the watching system 1, and can be configured by a portable communication terminal device such as a tablet computer, a smartphone, or a mobile phone.
  • the staff 80 performs a login authentication process through the assigned staff terminal 40 at the start of the work.
  • the staff 80 inputs a staff ID and a password through the touch panel (display unit 43, input unit 44) of the staff terminal 40.
  • the staff terminal 40 transmits this to the server 20.
  • the server 20 matches the authentication information stored in the storage unit 23 and transmits an authentication result according to the authority of the staff 80 to the staff terminal 40, thereby completing the login authentication.
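The login exchange above can be illustrated with a minimal server-side sketch. The record layout, the SHA-256 hashing choice, and all field names are assumptions for illustration, not details from the patent.

```python
import hashlib

# Hypothetical stand-in for the authentication records that the server 20
# keeps in its storage unit 23.
STAFF_DB = {
    "staff-a": {"pw_sha256": hashlib.sha256(b"secret").hexdigest(),
                "authority": "caregiver"},
}

def login(staff_id, password):
    """Match the submitted staff ID and password against the stored records
    and return an authentication result according to the staff's authority."""
    rec = STAFF_DB.get(staff_id)
    if rec is None or hashlib.sha256(password.encode()).hexdigest() != rec["pw_sha256"]:
        return {"ok": False}
    return {"ok": True, "authority": rec["authority"]}

print(login("staff-a", "secret"))  # {'ok': True, 'authority': 'caregiver'}
print(login("staff-a", "wrong"))   # {'ok': False}
```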
  • FIG. 6 is a sequence chart showing the procedure of the processing of the watching system.
  • Step S110 As illustrated in FIG. 6, the detection unit 10 detects a movement of the subject 70 in the observation area or a nurse call event by the subject 70. Then, the detection unit 10 determines whether or not another event such as a fall or a fall has occurred based on the movement of the target person 70.
  • Steps S120, S130 When it is determined that an event has occurred, the event information is transmitted to the server 20.
  • The detection unit 10 also transmits image data of the observation area captured when the event occurred.
  • For some types of events, a moving image of a predetermined length (for example, several seconds to several tens of seconds) may be transmitted; for other types, a still image taken at the time of occurrence may be transmitted.
  • Alternatively, the detection unit 10 may continuously transmit captured images to the server 20, and the server 20 may keep the images in a temporary storage unit for a predetermined time (for example, the past several hours) while overwriting the oldest data. Then, when an event occurs and its event information is received from the detection unit 10, the server 20 reads out still images or a moving image from before and after the occurrence from the temporary storage unit and stores them in association with the event.
  • Step S140 The server 20 updates the event list.
  • FIG. 7 is an example of an event list stored in the storage unit 23.
  • the events included in the event list include wake-up, wake-up, fall, fall, and nurse call as described above.
  • the latest event transmitted in step S120 is added to the end of the event list.
  • Each event includes an event ID automatically assigned as a primary key, a room number, a subject, an event type, an occurrence date and time, image data, and response status information.
  • The image data is the still image or moving image captured by the detection unit 10 when the event occurred and transmitted in step S130.
  • The response status includes a status, the responding staff member, and a response date and time. The response status will be described later.
  • the event with the event ID 010 is a fall, and the events with the event IDs 011 and 012 are nurse calls.
  • Event IDs 013 to 015 are getting-up, leaving-bed, and fall events that occurred in succession for one subject 70 (Mr. B).
  • The status of the events with IDs 010 to 012 is "handled," while the events with IDs 013 to 015 are unhandled.
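The event-list rows described above might be represented as follows. The field names are assumptions, and the subjects shown for IDs 010 to 012 are placeholders invented for illustration (FIG. 7's text only identifies Mr. B for IDs 013 to 015).

```python
# Hypothetical record layout mirroring the event-list fields: event ID as
# primary key, subject, event type, and response status.
events = [
    {"id": "010", "subject": "A", "type": "fall",        "status": "handled"},
    {"id": "011", "subject": "C", "type": "nurse call",  "status": "handled"},
    {"id": "012", "subject": "D", "type": "nurse call",  "status": "handled"},
    {"id": "013", "subject": "B", "type": "getting up",  "status": "unhandled"},
    {"id": "014", "subject": "B", "type": "leaving bed", "status": "unhandled"},
    {"id": "015", "subject": "B", "type": "fall",        "status": "unhandled"},
]

# The unhandled events of one subject are what the staff terminal groups
# together in a single display area.
unhandled = [e["id"] for e in events if e["status"] == "unhandled"]
print(unhandled)  # ['013', '014', '015']
```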
  • Step S150 When the event list is updated because a new event has been detected and added, or because the status of an event has changed, the server 20 delivers the updated event list to all logged-in staff terminals 40. The delivery may instead be limited to the staff terminals 40 used by the staff 80 in charge of the target person 70 for whom the event occurred. Each staff terminal 40 that receives the event list returns an acknowledgment command to the server 20 (not shown). The server 20 may also transmit only difference data as the event list: for example, only the portion that changed since the previously transmitted list, that is, only added or updated events, is sent to all staff terminals 40.
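The difference-data delivery can be sketched as follows; keying records on the event ID is an assumption for illustration.

```python
def event_list_diff(prev_list, curr_list):
    """Return only the events that were added or whose fields changed since
    the previously delivered event list (a sketch of the difference-data
    transmission described above)."""
    prev_by_id = {e["id"]: e for e in prev_list}
    return [e for e in curr_list
            if e["id"] not in prev_by_id or prev_by_id[e["id"]] != e]

prev = [{"id": "013", "status": "unhandled"}]
curr = [{"id": "013", "status": "handling"},   # status changed
        {"id": "016", "status": "unhandled"}]  # newly added
print([e["id"] for e in event_list_diff(prev, curr)])  # ['013', '016']
```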
  • Step S160 Each staff terminal 40 updates the display content of the display unit 43 in response to receiving the event list from the server 20.
  • FIG. 8 is an example of the operation screens 431 and 432 for inputting a response policy to an event displayed on the display unit 43 of the staff terminal 40.
  • the operation screen 431 shown in the figure is a screen on which an event list is displayed in a list. Events to be displayed can be scrolled up and down by flicking.
  • the areas a11 and a12 of the operation screen 431 correspond to the event IDs 011 and 012 of the event list in FIG. 7, respectively.
  • In each of these areas, an icon indicating that the event is a nurse call and an indication that the response has been completed are displayed.
  • the area a13 corresponds to the event IDs 013 to 015 surrounded by a broken line frame in FIG. Since these are related to the same subject 70 (Mr. B), they are displayed together in the same area.
  • Because these events are unhandled, a thumbnail image is displayed in the area a13 so that the area can be easily distinguished from areas for handled events. The thumbnail image is created from a still image cut out of the moving image i015 (see FIG. 7) of the latest event (ID 015).
  • a text message can be exchanged with another staff member 80 who is logging in.
  • the control unit 41 makes a transition to the operation screen 432 for content confirmation by receiving a click operation on the area a13 of the staff member 80 on the operation screen 431.
  • Step S170 The staff 80 checks the contents of the event on the operation screen 432 displayed on the staff terminal 40.
  • The ID (name: Mr. B) of the target person 70 for whom the event was determined and the ID of the staff member 80 using the staff terminal 40 (name: Staff A) are displayed.
  • the elapsed time from the event occurrence time is displayed.
  • For the target person 70 (Mr. B), getting-up, leaving-bed, and fall events have been determined in succession, and all of them are unhandled.
  • Icons representing the events other than the latest one, namely "getting up" and "leaving the bed," are shown.
  • The elapsed time ("elapsed 0 minutes") shown in the area a22 is the time elapsed since the occurrence of the latest event was determined, with fractions of a minute rounded down.
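The rounding-down of the displayed elapsed time amounts to integer division by 60; a one-line sketch (timestamps in seconds are an assumption):

```python
def elapsed_minutes(occurred_at, now):
    """Whole minutes since the latest event was determined, fractions of a
    minute rounded down, as shown in the 'elapsed N minutes' display."""
    return int((now - occurred_at) // 60)

print(elapsed_minutes(occurred_at=0, now=59))   # 0 -> shown as "elapsed 0 minutes"
print(elapsed_minutes(occurred_at=0, now=125))  # 2
```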
  • a character indicating “fall” and an icon are shown in the area a24.
  • a thumbnail image of the captured image at the time of the fall determination is displayed.
  • When the thumbnail image alone is insufficient and the staff member 80 wants to confirm the situation of the subject 70 who has fallen, the staff member operates the "talk" button or the "view" button below it.
  • By operating the "talk" button, a call can be made with the target person 70 (Mr. B) through the voice input/output unit 15.
  • By operating the "view" button, live video captured by the camera 13 can be viewed by streaming playback.
  • The staff member 80 decides to take charge of the displayed fall event (ID 015) by operating the "corresponding" button in the area a25; otherwise, the staff member returns to the event list screen by operating the return button (triangle icon) in the area a26. While one staff member 80 is displaying the operation screen 432 shown in FIG. 8 and checking the state of an event such as a fall, the server 20 performs an exclusion process against the other staff members 80. For example, if another staff member 80 selects the same events (IDs 013 to 015) while the first is checking the status of the subject 70 (Mr. B), the characters "Checking status" are displayed on that staff member's operation screen 431, and the "corresponding" button is hidden or cannot be selected.
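The server-side exclusion process might be sketched as a simple lock table; the dict-based table and the event-group key are assumptions for illustration.

```python
# Hypothetical lock table: event group key -> staff ID currently checking.
checking_locks = {}

def try_begin_checking(group_key, staff_id):
    """Grant the check to the first staff member; others are refused while
    the lock is held (their screens would show 'Checking status')."""
    holder = checking_locks.get(group_key)
    if holder is not None and holder != staff_id:
        return False  # another staff member is already checking this group
    checking_locks[group_key] = staff_id
    return True

print(try_begin_checking("013-015", "staff A"))  # True: staff A starts checking
print(try_begin_checking("013-015", "staff B"))  # False: staff B is excluded
```

A release step (deleting the key when the first staff member leaves the screen) would complete the mechanism.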
  • Steps S180, S190 The staff terminal 40 transmits a response request for the selected event to the server 20 in response to the operation of the “correspondence” button, and in response, the server 20 returns an approval notification.
  • Step S200 The server 20 updates the event list in accordance with the processing of step S180. Specifically, the status of the events in the event list (IDs 013 to 015: fall, etc.) is changed to "handling." In the present embodiment, the status is changed in response to the operation of the "corresponding" button for the events selected by the staff member 80 and displayed on the operation screen 432. In other words, operating the "corresponding" button inputs a "handling" response status, which is information indicating that the event has been confirmed, and the response date and time is recorded on the assumption that a response has been made. More specifically, in the event list of FIG. 7, the time at which the processes of steps S180 and S190 were performed is recorded in the response date and time column.
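The step-S200 update can be sketched as follows; record field names are assumptions for illustration.

```python
from datetime import datetime

def mark_handling(events, ids, staff_id, now=None):
    """Set the status of the selected events to 'handling' and record the
    responding staff member and the response date and time."""
    now = now or datetime.now()
    for e in events:
        if e["id"] in ids:
            e["status"] = "handling"
            e["staff"] = staff_id
            e["responded_at"] = now
    return events

events = [{"id": "013", "status": "unhandled"},
          {"id": "015", "status": "unhandled"}]
mark_handling(events, {"013", "015"}, "staff A", now="2019-04-01 10:00")
print([e["status"] for e in events])  # ['handling', 'handling']
```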
  • Alternatively, the response date and time may be recorded in the response date and time column when the staff member 80 enters the room of the subject 70 for whom an event such as a fall, falling down, or nurse call occurred, or the response may be completed by the staff member entering the result of the response on an operation screen of the staff terminal 40, with the input time recorded in the response date and time column.
  • the server 20 transmits the updated event list to all the staff members 80 who have logged in.
  • the determination of a predetermined event that may lead to an accident for which the detection unit 10 has determined occurrence, that is, a fall or fall event includes an erroneous determination. If the criterion for determination is strict, there is a possibility that the determination may be missed when a fall or a fall actually occurs. On the other hand, if the criterion for determination is too loose, erroneous determination such as determining the occurrence of a fall or fall event is likely to occur when no fall or fall has actually occurred.
  • The criterion is set so as not to be strict, to minimize missed determinations, and therefore erroneous determinations are included to some extent.
  • The staff 80 determine whether a determination was erroneous by checking the situation with the staff terminal 40. Specifically, in the case of an erroneous determination, the staff 80 can confirm, through a call or a captured image, that the subject 70 is safe and that it is not necessary to actually go to the room. Even when a fall actually occurs, there are cases where no injury results and no special action is required; in such cases as well, by confirming through a call with the subject 70 that he or she is safe, the staff need not go to the living room.
  • When the detection unit 10 determines that an event such as a fall or a fall from the bed has occurred, the event is counted in the number of occurrences even if no fall actually occurred.
  • Therefore, the number of occurrences can be said to be the number of suspected occurrences of falls and falls from the bed.
  • In this way, event information including the response status of each event is accumulated in the event list.
  • FIG. 9 is a flowchart illustrating the procedure of the report output process performed by the watching system 1 according to the first embodiment. The process of FIG. 9 is mainly performed by the control unit 21 of the server 20 executing a program stored in the storage unit 23.
  • Step S310 The control unit 21 of the server 20 accumulates (adds) the event information of each event whose occurrence has been determined into the event list of the storage unit 23. In addition, the image data (still image or moving image) captured at the time the event occurred is stored in association with the event. This accumulation is performed by the above-described process (FIG. 6).
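Step S310 (accumulating the event and associating the captured image data with it) could look like the following sketch. The storage layout and the names `event_list`, `image_store`, and `accumulate_event` are hypothetical; in the specification, the data lives in the storage unit 23.

```python
# Plain in-memory containers stand in for the storage unit 23 here.
event_list = []    # accumulated event information
image_store = {}   # event id -> image data captured around the occurrence

def accumulate_event(event_id, subject, event_type, occurred_at, image_clip):
    """Append the event whose occurrence was determined, and keep the
    still image or moving image of the occurrence associated with it."""
    event_list.append({
        "id": event_id,
        "subject": subject,
        "type": event_type,
        "occurred_at": occurred_at,
        "status": "unconfirmed",
    })
    # the clip can later be retrieved for the report (steps S320, S350)
    image_store[event_id] = image_clip
```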
  • Step S320 The control unit 21 acquires, from the storage unit 23, the event information of falls and falls from the bed that may lead to an accident. This is triggered, for example, by a request from a user (the staff 80, an administrator who manages the staff 80, etc.) through the fixed terminal 30. The user selects a specific subject 70 through the fixed terminal 30, and the control unit 21 acquires the fall and fall-from-bed event information for the designated subject 70.
  • A bed-leaving event may also be included as an event leading to an accident. For example, a change in the arrangement of furniture in the living room, such as the bed 60 and the television, may change the posture of the subject 70 in the bed 60 at bedtime. As a result, a part of the body such as a leg may frequently protrude from the bed 60, and a large number of bed-leaving events may be determined. In such a case, leaving the bed can cause a fall from the bed, so it indirectly leads to an accident.
  • FIG. 10 is a diagram illustrating an example in which the analysis result of falls and falls from the bed for a certain subject 70 (for example, Mr. B) is visualized.
  • This figure is a table showing, for each hour, the transition of the total number of fall and fall-from-bed events (the number of occurrences) over one day, from 6:00 to 5:00 of the following day (expressed as 29:00).
  • Time slots in which one or more fall or fall-from-bed events occurred are shown in gray.
  • Step S340 The control unit 21 determines whether the number of occurrences of fall or fall-from-bed events per unit period has increased. This determination may be made on an hourly basis or on a daily basis. When the determination is made on a daily basis, it can be determined that there is an increase when the number of occurrences on a certain day is equal to or more than a predetermined multiple of the average number of occurrences over the preceding day or days. In the example shown in FIG. 10, the number of occurrences on October 21 (17) is equal to or more than the predetermined multiple of 6.5, the average number of occurrences over the previous two days (the 19th and 20th), so it can be determined that the number of occurrences increased on October 21. If there is an increase, the process proceeds to step S350. On the other hand, if there is no increase, the process proceeds to step S360.
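The daily-basis determination of step S340 amounts to comparing one day's count against a predetermined multiple of the average over the preceding days. A minimal sketch follows; the threshold multiple of 2.0 and the per-day counts in the example are assumed values for illustration.

```python
def occurrences_increased(today_count, previous_counts, multiple=2.0):
    """Return True when today's number of fall events is equal to or more
    than a predetermined multiple of the average over the preceding days."""
    if not previous_counts:
        return False          # nothing to compare against
    average = sum(previous_counts) / len(previous_counts)
    return today_count >= multiple * average

# FIG. 10 example: 17 occurrences on October 21 against an average of 6.5
# over the 19th and 20th (6 and 7 are assumed daily counts giving that average)
print(occurrences_increased(17, [6, 7]))   # 17 >= 2.0 * 6.5 → True
```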
  • Step S350 The control unit 21 acquires the image data associated with events before and after the time of the increase, and displays images based on the image data on the display of the fixed terminal 30 used by the user (the staff 80, a manager, etc.).
  • The displayed image is a moving image (for example, several seconds to several tens of seconds long), but a still image cut out from the moving image may be used instead.
  • the information of the subject 70 stored in the storage unit 23 may be displayed on the display.
  • The information on the subject 70 includes, for example, the walking state information of the subject 70 (independent, wheelchair, or walker).
  • Furthermore, the control unit 21 may analyze the images to recognize furniture and the like (bed, walker, cane) in the living room and compare their arrangement between the two images before and after the increase. If there is a change, an annotation image (an enclosing frame, a note, etc.) indicating the change may be added to the displayed images.
  • Step S360 The control unit 21 receives a comment to be added to the report. For example, a comment from a user is received through the fixed terminal 30.
  • Here, the user compares the two images, displayed on the display of the fixed terminal 30, of the periods before and after the time of the increase, and inputs the judgment obtained from the comparison as a comment. For example, if the comparison shows that the arrangement of furniture or the like in the living room was changed in the image after the time when the number of fall and fall-from-bed events increased, it can be estimated that the number of occurrences increased because of that change in arrangement.
  • Here, the furniture and the like include not only fixed furniture such as a bed, a television, and a chest, but also walking aids such as a walker, a wheelchair, and a cane.
  • For example, the number of occurrences of the event may increase simply because the place where a cane is kept was changed from its previous position.
  • The walking state information (independent, wheelchair, or walker) may also be compared with the actual walking state seen in the captured images. The actual walking state is determined from an image captured when the subject 70 fell while alone in the living room. If the actual walking state differs from the walking state information, and it is found that the subject 70 is walking in a dangerous manner that should be avoided, this can be cited as a reason for the increase in the number of fall events.
  • As an example of such dangerous walking, there is a case where a subject 70 who normally uses a walker walks alone in the living room along the wall without the walker.
  • Step S370 The control unit 21 creates a report based on the analysis result. If a comment is received in step S360, the comment is added to the report.
  • Step S380 The control unit 21 outputs the report by displaying it on the display of the fixed terminal 30, printing it on paper with a printer, or transmitting the report data to a destination address, according to the request.
  • FIG. 11 is a diagram illustrating an example of a report in which the visualized analysis result is output.
  • FIG. 11 reflects a result of the analysis shown in FIG. 10 and shows a report on a predetermined event that may lead to an accident regarding the target person 70 (Mr. B) selected by the user.
  • the table in the upper part of FIG. 11 corresponds to the diagram shown in FIG.
  • the two middle images are captured images of the living room before and after the time when the increase occurred.
  • the comment at the bottom is the one received in step S360.
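The report of FIG. 11 gathers three pieces: the hourly-count table, the two room images before and after the increase, and the user comment. A hypothetical sketch of assembling such a report structure (all field names are illustrative, not from the specification):

```python
def build_report(subject, hourly_counts, image_before, image_after, comment):
    """Combine the analysis result into one report, mirroring FIG. 11:
    table at the top, images in the middle, comment at the bottom."""
    return {
        "subject": subject,                  # e.g. "Mr. B"
        "hourly_counts": hourly_counts,      # upper table (cf. FIG. 10)
        "images": {
            "before_increase": image_before, # middle images of the living room
            "after_increase": image_after,
        },
        "comment": comment,                  # comment received in step S360
    }
```

Such a structure could then be rendered for display on the fixed terminal 30, printed, or serialized and transmitted, matching the three output forms of step S380.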
  • As described above, in the present embodiment, event information of a predetermined event that may lead to an accident and the image data associated with the event information are acquired, and the event occurrence status is analyzed using information on the occurrence status, such as the number of occurrences and the occurrence times of the event, together with the image data. A report in which the event occurrence status is visualized based on the analysis result is then created and output.
  • In particular, the fall and fall-from-bed events detected by the detection unit 10 are analyzed, so that circumstances of the resident (subject 70) that the facility staff 80 had not been able to grasp can be brought to light in the report.
  • FIG. 12 is a flowchart illustrating a report output process according to the modification.
  • Steps S410 to S440 The control unit 21 performs the same processing as in steps S310 to S340 shown in FIG. 9. If there is an increase in the number of events that have occurred, the process proceeds to step S465.
  • Step S465 The control unit 21 creates a comment that calls attention to the increase.
  • This comment uses a fixed phrase; for example, a comment is created asking the user to confirm whether a change that may have led to the increase in falls, such as a change in the arrangement of furniture in the living room, was made in the period before and after the increase.
  • The comment column of the report in FIG. 11 shows a comment that calls attention to the increase assumed in such a case.
  • Some or all of the comments may be stored in the storage unit 23 in advance.
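Step S465's fixed-phrase comment can be produced by filling in a template stored in advance. A minimal sketch; the template wording below is an assumption for illustration only.

```python
# A fixed phrase of the kind that may be stored in the storage unit 23 in
# advance; the exact wording here is hypothetical.
FIXED_PHRASE = (
    "The number of {event} events for {subject} increased on {date}. "
    "Please check whether the arrangement of furniture or walking aids in "
    "the living room was changed before and after this date."
)

def make_alert_comment(subject, event, date):
    """Create an attention-calling comment for the report from the template."""
    return FIXED_PHRASE.format(subject=subject, event=event, date=date)
```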
  • the detection unit 10, the server 20, the fixed terminal 30, and the staff terminal 40 have been described as independent devices.
  • the present invention is not limited to this, and some configurations may be integrated.
  • the functions of the server 20 may be integrated in the fixed terminal 30.
  • the processing in the watching system 1 may include steps other than the steps of the above-described sequence chart or flowchart, or may not include some of the above-described steps. Further, the order of the steps is not limited to the above-described embodiment. Further, each step may be executed as one step in combination with another step, may be executed by being included in another step, or may be executed by being divided into a plurality of steps.
  • the means and method for performing various processes in the watching system 1 can be realized by either a dedicated hardware circuit or a programmed computer.
  • the control program may be provided by a computer-readable recording medium such as a USB memory or a DVD (Digital Versatile Disc) -ROM, or may be provided online via a network such as the Internet.
  • the program recorded on the computer-readable recording medium is usually transferred and stored in a storage unit such as a hard disk.
  • the program may be provided as independent application software, or may be incorporated as one function into software of a device such as a detection unit.

Abstract

The objective of the present invention is to output a report which is useful for preventing the occurrence of events, and with which an event occurrence status analysis result can be visualized. Event information relating to a predetermined event that may lead to an accident, and image data related to the event information are acquired, information relating to an event occurrence status, or information relating to an event occurrence status and image data, with regard to a subject 70, are used to analyze the event occurrence status, and a report visualizing the analysis results is created and output.

Description

Control program, report output method, and report output device
 The present invention relates to a control program, a report output method, and a report output device, and more particularly to a control program, a report output method, and a report output device for outputting a report on a subject to whom staff respond.
 In Japan, life expectancy has increased markedly owing to the improvement of living standards, sanitary conditions, and medical care that accompanied the postwar period of high economic growth. Combined with a declining birth rate, this has produced an aging society with a high proportion of elderly people. In such an aging society, an increase in the number of care recipients who need nursing care because of illness, injury, aging, and the like is expected.
 A care recipient may fall while walking or fall from a bed and be injured in a facility such as a hospital or a welfare facility for the elderly. Therefore, systems for detecting the condition of care recipients are being developed so that nurses and caregivers can rush to them immediately when such a condition occurs.
 In the nurse call system disclosed in Patent Document 1 as such a system, whether the patient has risen from or left the bed is determined from the images captured by a camera installed for the patient's bed. When the patient rises from or leaves the bed, the image captured by the camera is displayed on a nurse call master unit provided in the nurse station.
 Regarding care work, in order to prevent near-miss incidents from recurring because the countermeasures against accidents or the near misses leading to them were wrong, Patent Document 2 discloses a medical care support device that determines whether a near-miss incident requiring countermeasures has occurred. This medical care support device calculates the degree of similarity (correction notification hit rate) between a case record input by a caregiver and past near-miss incidents recorded in a database, and when the correction notification hit rate is equal to or higher than a predetermined value, identifies the countermeasure information as requiring improvement.
Patent Document 1: JP 2014-90913 A
Patent Document 2: JP 2013-191184 A
 However, the nurse call system disclosed in Patent Document 1 can detect a patient rising from or leaving the bed, but cannot prevent accidents or near misses in advance. Further, the medical care support device disclosed in Patent Document 2 identifies that past countermeasures against near misses were wrong by comparing a near miss that has occurred with past cases; it does not prevent the occurrence of accidents or the near misses that lead to them.
 The present invention has been made in view of the above circumstances, and an object thereof is to provide a control program, a report output method, and a report output device capable of outputting a report useful for preventing the occurrence of events by analyzing events that may lead to an accident.
 The above object of the present invention is achieved by the following means.
 (1) A control program for causing a computer to execute processing including:
 a step (a) of acquiring accumulated event information of a predetermined event that may lead to an accident, and image data associated with the event, the event information recording the predetermined event, relating to a subject in an observation area, whose occurrence has been determined based on the output of a detection unit that detects movement of the subject in the observation area, and the image data being obtained by photographing the observation area during a predetermined period including the time when the event occurred;
 a step (b) of creating, for the subject, a report in which an analysis result is visualized, by analyzing the occurrence status of the event using information on the occurrence status of the event, or using the information on the occurrence status of the event and the image data; and
 a step (c) of outputting the report created in the step (b).
 (2) The control program according to (1), wherein the information on the occurrence status of the event includes the number of occurrences of the event and the occurrence time.
 (3) The control program according to (1) or (2), wherein the predetermined event includes a fall event in which it has been determined that the subject has fallen down, or a falling event in which it has been determined that the subject has fallen.
 (4) The control program according to any one of (1) to (3), wherein a staff member who responds to the subject carries a mobile terminal, and the event information includes information indicating that the staff member has confirmed the event, input by the staff member through the mobile terminal in response to the occurrence of the event.
 (5) The control program according to any one of (1) to (4), wherein the step (b) includes:
 a step (b1) of displaying, for the subject, the image data of periods before and after a time when the number of occurrences of the event per unit period increased; and
 a step (b2) of receiving input of a comment from a user,
 and wherein, in the step (c), the report including the comment received in the step (b2) is created.
 (6) The control program according to any one of (1) to (4), further including a step (d) of determining, based on the event information acquired in the step (a), whether the number of occurrences of the event per unit period has increased for the subject, and, when an increase is determined, creating a comment calling attention to the event of the subject,
 wherein, in the step (b), the report including the comment created in the step (d) is created.
 (7) The control program according to any one of (1) to (6), wherein the report created in the step (b) includes, for the subject, the image data of periods before and after a time when the number of occurrences of the event per unit period increased.
 (8) A report output method including:
 a step (a) of acquiring accumulated event information of a predetermined event that may lead to an accident, and image data associated with the event, the event information recording the predetermined event, relating to a subject in an observation area, whose occurrence has been determined based on the output of a detection unit that detects movement of the subject in the observation area, and the image data being obtained by photographing the observation area during a predetermined period including the time when the event occurred;
 a step (b) of creating, for the subject, a report in which an analysis result is visualized, by analyzing the occurrence status of the event using information on the occurrence status of the event, or using the information on the occurrence status of the event and the image data; and
 a step (c) of outputting the report created in the step (b).
 (9) A report output device including:
 an acquisition unit that acquires accumulated event information of a predetermined event that may lead to an accident, and image data associated with the event, the event information recording the predetermined event, relating to a subject in an observation area, whose occurrence has been determined based on the output of a detection unit that detects movement of the subject in the observation area, and the image data being obtained by photographing the observation area during a predetermined period including the time when the event occurred;
 a creation unit that creates, for the subject, a report in which an analysis result is visualized, by analyzing the occurrence status of the event using information on the occurrence status of the event, or using the information on the occurrence status of the event and the image data; and
 an output unit that outputs the report created by the creation unit.
 According to the present invention, event information of a predetermined event that may lead to an accident and image data associated with the event information are acquired, and, for the subject, the occurrence status of the event is analyzed using information on the occurrence status of the event, or using the information on the occurrence status of the event and the image data, whereby a report in which the analysis result is visualized is created and output. This makes it possible to output a report useful for preventing the occurrence of events that may lead to an accident.
FIG. 1 is a diagram showing the overall configuration of the watching system.
FIG. 2 is a diagram showing an example of the detection unit installed in the room of a subject.
FIG. 3 is a block diagram showing a schematic configuration of the detection unit.
FIG. 4 is a block diagram showing a schematic configuration of the server.
FIG. 5 is a block diagram showing a schematic configuration of the staff terminal.
FIG. 6 is a sequence chart showing the processing procedure of the watching system.
FIG. 7 is an example of the event list stored in the storage unit.
FIG. 8 is an example of an operation screen, displayed on the staff terminal, for confirming an event.
FIG. 9 is a flowchart showing the procedure of the report output process in the first embodiment.
FIG. 10 is a diagram showing an example in which the analysis result of falls for a certain subject is visualized.
FIG. 11 is a diagram showing an example of a report in which the visualized analysis result is output.
FIG. 12 is a flowchart showing the procedure of the report output process in a modified example.
 Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. In the description of the drawings, the same reference numerals are given to the same elements, and redundant description is omitted. In addition, dimensional ratios in the drawings are exaggerated for convenience of description and may differ from actual ratios.
 (Overall configuration)
 FIG. 1 is a diagram showing the overall configuration of the watching system according to the present embodiment, and FIG. 2 is a diagram showing an example of the detection unit installed around the bed in the room of a subject.
 As shown in FIG. 1, the watching system 1 includes a plurality of detection units 10, a server 20, a fixed terminal 30, and one or more staff terminals 40. The server 20 functions as a report output device. These are communicably connected to one another, by wire or wirelessly, via a network 50 such as a LAN (Local Area Network), a telephone network, or a data communication network. The network 50 may include relay devices that relay communication signals, such as repeaters, bridges, routers, or cross-connects. In the example shown in FIG. 1, the staff terminals 40, the detection units 10, the server 20, and the fixed terminal 30 are communicably connected to one another by a network 50 such as a wireless LAN including an access point 51 (for example, a LAN conforming to the IEEE 802.11 standard).
 The watching system 1 is installed at a place appropriate for the subjects 70. A subject 70 (also referred to as a watching target or care target) is, for example, a patient who needs nursing because of illness or injury, a care recipient who needs care because of a decline in physical ability due to aging, or a person living alone. In particular, from the viewpoint of enabling early detection and early handling, the subject 70 may be a person who needs a predetermined inconvenient event, such as an abnormal state, to be detected when it occurs to that person. For this reason, the watching system 1 is suitably installed in a building such as a hospital, a welfare facility for the elderly, or a dwelling unit, depending on the type of subject 70. In the example shown in FIG. 1, the watching system 1 is installed in a facility that includes a plurality of rooms (living rooms) occupied by a plurality of subjects 70 and a plurality of other rooms including a nurse station.
 The detection units 10 are installed in the respective living rooms, which are the observation areas of the subjects 70. In the example shown in FIG. 1, four detection units 10 are installed in the rooms of the subjects 70, Mr. A, Mr. B, Mr. C, and Mr. D. A bed 60 is included in the observation area of each detection unit 10. The staff 80 (also referred to as care staff) who provide nursing or care for the subjects 70 each carry a staff terminal 40, which is a mobile terminal. However, the positions, numbers, and the like of the components of the watching system 1 are not limited to the example shown in FIG. 1. For example, the server 20 need not be located at the nurse station and may be an external server unit connected to the network 50. The fixed terminal 30 may also be omitted, with the server 20 or the staff terminal 40 performing its functions.
 (Detection unit 10)
 FIG. 3 is a block diagram showing a schematic configuration of the detection unit. As shown in the figure, the detection unit 10 includes a control unit 11, a communication unit 12, a camera 13, a nurse call unit 14, and a voice input/output unit 15, which are interconnected by a bus.
 The control unit 11 includes a CPU (Central Processing Unit) and memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and performs control and arithmetic processing of each part of the detection unit 10 according to a program. The control unit 11 may further include an HDD (Hard Disk Drive) as a memory.
 The communication unit 12 is an interface circuit (for example, a LAN card) for communicating with other devices, such as the server 20, the fixed terminal 30, or the staff terminal 40, via the network 50.
 The camera 13 is installed, for example, on the ceiling of the living room or on an upper part of a wall, photographs the area directly below it, including the bed 60 of the subject 70, as the observation area, and outputs the captured images (image data). The captured images include still images and moving images. The camera 13 is a near-infrared camera, but a visible-light camera may be used instead, or the two may be used in combination.
 The control unit 11 determines (recognizes) the occurrence of predetermined actions of the subject 70 from the images captured by the camera 13. The predetermined actions to be determined include "rising" (getting up from the bed 60), "leaving the bed" (moving away from the bed 60), "falling from the bed" (falling off the bed 60), and "falling down" (falling onto the floor or the like).
The control unit 11 detects a person's silhouette (hereinafter referred to as a "human silhouette") from a plurality of captured images (a moving image). The human silhouette can be detected, for example, by a time-difference method that extracts the range of pixels whose difference between temporally adjacent images is relatively large. The human silhouette may instead be detected by a background-difference method that extracts the difference between a captured image and a background image. Whether the action is getting up, leaving the bed, falling from the bed, or falling down is recognized from the posture of the subject 70 derived from the detected human silhouette (for example, standing, sitting, or lying down) and from the position of the silhouette relative to objects installed in the room, such as the bed 60. This recognition may be performed by a program executed by the CPU of the control unit 11, or by a built-in processing circuit. In the following description, the control unit 11 is assumed to determine the type of event and whether it has occurred. However, the configuration is not limited to this: the server 20 may perform all or most of the recognition processing, and the control unit 11 may merely transmit the captured images to the server 20. When the control unit 11 recognizes any of these types of action, it transmits a notification that the action (an event, described later) has occurred to the server 20 or the like.
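For illustration, the two silhouette-detection approaches named above (time-difference and background-difference) could be sketched as follows. This is a minimal sketch, not the embodiment's implementation: the list-of-lists image representation and the threshold value of 30 are assumptions for the example.

```python
def silhouette_background_difference(frame, background, threshold=30):
    """Background-difference method: mark pixels that differ markedly
    from a static background image as part of the human silhouette.

    frame, background: 2-D lists of grayscale pixel values (0-255).
    Returns a 2-D list of booleans (the silhouette mask).
    """
    return [[abs(p - b) > threshold for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def silhouette_time_difference(prev_frame, curr_frame, threshold=30):
    """Time-difference method: mark pixels whose difference between two
    temporally adjacent frames is relatively large (same computation,
    with the previous frame in place of the background image)."""
    return silhouette_background_difference(curr_frame, prev_frame, threshold)
```

The posture and relative-position recognition described above would then operate on the resulting mask (for example, on its bounding box and its overlap with the known outline of the bed 60).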
The staff 80 are persons who provide various kinds of assistance to the subject 70 according to their duties, which may include medical care and nursing care. Here, assuming the duties of the staff 80 are nursing care for the subject 70, the response to each event is described. When "getting up" is determined as an event and the determination falls within a predetermined period (the wake-up time set by the facility, for example 7 to 8 a.m.), morning care is performed. Morning care includes face washing, tooth-brushing assistance, denture fitting, and dressing assistance. For a "leaving the bed" event, wheelchair transfer or walking assistance may be required. Scheduled (fixed-time) events other than those determined by the detection unit 10 include hydration, meal assistance, excretion assistance, wheelchair transfer, walking assistance, and repositioning (pressure-ulcer prevention). For these scheduled events, the nurse call unit 14 or the like may issue an alert at the scheduled time.
The nurse call unit 14 includes a push-button switch and detects a nurse call (also called a care call) when the switch is pressed by the subject 70. Instead of a push-button switch, the nurse call may be detected through a voice microphone. When the switch of the nurse call unit 14 is pressed, that is, when a nurse call is detected, the control unit 11 transmits a notification that a nurse call has been made (a nurse call notification) to the server 20 or the like via the communication unit 12 and the network 50.
The voice input/output unit 15 is, for example, a speaker and a microphone, and enables voice calls by transmitting and receiving voice signals to and from the staff terminal 40 or the like via the communication unit 12. The voice input/output unit 15 may also be connected to the detection unit 10 via the communication unit 12 as an external device of the detection unit 10.
The detection unit 10 may further include a Doppler-shift body-motion sensor that transmits and receives microwaves toward the bed 60 and detects the Doppler shift of the microwaves caused by body movement (for example, respiratory movement) of the subject 70. This body-motion sensor detects the chest movement (vertical motion of the chest) accompanying the breathing of the subject 70; when it detects an irregularity in the cycle of the chest movement, or a chest-movement amplitude at or below a preset threshold, it recognizes a micro-body-motion abnormality.
In the present embodiment, a nurse call from the nurse call unit 14, and changes of state of the subject 70 recognized by the detection unit 10 of which the staff 80 should be notified (alerted), such as getting up, leaving the bed, falling from the bed, falling down, and micro-body-motion abnormality, are referred to as events. The detection unit 10 transmits (outputs) information on an event that has occurred, together with the captured images, to the server 20.
(Server 20)
FIG. 4 is a block diagram showing the schematic configuration of the server. The server 20 includes a control unit 21, a communication unit 22, and a storage unit 23. The server 20 may be installed in the same building as the rooms of the subjects 70, or may be installed at a remote location and connected via a network. For example, the server 20 may be a cloud server virtually constructed from a plurality of servers arranged on a network such as the Internet. The components are communicably connected to each other by a bus. The storage unit 23 functions as a database and stores the event list and various information about the subjects 70 and the staff 80. The control unit 21, in cooperation with the communication unit 22, functions as an acquisition unit that acquires event information on predetermined events that may lead to an accident, together with image data, associated with each event, covering a predetermined period that includes the time the event occurred. The control unit 21 also functions as a creation unit that creates a report by analyzing the event occurrence status from the acquired data, and, in cooperation with the communication unit 22, as an output unit that outputs the created report. The remaining functions of the control unit 21 and the communication unit 22 are the same as those of the corresponding components of the detection unit 10, so a detailed description is omitted. As described later, the event list contains information on various events, including predetermined events that may lead to an accident (for example, falling from the bed and falling down). Such a predetermined event is also called a near miss: if the situation continues unchanged, it may result in an accident.
The server 20 determines (identifies) which subject 70 an event detected by the detection unit 10, such as a nurse call, getting up, leaving the bed, falling from the bed, or falling down, relates to. This determination identifies the subject 70 (that is, the occupant of the room) associated with the room number at which the detection unit 10 that detected the event is installed. The determined event type is then associated with the subject 70 and added to the event list in the storage unit 23. In the present embodiment, if the subject 70 carries an IC tag, the subject 70 may instead be identified by reading the IC tag with an RFID reader provided in each room. When a plurality of subjects 70 share one room, such as a shared room, the subject 70 may be identified by arranging a detection unit 10 for each bed 60. Relatedly, the entry of each staff member 80 into a room may be detected by bringing the staff terminal 40 carried by the staff member 80 close to the RFID reader.
(Fixed terminal 30)
The fixed terminal 30 is a so-called PC (Personal Computer) and includes a control unit composed of a CPU, RAM, and the like, a communication unit, a display unit, an input unit, and a voice input/output unit. The fixed terminal 30 is installed, for example, in a nurse station. When outputting a report, described later, in response to an instruction entered by a user (a staff member 80, an administrator who manages the staff 80, or the like), the fixed terminal 30 displays on its display unit a report in which the analysis results are visualized as charts and the like, or prints it on paper from an external printer (not shown). As described later, the report may also include comments that the user enters from the input unit while referring to captured images shown on the display unit (display) of the fixed terminal 30.
Through the fixed terminal 30, the staff 80 or technical staff, when installing a detection unit 10 in each room, associate the room number with the detection unit 10, and calibrate and designate the position information of objects installed in the room, such as the bed 60, that is, their outline information as seen from above by the ceiling-mounted camera 13. They also associate identification information, such as the name and ID number of each hospitalized or resident subject 70, with the corresponding room number.
(Staff terminal 40)
FIG. 5 is a block diagram showing the schematic configuration of the staff terminal 40. The staff terminal 40 includes a control unit 41, a wireless communication unit 42, a display unit 43, an input unit 44, and a voice input/output unit 45, which are interconnected by a bus. The control unit 41 has the same configuration as the control unit 11 of the detection unit 10, including a CPU, RAM, ROM, and the like. The wireless communication unit 42 enables wireless communication using a standard such as Wi-Fi or Bluetooth (registered trademark), communicating with each device via the access point 51 or directly. The display unit 43 and the input unit 44 form a touch panel in which a touch sensor serving as the input unit 44 is superimposed on the display surface of the display unit 43, which is composed of a liquid-crystal panel or the like. The display unit 43 and the input unit 44 present to the staff 80 various operation screens, including a screen listing the events contained in the event list, and accept various operations through those screens. The voice input/output unit 45 is, for example, a speaker and a microphone, and enables voice calls by the staff 80 with other staff terminals 40 via the wireless communication unit 42. The staff terminal 40 is a device that functions as the user interface of the watching system 1 and can be implemented by a portable communication terminal such as a tablet computer, smartphone, or mobile phone.
At the start of work, a staff member 80 performs login authentication through the assigned staff terminal 40. The staff member 80 enters a staff ID and password through the touch panel (display unit 43 and input unit 44) of the staff terminal 40, and the staff terminal 40 transmits them to the server 20. The server 20 checks them against the authentication information stored in the storage unit 23 and transmits an authentication result according to the authority of the staff member 80 to the staff terminal 40, completing the login authentication.
(Processing procedure of the watching system)
Next, the process of accumulating event information and image data, performed by the watching system 1 as a whole, is described with reference to FIG. 6. FIG. 6 is a sequence chart showing the processing procedure of the watching system.
(Step S110)
As shown in FIG. 6, the detection unit 10 detects movement of the subject 70 in the observation area, or a nurse call event triggered by the subject 70. From the movement of the subject 70, the detection unit 10 then determines whether another event, such as falling from the bed or falling down, has occurred.
(Steps S120, S130)
When the detection unit 10 determines that an event has occurred, it transmits the event information to the server 20. The detection unit 10 also transmits the image data obtained by photographing the observation area when the event occurred. Here, if the event is of a specific type that may lead to an accident, such as falling from the bed or falling down, a moving image covering a predetermined period before and after the time of occurrence (for example, several seconds to several tens of seconds) may be transmitted; for other types of events, a still image at the time of occurrence may be transmitted. As another example, the detection unit 10 may continuously transmit the captured images to the server 20, and the server 20 may keep only a predetermined period's worth (for example, the past several hours) in a temporary storage unit, overwriting older data. Then, when an event occurs and the server 20 receives the event information from the detection unit 10, the server 20 may read still images or a moving image from before and after the time of occurrence out of the temporary storage unit and save them in association with the event.
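The alternative just described, in which the server keeps only a recent window of frames and cuts out a clip around the time of occurrence when an event notification arrives, could be sketched as a ring buffer. The class and parameter names, and the clip window of ±10 time units, are assumptions for this sketch.

```python
from collections import deque

class FrameBuffer:
    """Keep only the most recent frames; the oldest are overwritten."""

    def __init__(self, max_frames):
        # deque with maxlen discards the oldest entry on overflow,
        # matching the "store while overwriting" behavior in the text
        self.frames = deque(maxlen=max_frames)  # (timestamp, frame) pairs

    def add(self, timestamp, frame):
        self.frames.append((timestamp, frame))

    def clip_around(self, event_time, before=10, after=10):
        """Return the frames recorded within [event_time - before,
        event_time + after], i.e. the clip to save with the event."""
        return [frame for t, frame in self.frames
                if event_time - before <= t <= event_time + after]
```

On receiving event information, the server would call `clip_around` with the event's occurrence time and store the result in association with the event record.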
(Step S140)
The server 20 updates the event list. FIG. 7 shows an example of the event list stored in the storage unit 23. As described above, the events included in the event list include getting up, leaving the bed, falling from the bed, falling down, and nurse calls. The latest event transmitted in step S120 is appended to the end of the event list.
As shown in FIG. 7, each event includes an automatically assigned event ID serving as the primary key, a room number, a subject, an event type, an occurrence date and time, image data, and response-status information. The image data is, as described above, the still-image or moving-image data transmitted in step S130 and captured by the detection unit 10 when the event occurred. The response status includes a status, the responding staff member, and the response date and time; it is described later.
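For illustration, one row of the event list of FIG. 7 could be represented as a record like the following. The field names and default values are assumptions for this sketch, not the embodiment's schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventRecord:
    """One row of the event list of FIG. 7 (illustrative field names)."""
    event_id: int                       # automatically assigned primary key
    room_number: str
    subject: str
    event_type: str                     # e.g. "getting up", "falling down", "nurse call"
    occurred_at: str                    # occurrence date and time
    image_data: Optional[str] = None    # reference to the still image or moving image
    status: str = "unhandled"           # response status
    responder: Optional[str] = None     # responding staff member
    responded_at: Optional[str] = None  # response date and time
```

A newly determined event would be appended with the default "unhandled" status; the responder and response time are filled in during step S200.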
In the example shown in FIG. 7, the event with event ID 010 is a fall from the bed, and the events with event IDs 011 and 012 are nurse calls. Event IDs 013 to 015 are, in order, getting up, leaving the bed, and falling down, which occurred in succession for a single subject 70 (Mr. B). The response status of the events with event IDs 010 to 012 is "handled"; event IDs 013 to 015 are unhandled.
(Step S150)
When the event list has been updated, either because a new event has been detected and added or because the status of an event has changed, the server 20 distributes the updated event list to all staff terminals 40 that are logged in. Alternatively, the list may be distributed only to the staff terminal 40 used by the staff member 80 in charge of the subject 70 who caused the event. On receiving the event list from the server 20, each staff terminal 40 returns an acknowledgment command (not shown). The server 20 may also transmit only differential data: for example, only the portion that has changed relative to the previously transmitted event list, that is, only the events that have been added or updated, is transmitted to all staff terminals 40 as the event list.
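The differential transmission just mentioned could be sketched as follows: events in the current list are compared, by event ID, against the previously transmitted list, and only new or changed entries are kept. The dictionary-based event representation is an assumption for this sketch.

```python
def event_list_diff(previous, current):
    """Return only the events that are new or whose content changed
    since the previously transmitted event list (keyed by "event_id")."""
    prev_by_id = {e["event_id"]: e for e in previous}
    # an event is included if it has no previous version (new) or if
    # any of its fields differ from the previous version (updated)
    return [e for e in current if e != prev_by_id.get(e["event_id"])]
```

The server would transmit `event_list_diff(last_sent, updated)` instead of the full list, reducing the amount of data sent to each staff terminal 40.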
(Step S160)
On receiving the event list from the server 20, each staff terminal 40 updates the content shown on its display unit 43.
FIG. 8 shows examples of operation screens 431 and 432, displayed on the display unit 43 of the staff terminal 40, for entering a response policy for an event. The operation screen 431 shown in the figure displays the event list; the displayed events can be scrolled up and down by a flick operation.
Areas a11 and a12 of the operation screen 431 correspond to event IDs 011 and 012 of the event list in FIG. 7, respectively. Areas a11 and a12 display an icon indicating a nurse call and an indication that the event has been handled. Area a13 corresponds to event IDs 013 to 015, enclosed by the broken-line frame in FIG. 7. Since these relate to the same subject 70 (Mr. B), they are displayed together in the same area. Because they are unhandled, a thumbnail image is shown in area a13 so that they can easily be distinguished from handled events. This thumbnail is created from a still image cut out of the moving image i015 (see FIG. 7) for the latest event (ID 015). By operating area a14, text messages can be exchanged with other staff members 80 who are logged in. When the control unit 41 receives a tap on area a13 by the staff member 80 on the operation screen 431, it transitions to the operation screen 432 for checking the details.
(Step S170)
The staff member 80 checks the details of the event on the operation screen 432 displayed on the staff terminal 40. On the operation screen 432, area a21 displays the ID (name: Mr. B) of the subject 70 for whom the event was determined and the ID (name: Staff A) of the staff member 80 using the staff terminal 40. Area a22 displays the time elapsed since the event occurred. As shown within the broken-line frame in FIG. 7, for the subject 70 (Mr. B) the occurrence of getting-up, leaving-the-bed, and falling-down events was determined in succession, and all remain unhandled; accordingly, area a22 also shows icons representing the "getting up" and "leaving the bed" events other than the latest one. The elapsed time shown in area a22 ("0 minutes elapsed") is the time elapsed since the latest event was determined (fractions of a minute rounded down). Area a23 shows the text and an icon indicating "falling down". Area a24 displays a thumbnail of the image captured when the fall was determined. When the thumbnail provides insufficient information and the staff member 80 wants to check further on the condition of the subject 70 who fell, the staff member operates the "Talk" or "View" button below. Operating the "Talk" button enables a call with the subject 70 (Mr. B) through the voice input/output unit 15; operating the "View" button enables viewing of live video captured by the camera 13 via streaming playback.
The staff member 80 (Staff A) decides to take charge of the displayed fall (ID 015) by operating the "Respond" button in area a25. If not responding, the staff member operates the back button (triangle icon) in area a26 to return to the event list screen. While one staff member 80 has the operation screen 432 shown in FIG. 8 displayed and is checking the state of an event such as a fall, the server 20 applies a lockout to the other staff members 80. For example, while one staff member 80 is checking the state of the event of the subject 70 (Mr. B), even if another staff member 80 selects the same events (IDs 013 to 015), the operation screen 431 of that other staff member 80 shows the text "Status being checked", and the "Respond" button is either not displayed or cannot be selected.
(Steps S180, S190)
When the "Respond" button is operated, the staff terminal 40 transmits a response request for the selected event to the server 20, and in response the server 20 returns an approval notification.
(Step S200)
The server 20 updates the event list according to the processing of step S180. Specifically, the status of the events in the event list (IDs 013 to 015: the fall and so on) is changed to "handled". In the present embodiment, the status is changed to "handled" when the staff member 80 operates the above-mentioned "Respond" button for the event selected and displayed on the operation screen 432. That is, operating the "Respond" button enters the response status "handled", which is information indicating that the event has been checked; the response is deemed to have been made, and the response date and time is recorded. More specifically, in the event list of FIG. 7, the time at which the processing of steps S180 and S190 was performed is recorded in the response date-and-time column. This is not restrictive: the time at which the staff member 80 entered the room of the subject 70 who caused the event, such as a fall from the bed, a fall, or a nurse call, may instead be recorded in that column, or the event may be marked handled by entering the response result on another operation screen, with the entry time recorded in the column. When the event list is updated, the server 20 transmits the updated event list to all staff members 80 who are logged in.
The determination by the detection unit 10 of predetermined events that may lead to an accident, that is, falls from the bed and falls, includes misdeterminations. If the determination criteria are made strict, actual falls may be missed. Conversely, if the criteria are made too loose, misdeterminations, in which a fall event is determined to have occurred even though no fall actually took place, become more likely.
In the present embodiment, the criteria are not made strict, so as to minimize missed determinations, and misdeterminations are therefore included to some extent. The staff member 80 distinguishes a misdetermination from an actual occurrence by checking the situation using the staff terminal 40. Specifically, by checking the condition of the subject 70 through a call or the captured images, the staff member 80 can confirm, in the case of a misdetermination, that there is no need to actually go to the room. Even when a fall has actually occurred, there may be no injury and no particular response may be needed; in this case too, confirming through a call with the subject 70 that they are unharmed makes a visit to the room unnecessary.
In this specification, when the detection unit 10 determines that an event such as a fall from the bed or a fall has occurred, it is counted in the number of occurrences even if no fall actually occurred. In other words, this number of occurrences can also be described as the number of suspected falls.
As described above, when the occurrence of an event such as a fall from the bed or a fall is determined, event information including the response status for that event is accumulated in the event list. Next, the process of analyzing the event occurrence and response status using the accumulated event list, creating a report that visualizes the analysis results, and outputting the report is described.
(Report output processing)
FIG. 9 is a flowchart showing the procedure of the report output processing executed by the watching system 1 according to the first embodiment. The processing of FIG. 9 is performed mainly by the control unit 21 of the server 20 executing a program stored in the storage unit 23.
(Step S310)
The control unit 21 of the server 20 accumulates (adds) the event information of events determined to have occurred in the event list of the storage unit 23. It also stores the image data (still images or moving images) captured when each event occurred in association with the event. This accumulation is performed by the processing described above (FIG. 6).
(Step S320)
The control unit 21 acquires from the storage unit 23 the event information on falls from the bed and falls, which may lead to an accident. This is started, for example, by a request from a user (a staff member 80, an administrator who manages the staff 80, or the like) through the fixed terminal 30. The user selects a specific subject 70 through the fixed terminal 30, and the control unit 21 acquires the fall event information for the designated subject 70. Leaving-the-bed events may also be included as events leading to an accident. For example, a change in the arrangement of furniture in the room, such as the bed 60 or a television, may change the posture of the subject 70 in the bed 60 while sleeping. As a result, a part of the body such as a leg may frequently protrude from the bed 60, causing many leaving-the-bed events to be determined. In such a case, leaving the bed can cause a fall from the bed and is thus an event that indirectly leads to an accident.
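The selection performed in this step, extracting the accident-related events for one designated subject, with leaving-the-bed events optionally included, could be sketched as a simple filter. The dictionary keys and English event-type labels are assumptions for this sketch.

```python
ACCIDENT_RELATED = {"falling from bed", "falling down"}

def accident_related_events(event_list, subject, include_bed_exit=False):
    """Select the events for one subject that may lead to an accident.
    Leaving-the-bed events can optionally be included, since they may
    indirectly lead to a fall from the bed."""
    types = ACCIDENT_RELATED | ({"leaving bed"} if include_bed_exit else set())
    return [e for e in event_list
            if e["subject"] == subject and e["event_type"] in types]
```

The result of this filter is the input to the analysis of step S330.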
(Step S330)
The control unit 21 analyzes the numbers of falls from the bed and falls from the acquired event information. FIG. 10 shows an example visualizing the fall analysis results for a certain subject 70 (for example, Mr. B). The figure is a table showing, for each hour of the day (from 6:00 to 5:00 of the following day, expressed as 29:00), the combined number of occurrences of falls from the bed and falls. In the table, time slots in which one or more fall events occurred are shown in gray.
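The hourly aggregation of FIG. 10 uses a day that runs from 6:00 to 29:00, so hours before 6:00 are counted as 24 plus the clock hour of the previous day. A minimal sketch of that aggregation, with an assumed event representation, is:

```python
def hour_slot(hour):
    """Map a clock hour (0-23) to the 6:00-29:00 day grid of FIG. 10:
    hours before 6:00 belong to the previous day and count as 24 + hour."""
    return hour if hour >= 6 else hour + 24

def hourly_fall_counts(events):
    """Combined per-hour counts of fall events, keyed by slot (6..29)."""
    counts = {slot: 0 for slot in range(6, 30)}
    for e in events:
        counts[hour_slot(e["hour"])] += 1
    return counts
```

Time slots whose count is one or more would then be shaded gray when the table is rendered.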
(Step S340)
The control unit 21 determines whether the number of tumble or fall events per unit period has increased. This determination may be made on an hourly basis or on a daily basis. When made on a daily basis, an increase can be determined when the number of occurrences on a given day is at least a predetermined multiple of the number on the previous day, or of the average number over the several days before it. In the example shown in FIG. 10, the 17 occurrences on October 21 are more than twice the average of 6.5 occurrences over the preceding two days (October 19 and 20), so it can be determined that the number of occurrences increased on October 21. If there is an increase, the processing proceeds to step S350; otherwise, it proceeds to step S360.
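The daily increase test of step S340 admits a compact sketch. The threshold of 2 and the two-day baseline follow the worked example above (17 occurrences against an average of 6.5); the function name and list-based interface are illustrative assumptions.

```python
def increase_detected(daily_counts, threshold=2.0, window=2):
    """Return True when the latest day's count is at least `threshold`
    times the average over the preceding `window` days (step S340)."""
    *previous, latest = daily_counts[-(window + 1):]
    baseline = sum(previous) / len(previous)
    return latest >= threshold * baseline

# FIG. 10 example: 6 and 7 events on Oct 19-20, then 17 on Oct 21
flagged = increase_detected([6, 7, 17])  # 17 >= 2 * 6.5 -> True
```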
(Step S350)
The control unit 21 acquires the image data associated with the events before and after the time of the increase, and displays images based on this image data on the display of the fixed terminal 30 used by the user (the staff 80, the administrator, or the like). The displayed image is typically a moving image (for example, several seconds to several tens of seconds long), but may be a still image cut out from the moving image.
At this time, the information on the target person 70 stored in the storage unit 23 may also be shown on the display. This information includes, for example, the walking-state information of the target person 70 (independent, wheelchair, or walker). Further, when displaying the before and after images, the control unit 21 may analyze them to recognize furniture and other objects in the room (bed, walker, cane, etc.), and if the arrangement of these objects differs between the two images, an annotation image indicating the change (an enclosing frame, a caution note, or the like) may be added to the displayed image.
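The furniture-change annotation described above presupposes some comparison of the recognized objects in the two images. A sketch of that comparison, assuming an object-recognition step has already produced name-to-position detections (the object names, coordinates, and tolerance are all illustrative):

```python
def layout_changes(before, after, tolerance=0.5):
    """List objects (bed, walker, cane, ...) that appeared, disappeared,
    or moved by more than `tolerance` (Manhattan distance, in metres)
    between the before and after detections."""
    changed = []
    for item in set(before) | set(after):
        if item not in before or item not in after:
            changed.append(item)      # appeared or disappeared
        else:
            (x0, y0), (x1, y1) = before[item], after[item]
            if abs(x0 - x1) + abs(y0 - y1) > tolerance:
                changed.append(item)  # moved
    return sorted(changed)

moved = layout_changes(
    {"bed": (0.0, 0.0), "cane": (2.0, 1.0)},
    {"bed": (0.0, 0.0), "cane": (0.2, 1.0), "walker": (3.0, 3.0)},
)  # -> ["cane", "walker"]
```

Each returned item would then receive an enclosing frame or caution note in the displayed image.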
(Step S360)
The control unit 21 receives a comment to be added to the report, for example a comment entered by the user through the fixed terminal 30.
By comparing the two images displayed on the fixed terminal 30, taken before and after the time of the increase, the user enters the resulting judgment as a comment. For example, if the comparison shows that the arrangement of furniture or other objects in the room was changed in the images after the increase in fall and tumble events, it can be presumed that this rearrangement caused the increase. Such objects include not only fixed furniture such as a bed, a television, and a chest of drawers, but also walking aids such as a walker, a wheelchair, and a cane. For example, the number of events may increase because the place where the cane is kept was moved from its previous position.
The information on the target person 70, for example the registered walking-state information (independent, wheelchair, or walker), may also be compared with the walking state actually observed when the person is alone (in the captured images). Specifically, the actual walking state is determined from an image captured when the person fell or tumbled while alone in the room. If this differs from the registered walking-state information and it turns out that the target person 70 is walking in a dangerous manner that he or she should not attempt, this can be cited as the reason for the increase in tumble events. An example of such dangerous walking is a target person 70 who normally uses a walker walking unaided along the walls of the room without it.
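Comparing the registered walking-state information with the state observed in the images reduces to a simple mismatch check; the category strings and the warning message below are illustrative placeholders:

```python
def walking_mismatch(registered, observed):
    """Return a warning when the walking state judged from the captured
    images differs from the registered walking-state information
    (independent / wheelchair / walker); return None otherwise."""
    if observed == registered:
        return None
    return (f"Registered walking state is '{registered}', but the subject "
            f"was observed walking '{observed}' while alone.")

warning = walking_mismatch("walker", "independent")
```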
(Step S370)
The control unit 21 creates a report based on the analysis result. If a comment was received in step S360, the comment is added to the report.
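Assembling the report of step S370 — analysis table on top, before/after images in the middle, an optional comment at the bottom, in the layout of FIG. 11 — can be sketched as plain text; the function and field names are assumptions:

```python
def build_report(subject, summary_table, images, comment=None):
    """Assemble the report of step S370: table, before/after images,
    then the comment (if any), following the layout of FIG. 11."""
    lines = [f"Fall/tumble report for {subject}", "", summary_table, ""]
    lines += [f"[{label}] {filename}" for label, filename in images]
    if comment:  # comment received in step S360, if any
        lines += ["", f"Comment: {comment}"]
    return "\n".join(lines)

report = build_report(
    "Mr. B",
    "Oct 19: 6   Oct 20: 7   Oct 21: 17",
    [("before", "room_1020.jpg"), ("after", "room_1021.jpg")],
    comment="Check whether the cane was moved.",
)
```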
(Step S380)
The control unit 21 outputs the report by displaying it on the display of the fixed terminal 30, printing it on paper from a printer, or transmitting the report data to a destination address specified in the request.
FIG. 11 shows an example of a report in which the visualized analysis result is output. FIG. 11 reflects the analysis result shown in FIG. 10 and shows a report on predetermined events that may lead to an accident for the target person 70 (Mr. B) selected by the user. The upper table in FIG. 11 corresponds to the table shown in FIG. 10. The two images in the middle are images of the room captured before and after the time of the increase. The comment at the bottom is the one received in step S360.
As described above, in the first embodiment, event information on predetermined events that may lead to an accident and the image data associated with that event information are acquired, and the occurrence of the events for the target person is analyzed using the number of occurrences, the occurrence times, and the image data; a report in which the occurrence of the events is visualized is then created on the basis of the analysis result and output.
This makes it possible to analyze the numbers of falls and tumbles, and the fall/tumble images, detected from the daily behavior of the resident (target person 70), and to output a report summarizing the near misses extracted from them. Such a report helps prevent, in advance, the occurrence of events that may lead to accidents.
A facility cannot grasp how the target person 70 behaves inside the room: while the entrance door is closed, the staff 80 cannot see what is happening in the room. Because dangerous behavior by the resident (such as walking in a way likely to lead to a tumble) goes unnoticed, countermeasures end up being considered only after an accident has occurred.
In contrast, according to the present embodiment, analyzing the tumble and fall events detected by the detection unit 10 makes it possible to learn about dangerous in-room behavior of the resident (target person 70) that the facility staff 80 could not grasp. Furthermore, by encouraging countermeasures to reduce dangerous behavior (such as rearranging the furniture), tumbles and falls can be reduced and the accidents they cause can be prevented.
(Report output processing in the watching system 1 according to a modified example)
In the embodiment shown in FIG. 10, a comment entered by the user was received. In the modified example described below, by contrast, the control unit 21 creates the comment automatically. FIG. 12 is a flowchart showing the report output processing according to the modified example.
(Steps S410 to S440)
Here, the control unit 21 performs the same processing as in steps S310 to S340 described above. If there is an increase in the number of event occurrences, the processing proceeds to step S465.
(Step S465)
The control unit 21 creates a comment calling attention to the increase. This comment uses fixed phrases, prompting the reader to check whether, in the period around the increase, any change was made, such as rearranging the furniture in the room, that could lead to more tumbles and falls. The comment column of the report in FIG. 11 is an example of such a comment calling attention to an increase. Some or all of these phrases may be stored in the storage unit 23 in advance.
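Filling the fixed phrases of step S465 is simple string templating; the wording below is an illustrative stand-in for phrases that would be stored in the storage unit 23:

```python
COMMENT_TEMPLATE = (
    "The number of fall/tumble events for {subject} increased around {day}. "
    "Please check whether the arrangement of furniture or walking aids in "
    "the room was changed before and after this date."
)

def make_increase_comment(subject, day):
    """Create the attention-calling comment of step S465 from a fixed
    phrase stored in advance."""
    return COMMENT_TEMPLATE.format(subject=subject, day=day)

comment = make_increase_comment("Mr. B", "October 21")
```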
Creating the comment on the control unit 21 side in this way obtains the same effect as the first embodiment while also sparing the user the trouble of writing the comment.
(Other modifications)
In the embodiments described above, the detection unit 10, the server 20, the fixed terminal 30, and the staff terminal 40 were described as separate, independent devices. However, the present invention is not limited to this, and some of these components may be integrated; for example, the functions of the server 20 may be integrated into the fixed terminal 30.
The processing in the watching system 1 according to the embodiments described above may include steps other than those of the sequence charts and flowcharts described above, or may omit some of those steps. The order of the steps is also not limited to that of the embodiments described above. Furthermore, each step may be combined with another step and executed as a single step, may be executed as part of another step, or may be divided into a plurality of steps and executed.
The means and methods for performing the various kinds of processing in the watching system 1 according to the embodiments described above can be realized either by a dedicated hardware circuit or by a programmed computer. The control program may be provided on a computer-readable recording medium such as a USB memory or a DVD (Digital Versatile Disc)-ROM, or provided online via a network such as the Internet. In that case, the program recorded on the computer-readable recording medium is usually transferred to and stored in a storage unit such as a hard disk. The program may be provided as stand-alone application software, or may be incorporated as one function into the software of a device such as the detection unit.
This application is based on Japanese Patent Application No. 2018-122274 filed on June 27, 2018, the disclosure of which is incorporated herein by reference in its entirety.
Reference Signs List
1 watching system
10 detection unit
 11 control unit
 12 communication unit
 13 camera
 14 nurse call unit
 15 voice input/output unit
20 server
 21 control unit
 22 communication unit
 23 storage unit
30 fixed terminal
40 staff terminal
 41 control unit
 42 wireless communication unit
 43 display unit
 44 input unit
 45 voice input/output unit
50 network
 51 access point
70 target person (care receiver)
80 staff (caregiver)

Claims (9)

  1.  A control program for causing a computer to execute processing comprising:
     a procedure (a) of acquiring accumulated event information on a predetermined event that may lead to an accident, and image data associated with the event, the event information recording the predetermined event relating to a subject in an observation area, the occurrence of which was determined based on the output of a detection unit that detects movement of the subject in the observation area, and the image data being obtained by photographing the observation area for a predetermined period including the time when the event occurred;
     a procedure (b) of creating a report in which an analysis result is visualized, by analyzing, for the subject, the occurrence of the event using information on the occurrence of the event, or using the information on the occurrence of the event and the image data; and
     a procedure (c) of outputting the report created in the procedure (b).
  2.  The control program according to claim 1, wherein the information on the occurrence of the event includes the number of occurrences of the event and the occurrence times.
  3.  The control program according to claim 1 or 2, wherein the predetermined event includes a tumble event in which a tumble of the subject has been determined, or a fall event in which a fall of the subject has been determined.
  4.  The control program according to any one of claims 1 to 3, wherein a staff member attending to the subject carries a mobile terminal, and
     the event information includes information, entered by the staff member through the mobile terminal in response to the occurrence of the event, indicating that the staff member has confirmed the event.
  5.  The control program according to any one of claims 1 to 4, wherein the procedure (b) includes:
     a procedure (b1) of displaying, for the subject, the image data of the periods before and after a time when the number of occurrences of the event per unit period increased; and
     a procedure (b2) of receiving a comment entered by a user, and
     in the procedure (c), the report including the comment received in the procedure (b2) is created.
  6.  The control program according to any one of claims 1 to 4, further comprising a procedure (d) of determining, based on the event information acquired in the procedure (a), whether the number of occurrences of the event per unit period has increased for the subject, and, when an increase is determined, creating a comment calling attention to the event of the subject, wherein
     in the procedure (b), the report including the comment created in the procedure (d) is created.
  7.  The control program according to any one of claims 1 to 6, wherein the report created in the procedure (b) includes, for the subject, the image data of the periods before and after a time when the number of occurrences of the event per unit period increased.
  8.  A report output method comprising:
     a procedure (a) of acquiring accumulated event information on a predetermined event that may lead to an accident, and image data associated with the event, the event information recording the predetermined event relating to a subject in an observation area, the occurrence of which was determined based on the output of a detection unit that detects movement of the subject in the observation area, and the image data being obtained by photographing the observation area for a predetermined period including the time when the event occurred;
     a procedure (b) of creating a report in which an analysis result is visualized, by analyzing, for the subject, the occurrence of the event using information on the occurrence of the event, or using the information on the occurrence of the event and the image data; and
     a procedure (c) of outputting the report created in the procedure (b).
  9.  A report output device comprising:
     an acquisition unit that acquires accumulated event information on a predetermined event that may lead to an accident, and image data associated with the event, the event information recording the predetermined event relating to a subject in an observation area, the occurrence of which was determined based on the output of a detection unit that detects movement of the subject in the observation area, and the image data being obtained by photographing the observation area for a predetermined period including the time when the event occurred;
     a creation unit that creates a report in which an analysis result is visualized, by analyzing, for the subject, the occurrence of the event using information on the occurrence of the event, or using the information on the occurrence of the event and the image data; and
     an output unit that outputs the report created by the creation unit.
PCT/JP2019/016684 2018-06-27 2019-04-18 Control program, report output method, and report output device WO2020003704A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020527229A JP7327396B2 (en) 2018-06-27 2019-04-18 Control program, report output method, and report output device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-122274 2018-06-27
JP2018122274 2018-06-27

Publications (1)

Publication Number Publication Date
WO2020003704A1 true WO2020003704A1 (en) 2020-01-02

Family

ID=68987024

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/016684 WO2020003704A1 (en) 2018-06-27 2019-04-18 Control program, report output method, and report output device

Country Status (2)

Country Link
JP (1) JP7327396B2 (en)
WO (1) WO2020003704A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018005413A (en) * 2016-06-29 2018-01-11 富士通株式会社 State specification device, state specification program, and state specification method
JP2018088238A (en) * 2016-11-18 2018-06-07 株式会社ベネッセスタイルケア Service support device, service support method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5379923B1 (en) 2013-04-18 2013-12-25 都子 秋葉 Care receiver information analysis support device and program


Also Published As

Publication number Publication date
JPWO2020003704A1 (en) 2021-07-15
JP7327396B2 (en) 2023-08-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19826562

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020527229

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19826562

Country of ref document: EP

Kind code of ref document: A1