WO2020003704A1 - Control program, report output method, and report output device - Google Patents

Control program, report output method, and report output device (original title: Programme de commande, procédé et dispositif de délivrance de rapport)

Info

Publication number
WO2020003704A1
WO2020003704A1 (PCT/JP2019/016684)
Authority
WO
WIPO (PCT)
Prior art keywords
event
information
subject
report
image data
Prior art date
Application number
PCT/JP2019/016684
Other languages
English (en)
Japanese (ja)
Inventor
寛 古川
武士 阪口
海里 姫野
恵美子 寄﨑
遠山 修
藤原 浩一
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2020527229A (JP7327396B2)
Publication of WO2020003704A1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20: ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G: TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G12/00: Accommodation for nursing, e.g. in hospitals, not covered by groups A61G1/00 - A61G11/00, e.g. trolleys for transport of medicaments or food; Prescription lists
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/22: Social work or social welfare, e.g. community support activities or counselling services
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00: ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30: ICT specially adapted for calculating health indices; for individual health risk assessment
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M9/00: Arrangements for interconnection not involving centralised switching

Definitions

  • the present invention relates to a control program, a report output method, and a report output device, and more particularly, to a control program, a report output method, and a report output device for outputting a report on a target person to be handled by staff.
  • Japan's life expectancy has been remarkably prolonged thanks to the improved living standards, sanitary conditions, and medical care that followed the postwar economic growth. Combined with a declining birth rate, this has produced an aging society with a high proportion of elderly people. In such a society, the number of people who need nursing care due to illness, injury, aging, and the like is expected to increase.
  • in the nurse call system disclosed in Patent Document 1, whether the patient has woken up or left the bed is determined from images captured by a camera installed above the patient's bed.
  • the image captured by the camera is displayed on the nurse call master provided in the nurse station.
  • Patent Document 2 discloses a medical care support device that determines whether or not a near-miss incident requiring a countermeasure has occurred. This device calculates the similarity (correction notification hit rate) between the case record input by the caregiver and past near-miss incidents recorded in a database, and identifies a case whose correction notification hit rate is equal to or greater than a predetermined value as countermeasure information that needs improvement.
  • the nurse call system disclosed in Patent Document 1 can detect a patient rising from or leaving the bed, but cannot prevent an accident or a near miss. Likewise, the medical care support device disclosed in Patent Document 2 identifies errors in past countermeasures by comparing a near miss that has occurred with past cases; it does not prevent the occurrence of near misses that could lead to accidents.
  • the present invention has been made in view of the above circumstances, and aims to provide a control program, a report output method, and a report output device capable of outputting a report useful for preventing the occurrence of an event that may lead to an accident, by analyzing such events.
  • the control program according to any one of (1) to (3) above, wherein the staff responding to the target person carries a mobile terminal, and information indicating that the staff has confirmed the event, which is input by the staff through the mobile terminal in response to the occurrence of the event, is included in the event information.
  • the procedure (b) includes a step (b1) of displaying the image data of the target person from before and after the time when the number of occurrences of the event per unit period increased, and a step (b2) of receiving a comment input by the user.
  • the control program according to any one of (1) to (4), wherein in the step (c), the report including the comment received in the step (b2) is created.
  • the control program further includes a step (d) of creating a comment that calls attention to the event of the target person.
  • the report created in the step (b) includes the image data of the target person before and after the period when the number of occurrences of the event per unit period increases.
  • the control program according to any one of (1) to (6).
  • according to the present invention, event information of a predetermined event that may lead to an accident and image data associated with the event information are acquired, the occurrence status of the event is analyzed, and a report in which the analysis result is visualized is created and output. This makes it possible to output a report that is useful for preventing the occurrence of an event that may lead to an accident.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the detection unit. FIG. 4 is a block diagram illustrating a schematic configuration of the server. FIG. 5 is a block diagram showing the schematic structure of the staff terminal. FIG. 6 is a sequence chart showing the procedure of processing of the watching system.
  • FIG. 7 is an example of an event list stored in the storage unit. FIG. 8 is an example of an operation screen, displayed on the staff terminal, for confirming an event. FIG. 9 is a flowchart illustrating a procedure of the report output process according to the first embodiment.
  • FIG. 13 is a flowchart illustrating a procedure of the report output process in a modified example.
  • FIG. 1 is a diagram illustrating an entire configuration of a watching system according to the present embodiment
  • FIG. 2 is a diagram illustrating an example of a detection unit installed around a bed in a room of a subject.
  • the watching system 1 includes a plurality of detection units 10, a server 20, a fixed terminal 30, and one or more staff terminals 40.
  • the server 20 functions as a report output device. These are communicably connected to each other by wire or wireless via a network 50 such as a LAN (Local Area Network), a telephone network or a data communication network.
  • the network 50 may include devices that relay communication signals, such as a repeater, a bridge, a router, or a cross-connect.
  • the staff terminal 40, the detection unit 10, the server 20, and the fixed terminal 30 are mutually connected by a network 50 such as a wireless LAN (for example, a LAN according to the IEEE 802.11 standard) including an access point 51. Connected for communication.
  • the watching system 1 is provided at an appropriate place according to the target person 70.
  • the target person 70 (also referred to as a watching target person or a care target person) may be, for example, a patient who needs nursing due to illness or injury, a cared person who needs nursing care due to a decline in physical ability caused by aging, or a single person living alone.
  • the target person 70 may be a person who needs to be detected when a predetermined inconvenient event such as an abnormal state occurs in the person.
  • the watching system 1 is suitably installed in buildings such as hospitals, welfare facilities for the elderly, and dwelling units, depending on the type of the subject 70.
  • the watching system 1 is arranged in a facility including a plurality of rooms (rooms) where a plurality of subjects 70 enter and a plurality of rooms including a nurse station.
  • the detection unit 10 is arranged in each living room, which is the observation area of the subject 70.
  • the four detection units 10 are arranged in the rooms of the subjects 70, A, B, C, and D, respectively.
  • the bed 60 is included in the observation area of the detection unit 10.
  • Staff 80 (also referred to as nursing staff or care staff), who provide nursing or care for the subject 70, each carry a staff terminal 40, which is a portable terminal.
  • the server 20 may not be located at the nurse station, and may be an external server unit connected to the network 50.
  • the fixed terminal 30 may be omitted, and the server 20 or the staff terminal 40 may perform the function.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the detection unit.
  • the detection unit 10 includes a control unit 11, a communication unit 12, a camera 13, a nurse call unit 14, and a voice input / output unit 15, which are interconnected by a bus.
  • the control unit 11 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and a memory such as a ROM (Read Only Memory), and performs control and arithmetic processing of each unit of the detection unit 10 according to a program.
  • the control unit 11 may further include an HDD (Hard Disk Drive) as a memory.
  • the communication unit 12 is an interface circuit (for example, a LAN card) for communicating with another device such as the server 20, the fixed terminal 30, or the staff terminal 40 via the network 50.
  • the camera 13 is arranged, for example, on the ceiling of a living room or on an upper part of a wall, and captures an area including the bed 60 of the subject 70 directly below as an observation area, and outputs a captured image (image data).
  • This photographed image includes a still image and a moving image.
  • although the camera 13 is a near-infrared camera, a visible light camera may be used instead, or the two may be used in combination.
  • the control unit 11 determines (recognizes) the occurrence of a predetermined action of the target person 70 from the image captured by the camera 13.
  • the predetermined actions to be determined include "getting up" (rising in the bed 60), "leaving the bed" (getting out of the bed 60), "falling" (falling from the bed 60), and "falling down" (falling over on the floor or the like).
  • the control unit 11 detects a moving-object silhouette (hereinafter referred to as a "human silhouette") from a plurality of captured images (moving images).
  • the human silhouette can be detected, for example, by a time-difference method that extracts the difference between images captured at successive times and extracts the range of pixels whose difference is relatively large.
  • the human silhouette may instead be detected by a background-difference method that extracts the difference between the captured image and a background image. Getting up, leaving the bed, falling, and falling down are recognized from the posture of the subject 70 (for example, standing, sitting, or lying down) based on the detected human silhouette, and from the position of the silhouette relative to installed objects such as the bed 60 in the living room.
  • Such recognition may be performed by a program processed by the CPU of the control unit 11, or may be performed by a built-in processing circuit.
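As an illustrative sketch only, the time-difference and background-difference methods described above can be expressed as follows; the grayscale frame representation (lists of pixel rows) and the threshold value of 30 are assumptions for illustration, not details from the disclosure:

```python
def human_silhouette_mask(prev_frame, curr_frame, threshold=30):
    """Time-difference method: mark pixels whose intensity changed by more
    than `threshold` between two frames captured at successive times as
    part of the moving (human) silhouette."""
    return [
        [abs(c - p) > threshold for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

def background_silhouette_mask(background, curr_frame, threshold=30):
    """Background-difference method: compare the captured frame against a
    static background image instead of the previous frame."""
    return human_silhouette_mask(background, curr_frame, threshold)
```

The resulting mask would then feed the posture recognition (standing, sitting, lying down) and the comparison with the calibrated position of installed objects such as the bed 60.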
  • the control unit 11 determines the type of event and whether or not the event has occurred.
  • the server 20 may perform all or most of these recognitions, and the control unit 11 may transmit only the captured image to the server 20.
  • the control unit 11 transmits a notification to the effect that the action (an event to be described later) has occurred to the server 20 or the like.
  • the staff 80 is a person who performs various responses to the target person 70 according to the business. Services can include medical services and nursing care services.
  • the work of the staff 80 is a care work for the target person 70
  • the response contents for each event are as follows. When "getting up" is determined as an event and the determination falls within a predetermined time range (the wake-up time set at the facility, for example 7 to 8 a.m.), morning care is performed. Morning care includes face washing, tooth-brushing assistance, denture fitting, assistance with changing clothes, and the like.
  • a wheelchair transfer or walking assistance may be required.
  • an alert may be generated at a fixed time by the nurse call unit 14 or the like.
  • the nurse call unit 14 includes a push-button switch, and detects a nurse call (also called a care call) when the switch is pressed by the subject 70.
  • the nurse call may be detected by a voice microphone instead of the push button type switch.
  • when a nurse call is detected, the control unit 11 transmits a notification (nurse call notification) indicating that there is a nurse call to the server 20 or the like via the communication unit 12 and the network 50.
  • the voice input / output unit 15 is, for example, a speaker and a microphone, and enables voice communication by transmitting and receiving a voice signal to / from the staff terminal 40 or the like via the communication unit 12.
  • the voice input / output unit 15 may be connected to the detection unit 10 via the communication unit 12 as an external device of the detection unit 10.
  • the detection unit 10 may further include a Doppler-shift type body motion sensor that transmits microwaves toward the bed 60 and detects the Doppler shift of the reflected microwaves caused by body movements (for example, respiratory movements) of the subject 70.
  • this body motion sensor detects the body movement of the chest (up-and-down movement of the chest) accompanying the breathing of the subject 70, and recognizes a slight body-movement abnormality when it detects a disordered cycle in the chest movement or a chest-movement amplitude equal to or less than a preset threshold value.
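A minimal sketch of such a threshold check, assuming the sensor output has already been reduced to per-breath amplitude and period samples (the threshold values below are illustrative assumptions, not values from the disclosure):

```python
def body_motion_abnormal(amplitudes, periods,
                         min_amplitude=2.0, max_period_jitter=0.5):
    """Recognize a slight body-movement abnormality when the chest-movement
    amplitude falls to or below a preset threshold, or when any breathing
    period deviates too far from the mean period (a disordered cycle)."""
    low_amplitude = any(a <= min_amplitude for a in amplitudes)
    mean_period = sum(periods) / len(periods)
    irregular = any(abs(p - mean_period) > max_period_jitter for p in periods)
    return low_amplitude or irregular
```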
  • an occurrence for which a notification to the staff 80 is to be performed, such as a nurse call by the nurse call unit 14 or a change in the state of the subject 70 recognized by the detection unit 10 (getting up, leaving the bed, falling, falling down, abnormal body movement, and the like), is called an event.
  • the detection unit 10 transmits (outputs) information on the event that has occurred and the captured image to the server 20.
  • FIG. 4 is a block diagram showing a schematic configuration of the server.
  • the server 20 includes a control unit 21, a communication unit 22, and a storage unit 23.
  • the server 20 may be provided in the same building as the living room for the subject 70, or may be provided in a remote location and connectable via a network.
  • the server 20 may be a cloud server virtually constructed by a plurality of servers arranged on a network such as the Internet.
  • the components are communicably connected to each other by a bus.
  • the storage unit 23 functions as a database, and stores various information regarding the event list, the target person 70, and the staff 80.
  • the control unit 21 functions, in cooperation with the communication unit 22, as an acquisition unit that acquires event information of a predetermined event that may lead to an accident, together with image data of a predetermined period including the time point at which the event occurred. The control unit 21 further functions as a creation unit that creates a report by analyzing the event occurrence status from the acquired data, and, in cooperation with the communication unit 22, as an output unit that outputs the created report.
  • the control unit 21 and the communication unit 22 otherwise have the same functions as the corresponding components of the detection unit 10, so a detailed description is omitted.
  • the event list includes information on various events including a predetermined event (for example, a falling event or a falling event) that may lead to an accident.
  • the predetermined event that may lead to the accident is also called a near-miss event, and if this situation continues, an accident may occur.
  • the server 20 determines (identifies) which subject 70, that is, which resident of which room, an event detected by the detection unit 10, such as a nurse call, getting up, leaving the bed, falling, or falling down, relates to. The determined event type is associated with the target person 70 and added to the event list in the storage unit 23.
  • the subject 70 may be determined by reading the IC tag with an RFID reader provided in each room.
  • the subject 70 may be determined by arranging the detection unit 10 for each bed 60.
  • the entry of each staff 80 into the room may be detected by bringing the staff terminal 40 carried by the staff 80 close to the RFID reader.
  • the fixed terminal 30 is a so-called PC (Personal Computer), and includes a control unit including a CPU, a RAM, and the like, a communication unit, a display unit, an input unit, and a voice input / output unit.
  • the fixed terminal 30 is arranged, for example, in a nurse station.
  • a user (staff 80, an administrator who manages the staff 80, or the like) can view the created report on the fixed terminal 30, and a printer (not shown) can output it to paper.
  • the report may include a comment input from the input unit while referring to the captured image displayed on the display unit (display) of the fixed terminal 30 as described later.
  • when attaching the detection unit 10 to each room (living room), the staff 80 or technical staff uses the fixed terminal 30 to associate the room number with the detection unit 10, and calibrates and designates the position information of installed objects such as the bed 60, that is, their outline information as seen from the ceiling camera 13.
  • the identification information such as the name and ID number of the subject 70 who is hospitalized or resident is associated with each room number.
  • FIG. 5 is a block diagram illustrating a schematic configuration of the staff terminal 40.
  • the staff terminal 40 includes a control unit 41, a wireless communication unit 42, a display unit 43, an input unit 44, and an audio input / output unit 45, which are interconnected by a bus.
  • the control unit 41 includes a CPU, a RAM, a ROM, and the like, with the same configuration as the control unit 11 of the detection unit 10.
  • the wireless communication unit 42 enables wireless communication using a standard such as Wi-Fi or Bluetooth (registered trademark), and wirelessly communicates with each device via the access point 51 or directly.
  • the display unit 43 and the input unit 44 form a touch panel in which a touch sensor serving as the input unit 44 is superimposed on the display surface of the display unit 43, which is formed of liquid crystal or the like.
  • the display unit 43 and the input unit 44 display various operation screens displaying a list of a plurality of events included in the event list to the staff 80, and receive various operations through the operation screen.
  • the voice input / output unit 45 is, for example, a speaker and a microphone, and enables voice communication by the staff 80 with another staff terminal 40 via the wireless communication unit 42.
  • the staff terminal 40 is a device that functions as a user interface of the watching system 1, and can be configured by a portable communication terminal device such as a tablet computer, a smartphone, or a mobile phone.
  • the staff 80 performs a login authentication process through the assigned staff terminal 40 at the start of the work.
  • the staff 80 inputs a staff ID and a password through the touch panel (display unit 43, input unit 44) of the staff terminal 40.
  • the staff terminal 40 transmits this to the server 20.
  • the server 20 matches the authentication information stored in the storage unit 23 and transmits an authentication result according to the authority of the staff 80 to the staff terminal 40, thereby completing the login authentication.
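The login exchange between the staff terminal 40 and the server 20 can be sketched as follows; the credential store, the hashing scheme, and the authority labels are assumptions for illustration, not details of the disclosed server:

```python
import hashlib

# Illustrative credential store; a real system would keep salted password
# hashes in the storage unit 23 (all names and values here are assumptions).
_STAFF_DB = {
    "staffA": {"pw_hash": hashlib.sha256(b"secret").hexdigest(),
               "authority": "caregiver"},
}

def authenticate(staff_id, password):
    """Return the staff member's authority on success, or None on failure,
    mirroring the authentication result returned to the staff terminal."""
    record = _STAFF_DB.get(staff_id)
    if record is None:
        return None
    if hashlib.sha256(password.encode()).hexdigest() != record["pw_hash"]:
        return None
    return record["authority"]
```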
  • FIG. 6 is a sequence chart showing the procedure of the processing of the watching system.
  • Step S110 As illustrated in FIG. 6, the detection unit 10 detects a movement of the subject 70 in the observation area or a nurse call made by the subject 70. The detection unit 10 then determines, based on the movement of the target person 70, whether another event such as falling or falling down has occurred.
  • Steps S120, S130 When it is determined that an event has occurred, the event information is transmitted to the server 20.
  • when an event occurs, the detection unit 10 transmits image data obtained by photographing the observation region. For an event such as falling or falling down, a moving image of a predetermined time (for example, several seconds to several tens of seconds) may be transmitted, while for other types of events a still image at the time of occurrence may be transmitted.
  • alternatively, the detection unit 10 may continuously transmit captured images to the server 20, and the server 20 may store them in a temporary storage unit for a predetermined time (for example, the past several hours) while overwriting the oldest data. Then, when an event occurs and the event information is received from the detection unit 10, the server 20 may read still images or a moving image from before and after the occurrence out of the temporary storage unit and store them in association with the event.
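Such temporary storage behaves like a ring buffer over timestamped frames. A minimal sketch, in which the capacity and clip lengths are illustrative assumptions:

```python
from collections import deque

class FrameBuffer:
    """Ring buffer holding the most recent frames so that, when event
    information arrives, the images from before and after the occurrence
    can be cut out and stored with the event."""

    def __init__(self, capacity=600):
        self._frames = deque(maxlen=capacity)  # oldest entries overwritten

    def push(self, timestamp, frame):
        self._frames.append((timestamp, frame))

    def clip(self, event_time, before=10, after=10):
        """Return frames within [event_time - before, event_time + after]."""
        return [f for t, f in self._frames
                if event_time - before <= t <= event_time + after]
```

When an event notification is received, `clip(event_time)` would supply the before-and-after images to be stored in association with the event.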
  • Step S140 The server 20 updates the event list.
  • FIG. 7 is an example of an event list stored in the storage unit 23.
  • the events included in the event list include getting up, leaving the bed, falling, falling down, and nurse calls, as described above.
  • the latest event transmitted in step S120 is added to the end of the event list.
  • each event includes an event ID (automatically assigned as a primary key), a room number, a subject, an event type, an occurrence date and time, image data, and information on the response status.
  • the image data is data of a still image or a moving image transmitted by step S130 and captured by the detection unit 10 when an event occurs.
  • the response status includes a status, the responding staff, and the response date and time. The response status will be described later.
  • in the example of FIG. 7, the event with event ID 010 is a fall, and the events with event IDs 011 and 012 are nurse calls.
  • event IDs 013 to 015 are the getting-up, leaving-the-bed, and falling events that occurred in succession for one subject 70 (Mr. B).
  • the status of the events with event IDs 010 to 012 is handled, while those with event IDs 013 to 015 are unhandled.
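One row of the event list of FIG. 7 could be modeled as follows; the field names and default values are assumptions chosen to mirror the columns described above, not identifiers from the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class EventRecord:
    """One row of the event list (FIG. 7)."""
    event_id: int                       # automatically assigned primary key
    room_number: str
    subject: str
    event_type: str                     # e.g. "getting_up", "fall", "nurse_call"
    occurred_at: datetime
    image_data: Optional[str] = None    # reference to still image or video
    status: str = "unhandled"           # response status
    handled_by: Optional[str] = None    # responding staff
    handled_at: Optional[datetime] = None
```

A newly determined event would be appended with the default status, and the response fields would be filled in during steps S180 to S200.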
  • Step S150 When the event list is updated because a new event has been detected and added, or because the status of an event has changed, the server 20 delivers the updated event list to all logged-in staff terminals 40. The delivery may instead be limited to the staff terminals 40 used by the staff 80 in charge of the target person 70 who caused the event. Each staff terminal 40 that receives the event list returns an acknowledgement command to the server 20 (not shown). The server 20 may also transmit only difference data: for example, only the part that has changed from the previously transmitted event list, that is, only the events that have been added or updated, is transmitted to all staff terminals 40 as the event list.
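The difference-only transmission mentioned in step S150 can be sketched as follows, assuming each event is represented as a dictionary keyed by `event_id` (an assumption mirroring the event ID primary key):

```python
def event_list_diff(prev_list, curr_list):
    """Return only the events that were added, or whose contents changed,
    relative to the previously transmitted event list."""
    prev_by_id = {e["event_id"]: e for e in prev_list}
    return [e for e in curr_list
            if e != prev_by_id.get(e["event_id"])]
```

The server would send this shorter list instead of the full event list, and each terminal would merge it into its local copy.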
  • Step S160 Each staff terminal 40 updates the display content of the display unit 43 in response to receiving the event list from the server 20.
  • FIG. 8 is an example of the operation screens 431 and 432 for inputting a response policy to an event displayed on the display unit 43 of the staff terminal 40.
  • the operation screen 431 shown in the figure is a screen on which an event list is displayed in a list. Events to be displayed can be scrolled up and down by flicking.
  • the areas a11 and a12 of the operation screen 431 correspond to the event IDs 011 and 012 of the event list in FIG. 7, respectively.
  • in each of these areas, an icon indicating that the event is a nurse call and an indication that the response has been completed are displayed.
  • the area a13 corresponds to the event IDs 013 to 015 surrounded by a broken line frame in FIG. Since these are related to the same subject 70 (Mr. B), they are displayed together in the same area.
  • a thumbnail image is displayed in the area a13 so that it is easily distinguished from the areas of events that have been handled, since these events are unhandled. The thumbnail image is created from a still image cut out of the moving image i015 (see FIG. 7) relating to the latest event (ID 015).
  • a text message can be exchanged with another staff member 80 who is logged in.
  • the control unit 41 transitions to the operation screen 432 for content confirmation upon receiving a tap by the staff member 80 on the area a13 of the operation screen 431.
  • Step S170 The staff 80 checks the contents of the event on the operation screen 432 displayed on the staff terminal 40.
  • the ID (name: B) of the target person 70 that has caused (determined) the event and the ID of the staff 80 using the staff terminal 40 (name: staff A) are displayed.
  • the elapsed time from the event occurrence time is displayed.
  • for the target person 70 (Mr. B), the occurrence of getting-up, leaving-the-bed, and falling events has been determined in succession, and all of them are unhandled.
  • icons representing the earlier events, getting up and leaving the bed, are shown in addition to the latest event.
  • the elapsed time (“elapsed 0 minutes”) shown in the area a22 is the elapsed time (decimal fractions are rounded down) after the occurrence of the latest event is determined.
  • a character indicating “fall” and an icon are shown in the area a24.
  • a thumbnail image of the captured image at the time of the fall determination is displayed.
  • when the thumbnail image is insufficient as information and the staff member wants to confirm the situation of the subject 70 who has fallen, the staff 80 operates the "talk" button or the "view" button below it.
  • by operating the "talk" button, a call can be made with the target person 70 (Mr. B) through the voice input/output unit 15; by operating the "view" button, the live video captured by the camera 13 can be viewed by streaming playback.
  • the staff 80 decides to take charge of the displayed fall event (ID 015) by operating the "corresponding" button in the area a25. Otherwise, the staff member returns to the event list screen by operating the return button (triangle icon) in the area a26. While one staff member 80 is displaying the operation screen 432 shown in FIG. 8 and checking the state of an event such as a fall, the server 20 performs exclusion processing with respect to the other staff members 80. For example, if another staff member 80 selects the same events (ID 013-015) while the first staff member is checking the status of the event of the subject 70 (Mr. B), the characters "Checking status" are displayed on the operation screen 431 of that other staff member, and the "corresponding" button is not displayed or cannot be selected.
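The exclusion control described above, in which an event being checked by one staff member cannot be claimed by another, can be sketched as a server-side claim registry; the class and method names are assumptions for illustration:

```python
import threading

class EventClaimRegistry:
    """Server-side exclusivity: while one staff member holds an event,
    other staff members' claim attempts fail (their terminals would show
    "Checking status" instead of the claim button)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._claims = {}   # event_id -> staff_id of the current holder

    def try_claim(self, event_id, staff_id):
        """Return True if this staff member now holds (or already held)
        the event, False if another staff member holds it."""
        with self._lock:
            holder = self._claims.setdefault(event_id, staff_id)
            return holder == staff_id

    def release(self, event_id, staff_id):
        """Release the event if this staff member is the holder."""
        with self._lock:
            if self._claims.get(event_id) == staff_id:
                del self._claims[event_id]
```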
  • Steps S180, S190 The staff terminal 40 transmits a response request for the selected event to the server 20 in response to the operation of the “correspondence” button, and in response, the server 20 returns an approval notification.
  • Step S200 The server 20 updates the event list according to the processing in step S180. Specifically, the status of the event list (ID013-015: fall etc.) is changed to "corresponding". In the present embodiment, the status is changed to “corresponding” in response to the operation of the above-mentioned “corresponding” button for the event selected by the staff member 80 and displayed on the operation screen 432. In other words, by operating the "correspondence” button, a "corresponding" response status, which is information indicating that confirmation has been performed, is input, and the response date and time is recorded assuming that a response has been made. More specifically, in the event list of FIG. 7, the time at which the processes of steps S180 and S190 are performed is recorded in the corresponding date and time column.
  • alternatively, the response date and time may be recorded in the response date and time column when the staff 80 enters the room of the subject 70 who caused an event such as falling, falling down, or a nurse call; or the response may be completed by inputting the result of the response on an operation screen of the staff terminal 40, with the input time recorded in the response date and time column.
  • thereafter, the server 20 transmits the updated event list to the staff terminals 40 of all logged-in staff members 80.
  • the determination of a predetermined event that may lead to an accident for which the detection unit 10 has determined occurrence, that is, a fall or fall event includes an erroneous determination. If the criterion for determination is strict, there is a possibility that the determination may be missed when a fall or a fall actually occurs. On the other hand, if the criterion for determination is too loose, erroneous determination such as determining the occurrence of a fall or fall event is likely to occur when no fall or fall has actually occurred.
  • For this reason, the criterion is not made strict, so as to minimize missed determinations; as a result, some erroneous determinations are inevitably included.
  • The staff member 80 determines whether a determination was erroneous by checking the situation using the staff terminal 40. Specifically, by confirming the situation of the subject 70 through a call or a captured image, the staff member 80 can verify, in the case of an erroneous determination, that there is no need to actually go to the room. Even when a fall has actually occurred, there are cases where no injury results and no special action is required; in such cases as well, confirming through a call or the like that the subject 70 is safe makes a visit to the room unnecessary.
  • When the detection unit 10 determines that an event such as a fall has occurred, the event is counted in the number of occurrences even if a fall or the like did not actually occur.
  • Therefore, the number of occurrences can be said to be the number of suspected occurrences of falls and similar events.
  • In this way, event information including the response status of each event is accumulated in the event list.
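The event list entries described above can be modeled as records carrying the event type, occurrence time, response status, and response date and time. A minimal Python sketch follows; all field names, the status strings, and the example values are illustrative assumptions, not terms taken from the patent:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class EventRecord:
    """One row of the event list (all field names are illustrative)."""
    event_id: str                      # e.g. "ID013"
    subject: str                       # monitored person, e.g. "Mr. B"
    event_type: str                    # "fall", "nurse_call", ...
    occurred_at: datetime
    status: str = "unconfirmed"        # becomes "responding" after the button press
    responded_at: Optional[datetime] = None
    image_ref: Optional[str] = None    # key of the associated image data

def mark_responding(record: EventRecord, now: datetime) -> None:
    """Steps S180-S200 (sketch): a staff member confirmed the event."""
    record.status = "responding"
    record.responded_at = now

event = EventRecord("ID013", "Mr. B", "fall", datetime(2019, 10, 21, 3, 10))
mark_responding(event, datetime(2019, 10, 21, 3, 12))
```

Recording the response timestamp separately from the occurrence timestamp mirrors the two columns of the event list in FIG. 7.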
  • FIG. 9 is a flowchart illustrating the procedure of the report output process performed by the watching system 1 according to the first embodiment. The process of FIG. 9 is performed mainly by the control unit 21 of the server 20 executing a program stored in the storage unit 23.
  • Step S310 The control unit 21 of the server 20 accumulates (adds) the event information of each event whose occurrence has been determined to the event list in the storage unit 23. In addition, the image data (a still image or a moving image) captured at the time the event occurred is stored in association with the event. This accumulation is performed by the process described above (FIG. 6).
  • Step S320 The control unit 21 acquires, from the storage unit 23, the event information of fall events that may lead to an accident. This step is started, for example, by a request from a user (the staff member 80, an administrator who manages the staff members 80, or the like) through the fixed terminal 30. The user selects a specific subject 70 through the fixed terminal 30, and the control unit 21 acquires the fall event information for the designated subject 70.
  • A bed-leaving event may also be included as an event that can indirectly lead to an accident. For example, a change in the arrangement of furniture in the room, such as the bed 60 or the television, may change the sleeping posture of the subject 70 in the bed 60. As a result, a part of the body such as a leg frequently protrudes from the bed 60, and a large number of bed-leaving events may be determined. In such a case, leaving the bed can cause a fall from the bed, so the bed-leaving event indirectly leads to an accident.
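The acquisition in step S320 amounts to filtering the accumulated event list by subject and by the event types of interest. A sketch under that reading, with hypothetical sample data (the names and event-type strings are assumptions):

```python
from datetime import datetime

# Accumulated event list: (subject, event_type, occurred_at) tuples.
events = [
    ("Mr. B", "fall", datetime(2019, 10, 21, 3, 10)),
    ("Mr. B", "nurse_call", datetime(2019, 10, 21, 8, 0)),
    ("Ms. C", "fall", datetime(2019, 10, 21, 4, 5)),
    ("Mr. B", "bed_exit", datetime(2019, 10, 20, 23, 40)),
]

def events_for_subject(events, subject, event_types):
    """Step S320 (sketch): events of the given types for one subject."""
    return [e for e in events if e[0] == subject and e[1] in event_types]

# Include bed-leaving events, which can indirectly lead to an accident.
falls_b = events_for_subject(events, "Mr. B", {"fall", "bed_exit"})
```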
  • FIG. 10 is a diagram illustrating an example in which the analysis result of falls for a certain subject 70 (for example, Mr. B) is visualized.
  • This figure is a table showing, day by day, the transition of the number of fall event occurrences for each hour from 6:00 on one day to 5:00 on the following day (29:00), together with the daily totals.
  • Time zones in which one or more fall events occurred are shown in gray.
  • Step S340 The control unit 21 determines whether the number of occurrences of fall events per unit period has increased. This determination may be made on an hourly basis or on a daily basis. When it is made on a daily basis, an increase can be determined when the number of occurrences on a certain day is equal to or more than a predetermined multiple of the average number of occurrences on the preceding day or days. In the example shown in FIG. 10, the number of occurrences on October 21 is 17, which is equal to or more than a predetermined multiple of 6.5, the average of the numbers of occurrences on the preceding two days (the 19th and 20th); it can therefore be determined that the number of occurrences increased on the 21st. If there is an increase, the process proceeds to step S350; otherwise, the process proceeds to step S360.
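The daily increase test of step S340 can be sketched as follows. The threshold multiple of 2 and the two-day window are illustrative assumptions, as are the counts 6 and 7 for the two preceding days (chosen so that their average is the 6.5 of the FIG. 10 example):

```python
def increased(daily_counts, multiple=2.0, window=2):
    """Step S340 (sketch): True when the latest day's count is at least
    `multiple` times the average of the preceding `window` days."""
    if len(daily_counts) < window + 1:
        return False  # not enough history to form a baseline
    latest = daily_counts[-1]
    baseline = sum(daily_counts[-window - 1:-1]) / window
    return baseline > 0 and latest >= multiple * baseline

# Oct 19, 20, 21: baseline average is (6 + 7) / 2 = 6.5, and 17 >= 2 * 6.5,
# so an increase is reported for the 21st.
counts = [6, 7, 17]
```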
  • Step S350 The control unit 21 acquires the image data associated with events before and after the time of the increase, and displays images based on the image data on the display of the fixed terminal 30 used by the user (the staff member 80, a manager, or the like).
  • Each displayed image is a moving image (for example, several seconds to several tens of seconds long), but may instead be a still image cut out from the moving image.
  • In addition, the information on the subject 70 stored in the storage unit 23 may be displayed on the display.
  • The information on the subject 70 includes, for example, the walking state information of the subject 70 (independent, wheelchair, walker).
  • Further, the control unit 21 may analyze the images to recognize furniture and the like (bed, walker, cane) in the room, and when the arrangement of the furniture and the like differs between the two images before and after the increase, an annotation image (an enclosing frame, a callout, or the like) indicating the change may be added to the displayed image.
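The arrangement-change check above can be sketched by comparing the positions of objects recognized in the two images. The detector output format, coordinates, and tolerance below are assumptions for illustration; the patent does not specify a recognition method:

```python
from math import hypot

# Objects recognized in the "before" and "after" room images, as
# (x, y) center coordinates -- hypothetical detector output.
before = {"bed": (100, 200), "walker": (30, 40), "cane": (25, 45)}
after = {"bed": (100, 200), "walker": (180, 60), "cane": (25, 45)}

def moved_objects(before, after, tolerance=10.0):
    """Return labels whose position changed by more than `tolerance` pixels;
    these are the objects to highlight with an annotation frame."""
    moved = []
    for label, (x0, y0) in before.items():
        if label in after:
            x1, y1 = after[label]
            if hypot(x1 - x0, y1 - y0) > tolerance:
                moved.append(label)
    return moved
```

Here the walker's position differs between the two images, so it would receive the enclosing-frame annotation.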
  • Step S360 The control unit 21 receives a comment to be added to the report. For example, a comment from the user is received through the fixed terminal 30.
  • At this time, the user compares the two images, before and after the time of the increase, displayed on the display of the fixed terminal 30, and inputs the resulting judgment as a comment. For example, if the comparison shows that the arrangement of furniture or the like in the room in the image after the increase in fall events differs from that in the earlier image, it can be estimated that the number of occurrences increased because of the change in arrangement.
  • Here, the furniture and the like include not only fixed furniture such as a bed, a television, and a chest of drawers, but also walking aids such as a walker, a wheelchair, and a cane.
  • For example, the number of event occurrences may increase because the place where a cane is kept was changed to a position different from before.
  • Further, the registered walking state information (independent, wheelchair, walker) may be compared with the walking state shown in a captured image. For example, the actual walking state is determined from an image captured when the subject fell while alone in the room. If it differs from the registered walking state information and reveals that the subject 70 is walking in a dangerous manner that should be avoided, this can be cited as a reason for the increase in fall events. An example of such dangerous walking is a subject 70 who normally uses a walker walking unaided along a wall of the room.
  • Step S370 The control unit 21 creates a report based on the analysis result. If a comment was received in step S360, the comment is added to the report.
  • Step S380 The control unit 21 outputs the report by displaying it on the display of the fixed terminal 30, printing it on paper from a printer, or transmitting the report data to a destination address, according to the request.
  • FIG. 11 is a diagram illustrating an example of a report presenting the visualized analysis result.
  • FIG. 11 reflects the result of the analysis shown in FIG. 10 and shows a report on a predetermined event that may lead to an accident, for the subject 70 (Mr. B) selected by the user.
  • The table in the upper part of FIG. 11 corresponds to the diagram shown in FIG. 10.
  • The two images in the middle are captured images of the room before and after the time when the increase occurred.
  • The comment at the bottom is the one received in step S360.
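The report layout of FIG. 11 (table on top, before/after images in the middle, comment at the bottom) can be assembled as in the sketch below. The plain-text rendering and all sample values are assumptions; the patent leaves the concrete report format open:

```python
def build_report(subject, hourly_table, images, comment=None):
    """Steps S370-S380 (sketch): assemble the FIG. 11-style report as
    plain text -- a table, the before/after image references, a comment."""
    lines = [f"Fall report for {subject}", "", hourly_table]
    lines += ["", "Images before/after the increase:"] + list(images)
    if comment:  # added only when a comment was received in step S360
        lines += ["", f"Comment: {comment}"]
    return "\n".join(lines)

report = build_report(
    "Mr. B",
    "date       count\n10/19      6\n10/20      7\n10/21      17",
    ["room_before.png", "room_after.png"],
    comment="Check whether the walker's position was changed.",
)
```

The assembled text could then be displayed, printed, or transmitted as step S380 describes.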
  • As described above, in the present embodiment, event information of a predetermined event that may lead to an accident and image data associated with the event information are acquired for the subject 70; the event occurrence status is analyzed using the number of occurrences, the occurrence times, and the image data; a report visualizing the event occurrence status is created based on the analysis result; and the report is output.
  • In particular, analyzing the fall events detected by the detection unit 10 makes it possible to grasp changes in the condition of a resident (subject 70) that the facility staff 80 had not been able to notice.
  • FIG. 12 is a flowchart illustrating a report output process according to the modification.
  • Steps S410 to S440 The control unit 21 performs the same processing as steps S310 to S340 shown in FIG. 9. If there is an increase in the number of event occurrences, the process proceeds to step S465.
  • Step S465 The control unit 21 creates a comment that calls attention to the increase.
  • This comment uses a fixed phrase; for example, a comment is created asking the user to confirm whether a change in the arrangement of furniture in the room, one that led to the increase in falls, was made in the period around the increase.
  • The comment column of the report in FIG. 11 shows a comment calling attention to an increase assumed in such a case.
  • Some or all of the comments may be stored in the storage unit 23 in advance.
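Such a fixed-phrase comment can be generated by filling a stored template with the details of the detected increase. The template wording below is an illustrative assumption, not the patent's actual phrase:

```python
# A fixed-phrase template for the attention comment of step S465; in the
# modification, templates like this could be stored in the storage unit 23.
TEMPLATE = ("The number of {event} events for {subject} increased on {date}. "
            "Please check whether the arrangement of furniture or walking aids "
            "in the room was changed around that time.")

def attention_comment(subject, event, date):
    """Step S465 (sketch): fill the fixed phrase with the detected increase."""
    return TEMPLATE.format(subject=subject, event=event, date=date)

comment = attention_comment("Mr. B", "fall", "October 21")
```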
  • In the embodiment described above, the detection unit 10, the server 20, the fixed terminal 30, and the staff terminal 40 have been described as independent devices.
  • However, the present invention is not limited to this, and some of these configurations may be integrated.
  • For example, the functions of the server 20 may be integrated into the fixed terminal 30.
  • the processing in the watching system 1 may include steps other than the steps of the above-described sequence chart or flowchart, or may not include some of the above-described steps. Further, the order of the steps is not limited to the above-described embodiment. Further, each step may be executed as one step in combination with another step, may be executed by being included in another step, or may be executed by being divided into a plurality of steps.
  • the means and method for performing various processes in the watching system 1 can be realized by either a dedicated hardware circuit or a programmed computer.
  • The control program may be provided on a computer-readable recording medium such as a USB memory or a DVD-ROM (Digital Versatile Disc ROM), or may be provided online via a network such as the Internet.
  • the program recorded on the computer-readable recording medium is usually transferred and stored in a storage unit such as a hard disk.
  • the program may be provided as independent application software, or may be incorporated as one function into software of a device such as a detection unit.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Medical Informatics (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Biomedical Technology (AREA)
  • Tourism & Hospitality (AREA)
  • Child & Adolescent Psychology (AREA)
  • Human Resources & Organizations (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Signal Processing (AREA)
  • Veterinary Medicine (AREA)
  • Nursing (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Pathology (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Alarm Systems (AREA)

Abstract

The purpose of the present invention is to output a report useful for preventing the occurrence of events, by making it possible to visualize the analysis result of the event occurrence status. Event information relating to a predetermined event that may lead to an accident, and image data associated with the event information, are acquired for a subject (70); the event occurrence status is analyzed using the information on the occurrence status, or on the occurrence status and the image data; and a report visualizing the analysis result is then created and output.
PCT/JP2019/016684 2018-06-27 2019-04-18 Control program, report output method, and report output device WO2020003704A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020527229A JP7327396B2 (ja) 2018-06-27 2019-04-18 Control program, report output method, and report output device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018122274 2018-06-27
JP2018-122274 2018-06-27

Publications (1)

Publication Number Publication Date
WO2020003704A1 true WO2020003704A1 (fr) 2020-01-02

Family

ID=68987024

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/016684 WO2020003704A1 (fr) 2018-06-27 2019-04-18 Programme de commande, procédé et dispositif de délivrance de rapport

Country Status (2)

Country Link
JP (1) JP7327396B2 (fr)
WO (1) WO2020003704A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7551536B2 (ja) 2021-02-25 2024-09-17 Near-miss database generation method and near-miss database generation system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018005413A (ja) * 2016-06-29 2018-01-11 Fujitsu Limited Situation identification device, situation identification program, and situation identification method
JP2018088238A (ja) * 2016-11-18 2018-06-07 Benesse Style Care Co., Ltd. Service support device, service support method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5379923B1 (ja) 2013-04-18 2013-12-25 都子 秋葉 Care receiver information analysis support device and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018005413A (ja) * 2016-06-29 2018-01-11 Fujitsu Limited Situation identification device, situation identification program, and situation identification method
JP2018088238A (ja) * 2016-11-18 2018-06-07 Benesse Style Care Co., Ltd. Service support device, service support method, and program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7551536B2 (ja) 2021-02-25 2024-09-17 Near-miss database generation method and near-miss database generation system

Also Published As

Publication number Publication date
JP7327396B2 (ja) 2023-08-16
JPWO2020003704A1 (ja) 2021-07-15

Similar Documents

Publication Publication Date Title
JPWO2017209094A1 (ja) Watching system
JPWO2019216044A1 (ja) System and system control method
JP2024038108A (ja) Information processing device, watching system, and control program
JP2020166873A (ja) System, system control method, control program, computer-readable recording medium storing the control program, and information manager terminal
JP7234948B2 (ja) Watching system and event list display method
US20210327244A1 (en) Assistance control method and assistance system
WO2020003704A1 (fr) Control program, method and device for report output
JP7259540B2 (ja) Determination device, control program for determination device, and determination method
WO2020003715A1 (fr) Report output program, report output method, and report output device
WO2019216045A1 (fr) System and system control method
JP7268679B2 (ja) Control program, report output method, and report output device
JP2020052808A (ja) Watching device, watching system, watching program, and watching method
JP2019197262A (ja) System and system control method
WO2020003705A1 (fr) Control program, report output method, and report output device
JP2021176036A (ja) Information processing device and information processing program
JP2020126553A (ja) Watching system and control program for watching system
WO2019216066A1 (fr) System and system control method
JP7268387B2 (ja) Watching device and program for watching device
WO2020003714A1 (fr) Program, method, and device for report output
WO2019216058A1 (fr) System and system control method
WO2019239716A1 (fr) Program, method, and device for report output
JP2022189269A (ja) Information processing device, information processing system, information processing program, and control method
JP2021176035A (ja) Information processing device and information processing program
JP2021176038A (ja) Information processing device and information processing program
JPWO2019216057A1 (ja) System and system control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19826562

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020527229

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19826562

Country of ref document: EP

Kind code of ref document: A1