WO2020003758A1 - Report output program, report output method, and report output device - Google Patents

Report output program, report output method, and report output device

Info

Publication number
WO2020003758A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
staff
report
response
detected
Prior art date
Application number
PCT/JP2019/018630
Other languages
English (en)
Japanese (ja)
Inventor
寛 古川
武士 阪口
海里 姫野
恵美子 寄崎
遠山 修
藤原 浩一
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Priority to JP2020527253A priority Critical patent/JP7363780B2/ja
Publication of WO2020003758A1 publication Critical patent/WO2020003758A1/fr

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • G06Q50/22 - Social work or social welfare, e.g. community support activities or counselling services

Definitions

  • the present invention relates to a report output program, a report output method, and a report output device.
  • the present invention relates to a report output program, a report output method, and a report output device that output a report on staff corresponding to an event of a target person such as nursing care.
  • Japan's life expectancy has been remarkably prolonged owing to improvements in living standards, sanitary conditions, and medical care following the postwar economic growth. Combined with a declining birth rate, this has produced an aging society with a high aging rate. In such an aging society, the number of care recipients and other persons who need nursing or care because of illness, injury, or aging is expected to increase.
  • a schedule is set for responding to the target person, such as a care recipient, who needs to be dealt with by the staff, and the staff responds to the target person based on the schedule.
  • When a nurse call occurs, the staff members' mobile terminals or the like are notified, a staff member who is able to respond accepts the response from the mobile terminal or the like, and then responds to the nurse call.
  • As a prior art for displaying the results of the staff's responses to target persons, there is the technique described in Patent Document 1 below. A terminal carried by a staff member visiting a care receiver's home reads a near-field communication (NFC) tag, which is a recording medium placed at the visited site, and a server receives the visited position, the time at which the tag was read, and the current position detected by the terminal. When the difference between the received visited position and the current position is within a predetermined range, the times of entering and leaving the care receiver's house are recorded as the staff member's activity results. A graph of the staff member's activity results, indicating the time of the visit to the care receiver's home, is then displayed.
  • The technique of Patent Document 1 can display the time period during which the staff member was visiting the care receiver's house, but it cannot properly grasp the load imposed on the staff by responding to the care receiver, which depends on factors other than the time required for the response.
  • The present invention has been made to solve such a problem. An object of the present invention is to provide a report output program, a report output method, and a report output device capable of accurately and easily grasping the work situation of each staff member by outputting a report in which the load imposed on each staff member by responding to target persons is appropriately visualized.
  • To achieve the above object, a report output program according to the present invention causes a computer to execute a process including: (a) selecting and acquiring at least one detection parameter from among a plurality of detection parameters, detected and accumulated, relating to events of target persons handled by staff; (b) creating, based on the acquired detection parameter, a report including a result of visualizing, for each staff member, the load due to responding to the events; and (c) outputting the created report.
  • The computer has a communication unit, and the detection parameter is at least one of: a response time from when the communication unit transmits the notification of the event until the communication unit receives a response indicating acceptance of the event; the number of steps the staff member walked to respond to the event; the number of floors the staff member moved up or down to respond to the event; the content of the event; and the business hours required by the staff member to respond to the event (the report output program according to (1)).
  • In step (a), a plurality of the detection parameters are selected and obtained, and in step (b) the report is created for each staff member based on the obtained plurality of the detection parameters.
  • In step (b), coefficients corresponding to the acquired detection parameters other than the business hours are calculated, and the load is calculated by multiplying the acquired business hours by those coefficients.
  • In step (b), the report is created so as to include a result of visualizing, for each staff member, the sum of the loads for all the events that occurred during a predetermined period, based on the acquired detection parameters (the report output program according to any one of (1) to (4) above).
  • A report output device according to the present invention has: an acquisition unit that selects and acquires at least one detection parameter from among a plurality of detection parameters, detected and accumulated, relating to events of target persons handled by staff; an operation unit that creates, based on the detection parameters acquired by the acquisition unit, a report including a result of visualizing, for each staff member, the load due to responding to the events; and an output unit that outputs the report created by the operation unit.
  • According to the above, at least one detection parameter is selected and acquired from among the detection parameters relating to the events handled by the staff, and a report including a result of visualizing the load for each staff member is created based on the acquired detection parameter and output. This makes it possible to accurately and easily grasp the work status of each staff member by appropriately visualizing the load imposed on each staff member by responding to target persons.
  • FIG. 1 is a diagram illustrating the entire configuration of a watching system.
  • FIG. 2 is a diagram illustrating an example of a detection unit installed around a bed in a room of a subject.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the detection unit.
  • FIG. 4 is a block diagram illustrating a hardware configuration of a server.
  • FIG. 5 is a diagram showing an example of an event list.
  • FIG. 6 is a diagram showing an example of an additional information list.
  • FIGS. 7A to 7D are diagrams showing examples of load coefficient lists.
  • FIG. 8 is a diagram showing an example of a report including a graph in which the total workload of each staff member is visualized.
  • FIG. 9 is a block diagram illustrating a hardware configuration of an administrator terminal.
  • FIG. 10 is a block diagram illustrating a hardware configuration of a staff terminal.
  • FIG. 11 is a flowchart illustrating an operation of the server.
  • the watching system 1 includes a plurality of detection units 10, a server 20, an administrator terminal 30, and one or more staff terminals 40. These are communicably connected to each other by wire or wireless via a network 50 such as a LAN (Local Area Network), a telephone network or a data communication network.
  • The network 50 may include relay equipment that relays communication signals, such as a repeater, a bridge, a router, or a cross-connect.
  • The staff terminal 40 is communicably connected to the detection unit 10, the server 20, and the administrator terminal 30 via the network 50, for example a wireless LAN (such as a LAN according to the IEEE 802.11 standard) including an access point 51.
  • the server 20 functions as a report output device.
  • the watching system 1 is provided at an appropriate place according to the target person 70.
  • The target person 70 (watching target person) is, for example, a patient who needs nursing due to illness or injury, a care receiver who needs nursing care because of a decline in physical ability due to old age, a person living alone, or a patient hospitalized in a hospital facility.
  • the target person 70 may be a person who needs to be detected when a predetermined inconvenient event such as an abnormal state occurs in the person.
  • the watching system 1 is suitably installed in buildings such as welfare facilities for the elderly, hospitals, and dwelling units, depending on the type of the subject 70.
  • The watching system 1 is disposed in a facility building including a plurality of living rooms occupied by a plurality of subjects 70 and a plurality of rooms such as a nurse station.
  • the detection unit 10 is arranged in each living room, which is the observation area of the subject 70.
  • The four detection units 10 are arranged in the rooms of the subjects 70, namely Mr. A, Mr. B, Mr. C, and Mr. D, respectively.
  • the bed 60 is included in the observation area of the detection unit 10.
  • The staff 80 who performs care such as nursing or caregiving for the target person 70 carries the staff terminal 40, which is a portable terminal.
  • the server 20 may not be located at the nurse station, and may be an external server unit connected to the network 50.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the detection unit.
  • the detection unit 10 includes a control unit 11, a communication unit 12, a camera 13, a nurse call unit 14, and a voice input / output unit 15, which are interconnected by a bus.
  • the control unit 11 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and a memory such as a ROM (Read Only Memory), and performs control and arithmetic processing of each unit of the detection unit 10 according to a program.
  • The control unit 11 may further include an HDD (Hard Disk Drive) as a memory.
  • the communication unit 12 is an interface circuit (for example, a LAN card) for communicating with another device such as the server 20, the administrator terminal 30, or the staff terminal 40 via the network 50.
  • the camera 13 is arranged, for example, on the ceiling of a living room or on an upper part of a wall, and captures an area including the bed 60 of the subject 70 as an observation area, and outputs a captured image (image data).
  • Hereinafter, an image (image data) captured by the camera 13 is also simply referred to as a "captured image".
  • the captured image includes an image including the subject 70.
  • the captured image includes a still image and a moving image.
  • The camera 13 is a near-infrared camera; however, a visible-light camera may be used instead, or the two may be used in combination.
  • the control unit 11 recognizes the behavior of the target person 70 from the image captured by the camera 13.
  • The actions to be recognized include "getting up" from the bed 60, "leaving the bed" away from the bed 60, "falling" off the bed 60, and "falling down" onto the floor or the like.
  • The control unit 11 detects the silhouette of a person (hereinafter referred to as a "human silhouette") from a plurality of captured images (a moving image).
  • the human silhouette can be detected by, for example, extracting a range of pixels having a relatively large difference by a time difference method of extracting a difference between images whose shooting time is before and after.
  • The human silhouette may also be detected by a background difference method that extracts the difference between the captured image and a background image. Whether the subject is getting up, leaving the bed, has fallen from the bed, or has fallen down is recognized from the detected human silhouette on the basis of the posture of the subject 70 (for example, standing, sitting, or lying down) and the relative position with respect to installed objects in the living room such as the bed 60.
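  • As a concrete illustration of the time-difference method mentioned above, the following is a minimal sketch assuming OpenCV and NumPy are available; the threshold and kernel sizes are illustrative assumptions rather than values from this description, and the recognition in the embodiment may be implemented differently by the control unit 11 or the server 20.

```python
import cv2
import numpy as np

def detect_human_silhouette(prev_frame: np.ndarray, curr_frame: np.ndarray,
                            diff_threshold: int = 25) -> np.ndarray:
    """Time-difference method: extract pixels that changed markedly between
    two consecutive frames and return a binary silhouette mask."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)                    # inter-frame difference
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    # Suppress noise and fill small gaps so the largest blob approximates the person
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))
    return mask
```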
  • Such recognition may be performed by a program processed by the CPU of the control unit 11, or may be performed by a built-in processing circuit.
  • the present invention is not limited to this, and the server 20 may perform all or most of these recognitions, and the control unit 11 may transmit only the captured image to the server 20.
  • the control unit 11 transmits a notification to the effect that an event has occurred to the server 20 or the like.
  • the staff 80 is a person who performs various responses to the target person 70 according to the business. Services can include medical services and nursing care services.
  • The work of the staff 80 is care work for the target person 70.
  • The response contents for each event will now be described. "Getting up" is determined as an event, and if it is determined within a predetermined time period (a wake-up time set at the facility, for example, 7 to 8 a.m.), morning care is performed. This morning care includes face washing, tooth-brushing assistance, denture wearing, assistance with changing clothes, and the like.
  • a wheelchair transfer or walking assistance may be required.
  • an alert may be generated at a fixed time by the nurse call unit 14 or the like.
  • the nurse call unit 14 includes a push-button switch, and detects a nurse call when the switch is pressed by the subject 70. Nurse calls include care calls and the like. The nurse call may be detected by a voice microphone instead of the push button type switch.
  • the control unit 11 sends a notification (nurse call notification) indicating that there is a nurse call via the communication unit 12 and the network 50. It is transmitted to the server 20 or the like.
  • the voice input / output unit 15 is, for example, a speaker and a microphone, and enables voice communication by transmitting and receiving a voice signal to / from the staff terminal 40 or the like via the communication unit 12.
  • the voice input / output unit 15 may be connected to the detection unit 10 via the communication unit 12 as an external device of the detection unit 10.
  • The detection unit 10 may further include a Doppler-shift type body motion sensor that transmits and receives microwaves toward the bed 60 and detects the Doppler shift of the microwaves caused by body movements (for example, respiratory movements) of the subject 70.
  • The body motion sensor detects the movement of the chest (its up-and-down movement) accompanying the breathing of the subject 70, and when a disturbance of the cycle of the chest movement or a chest-movement amplitude equal to or less than a preset threshold value is detected, an abnormality of micro body motion is recognized.
  • the event broadly includes an event in which the staff 80 should deal with the target person.
  • The events include (1) an event that is notified (issued) to the staff 80 and should be handled by the staff 80 (hereinafter referred to as a "first event"), (2) an event that should be handled by the staff 80 at a predetermined time (hereinafter referred to as a "second event"), and (3) an event that should be handled when the staff 80 notices it (hereinafter referred to as a "third event").
  • The first event includes, for example, a nurse call, as well as getting up, leaving the bed, falling from the bed, falling down, and abnormal body movement.
  • The second event includes, for example, transfer to a wheelchair at meal times, bathing, changing clothes, and taking a walk.
  • The third event includes, for example, excretion care verbally requested by the subject 70, measurement of body temperature when fever is suspected, and response to vomiting.
  • the detection unit 10 transmits (outputs) information on the detected event (hereinafter, also simply referred to as “event information”) and the captured image to the server 20.
  • FIG. 4 is a block diagram illustrating a hardware configuration of the server.
  • the server 20 includes a control unit 21, a communication unit 22, and a storage unit 23.
  • the server 20 may be provided in the same building as the living room for the subject 70, or may be provided in a remote location and connectable via a network.
  • the server 20 may be a cloud server virtually constructed by a plurality of servers arranged on a network such as the Internet.
  • the components are communicably connected to each other by a bus.
  • the control unit 21 functions as an acquisition unit in cooperation with the communication unit 22.
  • the control unit 21 functions as a calculation unit.
  • the control unit 21 also functions as an output unit in cooperation with the communication unit 22.
  • the storage unit 23 stores the program according to the present embodiment.
  • the above components of the server 20 are controlled by the control unit 21 according to a program.
  • the storage unit 23 stores and accumulates various kinds of information on the target person 70 and the staff 80.
  • the information on the subject 70 includes an event list including event information and a care record.
  • the information on the staff 80 includes an additional information list associated with the event information.
  • In the additional information list, the following are registered: the content (type) of each event handled by the staff 80; the staff member who responded to the event; the response time, that is, the time from when a notification of the event (hereinafter referred to as an "event notification") is transmitted to the staff terminal 40 until a response indicating acceptance of the event is received; the number of steps the staff 80 walked to respond to the event; the number of floors the staff 80 moved up or down to respond to the event; and the business hours, that is, the working time required to respond to the event.
  • the content of the event, the response time, the number of steps, the number of floors, the business hours, and the like constitute the detection parameters.
  • the additional information list further registers the event ID of the event for which the detection parameter has been detected. By registering the event ID in the additional information list, the additional information list is associated with the event list. Note that the information registered in the additional information list may be registered in the event list. In this case, the additional information list becomes unnecessary.
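  • To make the relationship between the event list (FIG. 5) and the additional information list (FIG. 6) concrete, the following is a minimal Python sketch of the two records and their association through the event ID; the field names are illustrative assumptions, while the items themselves (event ID, room number, subject, content, occurrence date and time, staff, response time, number of steps, number of floors, business hours) follow the description above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class EventRecord:
    """One row of the event list (FIG. 5): event information of a detected event."""
    event_id: int
    room_number: str
    subject: str            # target person 70 (name, ID, or room number)
    content: str            # e.g. "nurse call", "meal", "excretion care"
    occurred_at: datetime   # occurrence date and time

@dataclass
class AdditionalInfoRecord:
    """One row of the additional information list (FIG. 6), linked by event_id."""
    event_id: int                     # associates this record with the event list
    staff: str                        # staff member who responded
    content: str                      # content (type) of the event
    response_time_s: Optional[float]  # seconds; None when no event notification was sent
    steps: Optional[int]              # number of moving steps
    floors: Optional[int]             # number of floors moved up or down
    business_hours: float             # working time required for the response, in hours
```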
  • The control unit 21 and the communication unit 22 otherwise have functions similar to those of the corresponding components of the detection unit 10, and a detailed description thereof is therefore omitted.
  • FIG. 5 is a diagram showing an example of an event list.
  • FIG. 6 is a diagram illustrating an example of the additional information list.
  • event information of each event that has occurred is registered.
  • an image in which the action corresponding to the event is recognized by the detection unit 10 may be further registered as a tagged image.
  • the tagged image is a captured image associated with the event, for example, a captured image in which the event has been detected.
  • the server 20 may retain the tagged image without erasing it until instructed to do so.
  • an event ID, a room number, a subject, details of an event, and an occurrence date and time are registered as event information in the event list.
  • In this example, the name is used as the identification information for specifying the target person, but an ID or the room number of the target person may be used instead.
  • Event IDs 110, 111, and 115 are first events, event IDs 112 to 114 are second events, and event IDs 116 and 117 are third events.
  • The event IDs 115 to 117 represent a series of events: this is an example in which, as a result of visiting Mr. A's room in response to the nurse call of event ID 115, it became necessary to handle the excretion care events of event IDs 116 and 117.
  • The occurrence date and time is detected as follows. For a first event, the event is detected by the detection unit 10 or the like, and the date and time at which the server 20 receives the notification that the event has occurred is detected as the occurrence date and time. For a second event, acceptance of the response to the event is input by the staff 80 at the staff terminal 40, and the date and time at which the server 20 receives the acceptance is detected as the occurrence date and time. For a third event, information on the occurrence of the event is input at the staff terminal 40, and the date and time at which the server 20 receives the information is detected as the occurrence date and time. Note that the event occurrence date and time may be detected by other methods.
  • A second event is scheduled in advance in a schedule table or the like. Therefore, the walking direction and walking speed of the staff 80 may be detected using the GPS function or the like of the staff terminal 40; when the analysis shows a tendency of the staff 80 to move straight at a constant speed toward the room of the subject 70 in charge around the scheduled occurrence date and time, it is determined that the event occurred and was handled according to the schedule, and the date and time set in the schedule can be detected as the occurrence date and time.
  • the contents of the event, the staff corresponding to the event, the response time, the number of steps, the number of floors, and the business hours are registered in the additional information list.
  • The content of the event is detected by the server 20 when, for example, the content input by the staff 80 at the input unit 44 (see FIG. 10) of the staff terminal 40 while responding to the event is transmitted from the staff terminal 40 to the server 20.
  • The content of the event may also be the content detected by the detection unit 10 as described above; for example, events such as getting up, leaving the bed, falling from the bed, falling down, and abnormal body movement can be detected by the detection unit 10.
  • an event notification is transmitted from the server 20 to the staff terminal 40.
  • The response time is, for example, the time from when the event notification is transmitted to the staff terminal 40 carried by the staff 80 until the server 20 receives the response of acceptance of the event input by the staff 80 at the input unit 44 of the staff terminal 40; it is detected by the server 20 measuring this time with a timer or the like.
  • the number of moving steps is measured by, for example, a step number sensor 46 (see FIG. 10) provided in the staff terminal 40, and is detected by the server 20 when the server 20 receives it from the staff terminal 40.
  • The number of moving steps can also be detected as the number of steps from when the server 20 receives the response indicating acceptance of the event until the staff 80 is detected by the detection unit 10 provided in the room of the subject 70.
  • The arrival of the staff 80 may be detected when the number of persons in the living room detected by the detection unit 10 changes from one to two.
  • Alternatively, the number of steps from when the server 20 receives the response of acceptance of the event until the completion of the response to the event is input by the staff 80 at the staff terminal 40 and received by the server 20 may be detected as the number of moving steps.
  • the moving floor number is measured by the barometric pressure sensor 47 of the staff terminal 40, and is detected by the server 20 when the server 20 receives it from the staff terminal 40.
  • The number of moving floors can be measured by the barometric pressure sensor because the barometric pressure decreases according to the number of floors when the staff 80 goes up and increases according to the number of floors when the staff 80 goes down. The number of moving floors is measured over the period from when the response of acceptance of the event is input by the staff 80 at the staff terminal 40 and received by the server 20 until the completion of the response to the event is input by the staff 80 at the staff terminal 40 and received by the server 20.
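  • As a rough illustration of how the barometric pressure readings can be converted into a number of moving floors, the following sketch assumes a pressure gradient of roughly 0.12 hPa per meter near ground level and a floor height of about 3 m; these constants and the function name are assumptions for illustration and are not specified in this description.

```python
HPA_PER_METER = 0.12      # approximate pressure gradient near ground level
FLOOR_HEIGHT_M = 3.0      # assumed height of one floor

def floors_moved(pressure_at_accept_hpa: float, pressure_at_complete_hpa: float) -> int:
    """Estimate the signed floor count from the pressure readings taken when the
    response was accepted and when it was completed.
    Positive = moved up (pressure decreased), negative = moved down."""
    delta_hpa = pressure_at_accept_hpa - pressure_at_complete_hpa
    altitude_change_m = delta_hpa / HPA_PER_METER
    return round(altitude_change_m / FLOOR_HEIGHT_M)

# Example: a drop of about 0.36 hPa corresponds to moving up roughly one floor
assert floors_moved(1013.25, 1012.89) == 1
```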
  • In the additional information list, no response time is recorded for the meal event. This is because the meal event is a second event to which the staff 80 responds at a fixed time, and no event notification is transmitted when the event occurs. Likewise, no response time is recorded for event IDs 116 and 117, because event IDs 115 to 117 form a series of events and the staff 80 had already arrived at Mr. A's room when responding to the nurse call event of event ID 115.
  • the information registered in the additional information list is used to calculate a load (hereinafter, also referred to as a “business load”) for each staff member 80 in response to each event, as described later.
  • The server 20, alone or in cooperation with the detection unit 10, uses the event information and the captured image received from the detection unit 10 to determine (identify) to which target person 70 an event recognized by the detection unit 10, such as getting up, leaving the bed, falling, or a nurse call, relates. This determination is made by identifying the target person 70 (that is, the resident of the room) associated with the room number in which the detection unit 10 that recognized the event is installed. When the target person 70 carries an IC tag, the determination may instead be made by reading the IC tag with an RFID reader provided in each living room. When a plurality of subjects 70 are present in one living room, such as in a shared room, the subject 70 may be determined by arranging a detection unit 10 for each bed 60.
  • When the server 20 receives event information from the detection unit 10, the server 20 transmits an event notification including the name of the subject 70 who caused the event, the room number, and the content of the event to the staff terminal 40 to notify the staff of the occurrence of the event and instruct them to respond to it.
  • the server 20 calculates the work load for each staff member 80, and creates and outputs a report in which the work load is visualized for each staff member 80.
  • Reports can be generated and output periodically (for example, daily, weekly, or monthly), or they may be generated and output irregularly each time an instruction is given.
  • Outputting the report includes transmitting the report data to the administrator terminal 30 or the like and displaying the report on its display unit, transmitting the data to an image forming apparatus (not shown) connected to the server 20 and printing the report, and uploading the report data to a location (for example, a cloud server) where the report can be viewed from the administrator terminal 30 or the like via a browser.
  • FIGS. 7A to 7D are diagrams showing examples of the load coefficient list.
  • the load coefficient list is a list in which coefficients for calculating the business load are registered for each detection parameter.
  • the load coefficient list is stored in the storage unit 23 of the server 20.
  • FIG. 7A to 7D show coefficients corresponding to the detection parameters of response time, number of steps, number of floors, and contents of the event.
  • The coefficient corresponding to the business-hours detection parameter (business time coefficient) is the value of the business hours itself (for example, 1 for one hour).
  • FIG. 7A shows a response time coefficient set for each response time.
  • FIG. 7B shows a moving step number coefficient set for each moving step number.
  • FIG. 7C shows the moving floor coefficient set for each moving floor.
  • FIG. 7D shows an event content coefficient set for each event content.
  • the business load for each event is calculated as the product of the business time required for the event and a coefficient corresponding to a detection parameter related to the event.
  • the work load for each event is calculated by the following equation.
  • Since the coefficient corresponding to the business hours (business time coefficient) is the value of the business hours itself, the business load for each event can also be regarded as the product of the coefficient corresponding to the business time required for the event (business time coefficient) and the coefficients corresponding to the detection parameters other than the business time that relate to the event.
  • Business load = business time (business time coefficient) × response time coefficient × moving step coefficient × moving floor coefficient × event content coefficient … (1)
  • the response time detection parameter is not registered for the event IDs 112 to 114, 116, and 117.
  • For a detection parameter that is not registered, the business load is calculated with the corresponding coefficient set to 1. Alternatively, the coefficient of an unregistered detection parameter may simply be omitted from the above equation as a factor by which the business hours are multiplied.
  • The shorter the response time, the larger the response time coefficient; that is, the shorter the response time, the greater the business load. This is because the response time can be correlated with the business load: in order to respond to an event notification in a short time, the staff 80 must maintain a certain level of tension throughout working hours.
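  • The calculation of equation (1) can be sketched as follows. The coefficient tables below are illustrative assumptions chosen only to be consistent with the worked example of event IDs 113 and 115 described later (for example, 1.4 for a meal, 1.0 for a nurse call, 2.0 for a 5-second response time, 1.2 for about 100 to 150 steps, and 1.2 for one floor); the actual load coefficient lists of FIGS. 7A to 7D may differ. Unregistered detection parameters default to a coefficient of 1, as described above.

```python
from typing import Optional

# Illustrative coefficient tables in the spirit of FIGS. 7A-7D (values are assumptions).
EVENT_CONTENT_COEF = {"nurse call": 1.0, "meal": 1.4}

def response_time_coef(seconds: Optional[float]) -> float:
    if seconds is None:                     # no event notification was sent -> coefficient 1
        return 1.0
    return 2.0 if seconds <= 10 else 1.0    # shorter response -> larger coefficient

def steps_coef(steps: Optional[int]) -> float:
    if steps is None:
        return 1.0
    return 1.2 if steps >= 100 else 1.0

def floors_coef(floors: Optional[int]) -> float:
    if floors is None or floors == 0:
        return 1.0
    return 1.2                              # one floor up or down

def business_load(business_hours: float, content: str,
                  response_time_s: Optional[float],
                  steps: Optional[int], floors: Optional[int]) -> float:
    """Equation (1): business hours multiplied by the coefficients of the
    other acquired detection parameters (missing parameters count as 1)."""
    return (business_hours
            * EVENT_CONTENT_COEF.get(content, 1.0)
            * response_time_coef(response_time_s)
            * steps_coef(steps)
            * floors_coef(floors))
```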
  • the load for each staff 80 visualized in the report may be the sum of the business loads calculated for all the events corresponding to the staff 80 during a predetermined period (for example, one day).
  • FIG. 8 is a diagram showing an example of a report including a graph in which the total of the workload of each staff is visualized.
  • FIG. 8 shows a report including a graph that visualizes, for each staff member, the sum of the business loads calculated for all the events handled by staff A, staff B, and staff C in one day. According to FIG. 8, it can be grasped at a glance that the load on staff C is relatively large and the load on staff B is relatively small.
  • The report may include a table in which the sum of the business loads calculated for all the events handled by each staff member during the day is indicated as a numerical value.
  • The report may also include a figure in which the sum of the business loads calculated for all the events handled by each staff member during the day is visualized so as to be recognizable for each staff member, for example by the size of the figure.
  • FIG. 9 is a block diagram illustrating a hardware configuration of the administrator terminal.
  • the administrator terminal 30 is a so-called PC (Personal Computer), and includes a control unit 31, a communication unit 32, a display unit 33, and an input unit 34, which are mutually connected by a bus.
  • the control unit 31 includes a CPU, a RAM, a ROM, and the like as the same configuration as the control unit 11 of the detection unit 10.
  • The communication unit 32 includes a network interface for wired communication based on standards such as Ethernet (registered trademark) and interfaces for various local connections, such as wireless communication based on standards such as Bluetooth (registered trademark) and IEEE 802.11, and communicates with each device connected to the network 50.
  • the display unit 33 is, for example, a liquid crystal display, and displays various information.
  • the input unit 34 includes a keyboard, a numeric keypad, a mouse, and the like, and inputs various information.
  • the administrator terminal 30 is used as a terminal for the administrator 90.
  • the manager 90 is, for example, a manager who supervises the staff 80.
  • The administrator terminal 30 accepts the specification of the occurrence period of the events to be covered by the report, and transmits it to the server 20.
  • the administrator terminal 30 can receive the specification of the event occurrence period as the specification of the event to be output in the report.
  • The specified event occurrence period may be one day (for example, the previous day).
  • the specified event occurrence period may be one week, one month, one year, or the like. For example, by specifying one week, an event that occurred in the past week becomes a target of the report.
  • When a report is periodically (for example, daily, weekly, or yearly) output from the server 20, the administrator terminal 30 need not accept the specification of the event occurrence period to be reported.
  • The administrator terminal 30 also accepts the selection of the detection parameters used for calculating the business load and transmits the selection to the server 20.
  • the administrator terminal 30 displays the detection parameter selection candidates together with buttons for selecting each detection parameter, and can accept the detection parameter selected by the button as the detection parameter used for calculating the workload.
  • FIG. 10 is a block diagram illustrating a hardware configuration of the staff terminal.
  • the staff terminal 40 includes a control unit 41, a wireless communication unit 42, a display unit 43, an input unit 44, a voice input / output unit 45, a step count sensor 46, and a barometric pressure sensor 47, which are interconnected by a bus.
  • the control unit 41 includes a CPU, a RAM, a ROM, and the like as the same configuration as the control unit 11 of the detection unit 10.
  • the wireless communication unit 42 enables wireless communication using a standard such as Wi-Fi or Bluetooth (registered trademark), and wirelessly communicates with each device via the access point 51 or directly.
  • the display unit 43 and the input unit 44 are touch panels, in which a touch sensor as the input unit 44 is superimposed on the display surface of the display unit 43 formed of liquid crystal or the like.
  • Various instructions are displayed to the staff 80 by the display unit 43 and the input unit 44.
  • The display unit 43 and the input unit 44 display an operation screen on which event notifications are shown, and accept various operations through the operation screen, such as input of a response of acceptance of an event and input of a care record.
  • the voice input / output unit 45 is, for example, a speaker and a microphone, and enables voice communication by the staff 80 with another staff terminal 40 via the wireless communication unit 42.
  • the staff terminal 40 can be configured by a portable communication terminal device such as a tablet computer, a smartphone, or a mobile phone.
  • the step number sensor 46 is configured by an acceleration sensor or the like, and detects the number of steps of the staff 80 carrying the staff terminal 40.
  • The barometric pressure sensor 47 is configured by a MEMS (Micro Electro Mechanical Systems) device or the like, and detects the barometric pressure around the staff 80 carrying the staff terminal 40.
  • The detection unit 10, the server 20, the administrator terminal 30, and the staff terminal 40 may include components other than the above components, or may not include some of the above components.
  • FIG. 11 is a flowchart showing the operation of the server. The processing of this flowchart is executed by the control unit 21 according to a program stored in the storage unit 23 of the server 20.
  • the control unit 21 acquires business hours for each event of the target person 70 that has occurred during the predetermined period and that the staff 80 has dealt with (S101).
  • the control unit 21 can acquire the business hours of each event corresponding to each staff 80 by reading the business hours from the additional information list stored in the storage unit 23.
  • In the following, for simplicity, the case of outputting a report in which the load of staff C (see FIG. 6) due to responding to events is visualized will be described; in practice, a report in which the load due to responding to events is visualized for each staff 80 is output.
  • the control unit 21 selects and acquires, from the accumulated detection parameters, a detection parameter to be used for calculating the work load, for each event corresponding to the staff C (S102).
  • the selection of the detection parameter is executed, for example, according to the selection of the detection parameter received at the administrator terminal 30.
  • the control unit 21 can acquire the selected detection parameter by reading it from the additional information list stored in the storage unit 23. For example, the number of moving steps, the number of moving floors, and the content of the event can be selected and acquired from the detection parameters.
  • the description will be made on the assumption that all of the detection parameters registered in the additional information list have been selected.
  • the control unit 21 calculates a coefficient for each detection parameter acquired for each event corresponding to the staff C.
  • the control unit 21 calculates a business load for each event by multiplying the business time by a coefficient calculated from each detection parameter for each event (S103).
  • The processing of step S103 will be described using the event ID 113 and the event ID 115 shown in FIG. 6 as examples.
  • the detection parameters for the event ID 113 are that the content of the event is “meal”, the number of moving steps is “150 steps”, and the number of moving floors is “down one floor”.
  • the working time is 0.8 hours.
  • The coefficients calculated from the respective detection parameters are 1.4 for the content of the event, 1.2 for the number of steps, and 1.2 for the number of floors. Therefore, the business load for the event ID 113 is calculated as 1.6128 (0.8 (hours) × 1.4 × 1.2 × 1.2) by the above equation (1).
  • Since no response time is registered for the event ID 113, the business load is calculated with the response time coefficient set to 1.
  • the detection parameters for the event ID 115 are that the content of the event is “nurse call”, the response time is “5 seconds”, the number of steps is “100”, and the number of floors is “down one floor”.
  • The business time is 0.1 hours, and the coefficients calculated from the respective detection parameters are 1.0 for the content of the event, 2.0 for the response time, 1.2 for the number of steps, and 1.2 for the number of floors. The business load for the event ID 115 is therefore calculated as 0.288 (0.1 (hours) × 1.0 × 2.0 × 1.2 × 1.2) by equation (1).
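  • Using the sketch given after equation (1), and assuming its illustrative coefficient tables, the two worked examples above can be reproduced as follows:

```python
# Event ID 113: meal, 150 steps, 1 floor down, 0.8 business hours, no response time
print(round(business_load(0.8, "meal", None, 150, -1), 4))        # 1.6128

# Event ID 115: nurse call, 5 s response, 100 steps, 1 floor down, 0.1 business hours
print(round(business_load(0.1, "nurse call", 5, 100, -1), 3))     # 0.288
```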
  • the control unit 21 calculates the sum of the business loads of all the events corresponding to the predetermined period for each staff 80 (S104).
  • Staff C handled the events of event IDs 110, 113, and 115 to 117 during the day. Therefore, if the predetermined period is one day and the events handled during that period are only the events with event IDs 110, 113, and 115 to 117, the sum of the business loads calculated for these events is calculated as the total business load of staff C.
  • the control unit 21 creates a report including a diagram or a table in which the total (load) of the business loads is visualized for each staff 80 (S105).
  • the report may include a comment by the manager 90 on the work load of each staff member.
  • The comment can be input by the administrator 90 at the administrator terminal 30 and transmitted to the server 20, for example.
  • the control unit 21 outputs the created report (S106).
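  • Putting steps S101 to S106 together, the following sketch (reusing the illustrative AdditionalInfoRecord and business_load defined earlier) shows how the per-staff totals visualized in the report could be computed; it is an outline under those assumptions, not the embodiment's actual implementation.

```python
from collections import defaultdict
from typing import Iterable

def staff_load_totals(records: Iterable[AdditionalInfoRecord]) -> dict[str, float]:
    """S101-S104: acquire the detection parameters of each handled event,
    calculate the business load per event, and sum the loads per staff member."""
    totals: dict[str, float] = defaultdict(float)
    for rec in records:
        totals[rec.staff] += business_load(rec.business_hours, rec.content,
                                           rec.response_time_s, rec.steps, rec.floors)
    return dict(totals)

# S105-S106 would then visualize these totals (for example, as the bar graph of FIG. 8)
# and output the resulting report.
```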
  • This embodiment has the following effects.
  • the detection parameter is at least one of the response time, the number of steps, the number of floors, the content of the event, and the business hours. This makes it possible to more appropriately and efficiently grasp the load of each staff.
  • The coefficients corresponding to the detection parameters other than the business hours are calculated from those parameters, the load is calculated by multiplying the business time by the coefficients, and a report including a diagram or table in which the calculated load is visualized for each staff member is created. This makes it possible to grasp the load of each staff member even more easily.
  • the configuration of the watching system described above is a main configuration in describing the features of the above embodiment, and is not limited to the above configuration, and can be variously modified within the scope of the claims.
  • the configuration provided in a general watching system is not excluded.
  • the function of the server 20 may be provided in the administrator terminal 30 or the detection unit 10.
  • The detection unit 10, the server 20, the administrator terminal 30, and the staff terminal 40 may each be configured by a plurality of devices, or any plurality of these devices may be configured as a single device.
  • steps may be omitted from the flowchart described above, and other steps may be added. Further, some of the steps may be executed simultaneously, or one step may be divided into a plurality of steps and executed.
  • the detection parameters may include parameters other than those exemplified in the embodiment.
  • The time of day at which the response is performed may also be used as a detection parameter, with the daytime coefficient set to 1 and the nighttime coefficient set to a value larger than 1.
  • In the above embodiment, the value of the business time itself is used as the business time coefficient, but a value other than the business time value may be used as the coefficient (business time coefficient). For example, to increase the contribution of the business hours to the business load, the coefficient may be set to 1.1 or the like per business hour; to reduce the contribution, the coefficient can be set to 0.9 or the like per business hour.
  • the means and method for performing various processes in the watching system 1 according to the above-described embodiment can be realized by either a dedicated hardware circuit or a programmed computer.
  • the above program may be provided by a computer-readable recording medium such as a USB memory or a DVD (Digital Versatile Disc) -ROM, or may be provided online via a network such as the Internet.
  • the program recorded on the computer-readable recording medium is usually transferred and stored in a storage unit such as a hard disk.
  • the program may be provided as independent application software, or may be incorporated as one function into software of a device such as a detection unit.

Landscapes

  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Primary Health Care (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • Marketing (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Biomedical Technology (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Medical Informatics (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The problem addressed by the present invention is to provide a report output program with which the working conditions of each staff member can be grasped accurately and easily by outputting a report that appropriately visualizes the loads that have been imposed on each staff member in connection with the target persons they have handled. The solution is a report output program for causing a computer to execute a process comprising: a procedure (a) for selecting and acquiring at least one detection parameter from among a plurality of detected and stored detection parameters relating to events that are associated with target persons and that have been handled by a staff member; a procedure (b) for creating, on the basis of the acquired detection parameter, a report that includes the results of visualizing the loads imposed on each staff member in connection with the handled events; and a procedure (c) for outputting the created report.
PCT/JP2019/018630 2018-06-26 2019-05-09 Programme de délivrance de rapport, procédé de délivrance de rapport et dispositif de délivrance de rapport WO2020003758A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020527253A JP7363780B2 (ja) 2018-06-26 2019-05-09 レポート出力プログラム、レポート出力方法、およびレポート出力装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-120747 2018-06-26
JP2018120747 2018-06-26

Publications (1)

Publication Number Publication Date
WO2020003758A1 true WO2020003758A1 (fr) 2020-01-02

Family

ID=68985630

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/018630 WO2020003758A1 (fr) 2018-06-26 2019-05-09 Programme de délivrance de rapport, procédé de délivrance de rapport et dispositif de délivrance de rapport

Country Status (2)

Country Link
JP (1) JP7363780B2 (fr)
WO (1) WO2020003758A1 (fr)



Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004157614A (ja) 2002-11-01 2004-06-03 Advanced Telecommunication Research Institute International 行動分析装置
WO2017039018A1 (fr) 2015-09-03 2017-03-09 株式会社ニコン Dispositif de gestion de travail, système de gestion de travail et programme de gestion de travail

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001325362A (ja) * 2000-05-15 2001-11-22 Hitachi Plant Eng & Constr Co Ltd 労働負荷管理システム
JP2009098979A (ja) * 2007-10-17 2009-05-07 Kagoshima Medical It Center Co Ltd 看護ケア量表示装置、看護ケア量表示プログラム及び看護ケア量の表示方法
JP2012168575A (ja) * 2011-02-09 2012-09-06 Nec Corp 業務負荷判定装置、業務負荷判定システム、業務負荷判定方法およびプログラム
JP2013168099A (ja) * 2012-02-17 2013-08-29 Osaka Gas Co Ltd 対象業務負荷評価システム
JP2017191611A (ja) * 2012-11-30 2017-10-19 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 情報処理方法
JP2015219551A (ja) * 2014-05-14 2015-12-07 株式会社シーイー・フォックス 医療情報処理システム、医療情報処理方法、および医療情報処理プログラム
JP2018032226A (ja) * 2016-08-25 2018-03-01 株式会社ニコン 介護スケジュール管理装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022087040A (ja) * 2020-11-30 2022-06-09 株式会社プロジェクト アイ 支援システムおよびコンピュータプログラム
JP7144883B2 (ja) 2020-11-30 2022-09-30 株式会社プロジェクト アイ 支援システムおよびコンピュータプログラム
JP2022168333A (ja) * 2020-11-30 2022-11-07 株式会社プロジェクト アイ 支援システムおよびコンピュータプログラム
JP7430927B2 (ja) 2022-03-30 2024-02-14 三栄通信工業株式会社 情報処理装置及びプログラム

Also Published As

Publication number Publication date
JPWO2020003758A1 (ja) 2021-07-08
JP7363780B2 (ja) 2023-10-18

Similar Documents

Publication Publication Date Title
WO2020003758A1 (fr) Programme de délivrance de rapport, procédé de délivrance de rapport et dispositif de délivrance de rapport
JPWO2017183603A1 (ja) 被監視者監視システムおよび被監視者監視方法
JP2018073376A (ja) 生活見守り装置
WO2020003715A1 (fr) Programme de délivrance de rapport, procédé de délivrance de rapport et dispositif de délivrance de rapport
JP2019197263A (ja) システム、およびシステムの制御方法
JP2019197262A (ja) システムおよびシステムの制御方法
JP2021176036A (ja) 情報処理装置および情報処理プログラム
WO2020003616A1 (fr) Programme, procédé, et dispositif de sortie de rapport
JP2021196889A (ja) ケア情報処理プログラム、ケア情報処理装置、およびケア情報処理システム
WO2020003714A1 (fr) Programme, procédé et dispositif de délivrance de rapport
WO2019138915A1 (fr) Dispositif de visualisation d'événement de soins, système de visualisation d'événement de soins et procédé de visualisation d'événement de soins
JP2021015497A (ja) 制御装置、制御プログラム、および制御方法
JP7467869B2 (ja) 制御プログラム、情報処理装置、および情報処理システム
JP7404682B2 (ja) 制御装置、制御プログラム、および制御方法
JP7251546B2 (ja) レポート出力プログラム、レポート出力方法、およびレポート出力装置
JP7484393B2 (ja) 情報処理装置および情報処理プログラム
JP2020052808A (ja) 見守り装置、見守りシステム、見守りプログラム、および見守り方法
JP7354634B2 (ja) 制御装置、制御プログラム、および制御方法
JP2023108725A (ja) 表示装置、システム、表示方法およびプログラム
JP7147787B2 (ja) 被監視者監視支援装置、被監視者監視支援方法、および、被監視者監視支援プログラム
JP7415434B2 (ja) 情報共有装置、情報共有システム、および情報共有プログラム
WO2020003706A1 (fr) Programme de commande, procédé de sortie de rapport et dispositif de sortie de rapport
WO2019216066A1 (fr) Système et procédé de commande de système
JP2021196937A (ja) 演算装置、制御プログラム、制御システム、および演算方法
JP2022189269A (ja) 情報処理装置、情報処理システム、情報処理プログラムおよび制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19826896

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020527253

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19826896

Country of ref document: EP

Kind code of ref document: A1