WO2019073735A1 - Monitored person monitoring assistance system and monitored person monitoring assistance method - Google Patents

Monitored person monitoring assistance system and monitored person monitoring assistance method

Info

Publication number
WO2019073735A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection result
seating
monitored
detection
monitored person
Prior art date
Application number
PCT/JP2018/033594
Other languages
French (fr)
Japanese (ja)
Inventor
保理江 大作
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Priority to JP2019547948A priority Critical patent/JP7137155B2/en
Publication of WO2019073735A1 publication Critical patent/WO2019073735A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G: TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G12/00: Accommodation for nursing, e.g. in hospitals, not covered by groups A61G1/00 - A61G11/00, e.g. trolleys for transport of medicaments or food; Prescription lists
    • A61G7/00: Beds specially adapted for nursing; Devices for lifting patients or disabled persons
    • A61G7/05: Parts, details or accessories of beds
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B25/04: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M11/00: Telephonic communication systems specially adapted for combination with other electrical systems

Definitions

  • the present invention relates to a monitored person monitoring support system and a monitored person monitoring support method for supporting monitoring of a monitored person.
  • The fall detection device disclosed in Patent Document 1 acquires a human body image containing a partial image that represents the human body to be detected, position coordinate information representing the position of the partial image in the image coordinates, and size information indicating the size of the human body represented by the partial image. A height detection unit compares the height information included in the current size information with height information acquired in the past to detect whether the height of the human body has decreased, and an appearance feature amount extraction unit extracts an appearance feature amount based on the appearance of the human body image when the height has decreased. The moving distance of the partial image included in the human body image is calculated from the current and past position coordinate information and compared with a threshold, and whether the human body has fallen is determined based on the comparison result and the appearance feature amount.
  • The wheelchair with a seating sensor disclosed in Patent Document 2 is a wheelchair that includes position detection means for detecting the position of the wheelchair, communication means for transmitting the detected position of the wheelchair to a control center, and a seating sensor that detects that the person is seated in the wheelchair.
  • The call system disclosed in Patent Document 3 includes a floor detection sensor that detects an object present on the floor near the bed used by the monitored person; a human body detection sensor that detects an object in the vicinity of the bed in such a way that the monitored person is detected when standing on the floor near the bed but is not detected when the monitored person has fallen from the bed; a notification device that notifies that the monitored person has left the bed; a storage device that stores notification reason information distinguishing whether the monitored person has moved away from the bed or has fallen from the bed; a timer unit that measures time; and a control unit. When the floor detection sensor detects an object, the control unit operates the notification device and the timer unit; if the human body detection sensor detects an object before the time measured by the timer unit reaches a predetermined time, the control unit stores in the storage device notification reason information indicating that the monitored person has moved away from the bed, and otherwise it stores in the storage device notification reason information indicating that the monitored person has fallen from the bed (a minimal sketch of this timer-based discrimination is given after the description of the two sensors below).
  • The floor detection sensor 1 is configured by an infrared sensor or the like, and is installed at a position where it can detect the monitored person's feet when the monitored person using the bed tries to leave the bed, and the monitored person's body or the like when the monitored person falls from the bed (see paragraph [0015] of Patent Document 3).
  • The human body detection sensor 2 is configured by an infrared sensor or the like, and is installed at a position where it can detect the body of the monitored person (for example, the head) when the person using the bed stands on the floor near the bed where the floor detection sensor 1 is installed, but cannot detect the monitored person's body when the person using the bed has fallen from the bed (see paragraph [0018] of Patent Document 3).
  • In other words, by making the installation height of the infrared sensor serving as the floor detection sensor different from that of the infrared sensor serving as the human body detection sensor, the system distinguishes between the monitored person having moved away from the bed and the monitored person having fallen from the bed.
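  • Purely as an illustrative aid (and not part of the disclosure of Patent Document 3), the timer-based discrimination described above can be restated in the following minimal Python sketch; the names classify_event and body_detected and the timeout value are hypothetical.

```python
import time

# Hypothetical timeout; Patent Document 3 does not specify the value used.
TIMEOUT_S = 2.0

def classify_event(floor_sensor_triggered, body_detected):
    """When the floor detection sensor fires, start the notification device's timer;
    if the human body detection sensor detects an object before the timer expires,
    record "moved away from bed", otherwise record "fell from bed"."""
    if not floor_sensor_triggered:
        return None  # nothing detected on the floor near the bed
    start = time.monotonic()
    while time.monotonic() - start < TIMEOUT_S:
        if body_detected():          # poll the human body detection sensor
            return "moved away from bed"
        time.sleep(0.05)
    return "fell from bed"

# Example: a body detection callback that never fires yields "fell from bed".
if __name__ == "__main__":
    print(classify_event(True, lambda: False))
```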
  • When a monitored person such as a care receiver or a person requiring nursing falls or topples over, a supervisor such as a nurse or caregiver receives a notification of the fall or the like from the system and takes measures such as visiting the notified monitored person.
  • If the notification is a false alarm, that is, the system has judged that a fall or the like occurred although the monitored person has not actually fallen, the supervisor's visit is wasted and labor is lost.
  • Conversely, if the system fails to detect a fall or the like and no notification is issued, the fall or the like is left unattended, which may lead to a more serious situation. It is therefore desirable that errors such as false alarms and missed detections be suppressed as much as possible so that notifications are issued more appropriately.
  • In the fall detection device disclosed in Patent Document 1, when the supervisor attends to the monitored person on the bed, the supervisor, the monitored person, and the bed overlap when viewed from the sensor, and in such a positional relationship erroneous detection can occur.
  • The wheelchair with a seating sensor disclosed in Patent Document 2 merely reports the current position of the wheelchair and the presence or absence of seating (see paragraph [0012] of Patent Document 2). Therefore, when no seating is detected, it cannot distinguish whether the person is not seated because he or she has moved from the wheelchair to another place or because he or she has fallen from the wheelchair, and so a fall or the like cannot be detected. If the system were to judge a fall or the like and issue a notification whenever the seating sensor detects no seating, false alarms would occur frequently.
  • Since the call system disclosed in Patent Document 3 determines a fall of the monitored person with infrared sensors arranged at different positions, it judges that the person has fallen whenever the person is not in a standing posture, for example when squatting to pick up a dropped object, so false alarms may occur frequently.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide a monitored person monitoring support system and a monitored person monitoring support method capable of issuing notifications more appropriately.
  • A monitored person monitoring support system and a monitored person monitoring support method reflecting one aspect of the present invention include a sensor device that is provided corresponding to the monitored person and detects a predetermined action related to the monitored person, a central processing unit that is communicably connected to the sensor device and manages the detection result received from the sensor device, and a terminal device that is communicably connected to the central processing unit and receives and displays the detection result via the central processing unit. The system further detects the seating of the monitored person, and the detection result is processed by different processing methods based on the seating detection result.
  • FIG. 8 is a flowchart showing the operation in the processing of the detection result shown in FIG. 7 in the first embodiment.
  • Sequence diagrams show the operation of the monitored person monitoring support system in the first embodiment.
  • A flowchart shows the operation of the processing of the detection result in the second embodiment.
  • A figure shows another example of the monitoring information display screen displayed on the portable terminal device in the monitored person monitoring support system in the second embodiment, and a further figure explains a modification of the second embodiment.
  • A sequence diagram shows the operation of the monitored person monitoring support system in the third embodiment.
  • The monitored person monitoring support system in the present embodiment is a system that supports the monitoring of a monitored person (watching target person) Ob who is to be monitored (watched over).
  • The monitored person monitoring support system includes a sensor device that is provided corresponding to the monitored person Ob and detects a predetermined action related to the monitored person Ob, a central processing unit that is communicably connected to the sensor device and manages the detection results received from the sensor device, and a terminal device that is communicably connected to the central processing unit and receives and displays the detection results via the central processing unit.
  • The monitored person monitoring support system further includes a seating detection unit that detects the seating of the monitored person, and a detection result processing unit that processes the detection result by different processing methods based on the seating detection result of the seating detection unit. The terminal device may be a single type of device, but in the present embodiment the terminal device comprises two types of devices: a fixed terminal device and a portable terminal device.
  • The main difference between the fixed terminal device and the portable terminal device is that the fixed terminal device is operated at a fixed location, whereas the portable terminal device is carried by a supervisor (service provider, user) such as a nurse or caregiver.
  • FIG. 1 is a diagram showing a configuration of a monitored person monitoring support system in the embodiment.
  • FIG. 2 is a diagram showing a configuration of a sensor device in the monitored person monitoring support system.
  • FIG. 3 is a diagram showing an example of a setting screen for setting a bedding location area stored in the sensor device.
  • FIG. 4 is a diagram for explaining a method of detecting bed departure.
  • FIG. 4A shows a state of the monitored person Ob sitting at one end of the bedding BD
  • FIG. 4B shows a target image obtained by imaging the situation of FIG. 4A vertically downward from the ceiling.
  • FIG. 5 is a diagram for explaining the method of detecting a tumble or a fall.
  • FIG. 5A shows the monitored person Ob in a standing posture,
  • FIG. 5B shows a target image obtained by imaging the situation of FIG. 5A vertically downward from the ceiling.
  • FIG. 5C shows the state of the monitored person Ob in a sitting posture as a result of a tumble or a fall, and
  • FIG. 5D shows a target image obtained by capturing the situation of FIG. 5C vertically downward from the ceiling.
  • FIG. 6 is a diagram showing a configuration of a management server device in the monitored person monitoring support system.
  • The monitored person monitoring support system MSa in the embodiment is a system for supporting the monitoring of the monitored person Ob. As shown in FIG. 1, for example, it includes one or more sensor devices SUa (SUa-1 to SUa-4), a management server device SVa, a fixed terminal device SPa, one or more portable terminal devices TAa (TAa-1 and TAa-2), and a private branch exchange (PBX) CX, which are communicably connected, by wire or wirelessly, via a network (communication line) NW such as a LAN (Local Area Network).
  • The network NW may be provided with relays such as repeaters, bridges, and routers for relaying communication signals.
  • In the present embodiment, the plurality of sensor devices SUa-1 to SUa-4, the management server device SVa, the fixed terminal device SPa, the plurality of portable terminal devices TAa-1 and TAa-2, and the private branch exchange CX are connected to a wired/wireless mixed LAN (for example, a LAN conforming in part to the IEEE 802.11 standard) NW including a line concentrator (hub, HUB) LS of an L2 switch and an access point AP.
  • The plurality of sensor devices SUa-1 to SUa-4, the management server device SVa, the fixed terminal device SPa, and the private branch exchange CX are connected to the line concentrator LS, and the plurality of portable terminal devices TAa-1 and TAa-2 are connected to the access point AP.
  • The network NW constitutes a so-called intranet by using an Internet protocol suite such as the Transmission Control Protocol (TCP) and the Internet Protocol (IP).
  • The private branch exchange (line switching unit) CX is connected to the network NW and performs extension telephone control such as call origination, call reception, and calls among the portable terminal devices TAa, thereby implementing extension calls between the portable terminal devices TAa. The private branch exchange CX is also connected to an outside telephone TL such as a fixed telephone or a mobile telephone through a public telephone network PN such as a fixed telephone network or a mobile telephone network, performs outside line telephone control such as call origination, call reception, and calls between the outside telephone TL and the portable terminal devices TAa, and thereby implements outside calls between the outside telephone TL and the portable terminal devices TAa.
  • The private branch exchange CX is, for example, a digital exchange or an IP-PBX (Internet Protocol Private Branch eXchange).
  • the monitored person monitoring support system MSa is disposed at an appropriate place according to the monitored person Ob.
  • the monitored person (watching target person) Ob is, for example, a person who needs nursing due to illness or injury, a person who needs care due to a decrease in physical ability or the like, or a single person living alone.
  • The monitored person Ob is preferably a person for whom detection is needed when a predetermined adverse event such as an abnormal condition occurs in that person.
  • For this reason, the monitored person monitoring support system MSa is suitably disposed in buildings such as hospitals, welfare facilities for the elderly, and dwelling units, according to the type of the monitored person Ob.
  • In the present embodiment, the monitored person monitoring support system MSa is disposed in the building of a care facility provided with a plurality of living rooms RM in which a plurality of monitored persons Ob reside and a plurality of rooms such as a nurse station.
  • The sensor device SUa has a communication function for communicating with the other devices SVa, SPa, and TAa via the network NW, detects a predetermined action of the monitored person Ob, and transmits the detection result, together with an image, to the management server device SVa. In the present embodiment, the sensor device SUa also detects the seating of the monitored person Ob and, when transmitting the detection result to the management server device SVa, processes the detection result by different processing methods based on the seating detection result.
  • For example, as illustrated in FIG. 2, such a sensor device SUa includes an imaging unit 11, a seating detection unit 12, a sensor control processing unit (SU control processing unit) 13a, a sensor communication interface unit (SU communication IF unit) 14, and a sensor storage unit (SU storage unit) 15.
  • The imaging unit 11 is connected to the SU control processing unit 13a and, under the control of the SU control processing unit 13a, generates an image (image data) showing the monitored person Ob in order to detect a predetermined action of the monitored person Ob.
  • The imaging unit 11 is arranged so as to be able to monitor the space where the monitored person Ob is expected to be located (the location space; in the example shown in FIG. 1, the room RM in which it is arranged). It images the location space from above as an imaging target, generates an image (image data) in which the imaging target is viewed from above, and outputs the image of the imaging target (target image) to the SU control processing unit 13a.
  • Since this gives a high probability of capturing the whole of the monitored person Ob, the imaging unit 11 is preferably arranged so as to image the imaging target from directly above the planned head position, that is, the position where the head of the monitored person Ob is expected to lie in the bedding (for example, a bed) on which the monitored person Ob lies.
  • The sensor device SUa thus uses the imaging unit 11 to acquire an image of the monitored person Ob taken from above the monitored person Ob, preferably an image taken from directly above the planned head position.
  • Such an imaging unit 11 may be a device that generates an image of visible light, but in the present embodiment, it is a device that generates an infrared image so that the monitored person Ob can be monitored even in a relatively dark state.
  • Such an imaging unit 11 is a digital infrared camera including an imaging optical system that forms an infrared optical image of the imaging target on a predetermined imaging surface, an area image sensor whose light receiving surface coincides with the imaging surface and which converts the infrared optical image of the imaging target into an electrical signal, and an image processing unit that generates data (image data) representing an infrared image of the imaging target by performing image processing on the output of the area image sensor.
  • In the present embodiment, the imaging optical system of the imaging unit 11 is preferably a wide-angle optical system (a so-called wide-angle lens, including a fisheye lens) having an angle of view capable of imaging the entire room RM in which it is provided.
  • the imaging unit 11 may be a thermographic camera that generates a thermal distribution image of an imaging target.
  • The seating detection unit 12 is connected to the SU control processing unit 13a, is arranged at a place where the monitored person Ob is expected (estimated) to sit (for example, a chair, a wheelchair, or a toilet facility), and is a device that detects the seating of the monitored person Ob.
  • The seating detection unit 12 outputs the seating detection result to the SU control processing unit 13a.
  • The seating detection unit 12 may, for example, determine the presence or absence of seating of the monitored person Ob at predetermined time intervals (sampling intervals) and, for each determination, output "seating present" or "seating absent" as the seating detection result according to the determination.
  • Alternatively, the seating detection unit 12 may determine the presence or absence of seating of the monitored person Ob at predetermined time intervals (sampling intervals) and output "seating present" as the seating detection result only when the seating of the monitored person Ob is detected. In this case, the SU control processing unit 13a can recognize the absence of seating from the absence of an input indicating the presence of seating. (Both output modes are sketched in the code example following the description of the sensor types below.)
  • The seating detection unit 12 may be connected to the SU control processing unit 13a by wire, or may be connected to the SU control processing unit 13a by short-range wireless communication such as the Bluetooth (registered trademark) standard.
  • Such a seating detection unit 12 includes, for example, a contact-type seating sensor such as a pressure sensor or a vibration sensor.
  • the pressure sensor is disposed, for example, on a seating surface of a chair, a wheelchair, a toilet facility, or the like so as to detect pressurization (pressure increase) due to the seating of the monitored person Ob.
  • the vibration sensor is disposed, for example, on a seating surface of a chair, a wheelchair, a toilet facility or the like so as to detect a vibration caused by the seating of the person to be monitored Ob.
  • Alternatively, the seating detection unit 12 may include a non-contact seating sensor such as a heat sensor.
  • the heat sensor is disposed so as to detect heat (infrared radiation) emitted from the monitored person Ob in, for example, an arrangement area of a chair and a toilet facility.
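  • As a minimal sketch (an assumption for illustration, not the disclosed implementation), the two output modes of the seating detection unit 12 described above, reporting presence or absence at every sampling interval, or reporting only the presence of seating, could be modelled as follows; the class name SeatingDetector and the read_sensor callback are hypothetical.

```python
from typing import Callable, Optional

class SeatingDetector:
    """Illustrative model of the seating detection unit 12 (hypothetical API).

    read_sensor returns True when the contact or non-contact sensor currently
    indicates that the monitored person is seated (e.g. pressure above a limit)."""

    def __init__(self, read_sensor: Callable[[], bool], event_only: bool = False):
        self.read_sensor = read_sensor
        self.event_only = event_only  # True: output only "seating present" events

    def sample(self) -> Optional[str]:
        """Called once per sampling interval by the SU control processing unit."""
        seated = self.read_sensor()
        if self.event_only:
            # Mode 2: only the presence of seating is reported; absence is
            # inferred by the SU control processing unit from the lack of input.
            return "seating present" if seated else None
        # Mode 1: presence or absence is reported at every sampling interval.
        return "seating present" if seated else "seating absent"

# Usage example with a stubbed pressure reading (illustrative values).
if __name__ == "__main__":
    pressure = 42.0
    detector = SeatingDetector(read_sensor=lambda: pressure > 10.0)
    print(detector.sample())  # -> "seating present"
```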
  • the SU communication IF unit 14 is a communication circuit which is connected to the SU control processing unit 13a and performs communication according to the control of the SU control processing unit 13a.
  • The SU communication IF unit 14 generates a communication signal containing the data to be transmitted, which is input from the SU control processing unit 13a, according to the communication protocol used in the network NW of the monitored person monitoring support system MSa, and transmits the generated communication signal to the other devices SVa, SPa, and TAa via the network NW.
  • The SU communication IF unit 14 also receives communication signals from the other devices SVa, SPa, and TAa via the network NW, extracts the data from the received communication signals, converts the extracted data into a format that the SU control processing unit 13a can process, and outputs it to the SU control processing unit 13a.
  • the SU communication IF unit 14 is configured to include, for example, a communication interface circuit according to the IEEE 802.11 standard or the like.
  • the SU storage unit 15 is a circuit which is connected to the SU control processing unit 13a and stores various predetermined programs and various predetermined data according to the control of the SU control processing unit 13a.
  • The various predetermined programs include control processing programs such as an SU control program for controlling each unit of the sensor device SUa according to the function of each unit and an SU monitoring processing program for executing predetermined information processing related to the monitoring of the monitored person Ob.
  • The SU monitoring processing program includes an action detection processing program for detecting a predetermined action of the monitored person Ob and a detection result processing program for processing the detection result of the action detection processing program by different processing methods based on the seating detection result of the seating detection unit 12.
  • The various predetermined data include data necessary for executing each of the above programs, such as a sensor identifier (sensor ID), which is an identifier for identifying the sensor device SUa of the own device, and the communication address of the management server device SVa.
  • The SU storage unit 15 includes, for example, a ROM (Read Only Memory), which is a non-volatile memory element, and an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable non-volatile memory element.
  • The SU storage unit 15 also includes, for example, a RAM (Random Access Memory) that serves as the so-called working memory of the SU control processing unit 13a and stores data generated during execution of the predetermined programs.
  • The SU control processing unit 13a is a circuit that controls the respective units 11, 12, 14, and 15 of the sensor device SUa according to their functions, detects a predetermined action of the monitored person Ob, and transmits the detection result, together with an image, to the management server device SVa. In the present embodiment, the SU control processing unit 13a also detects the seating of the monitored person Ob and, when transmitting the detection result to the management server device SVa, processes the detection result by different processing methods based on the seating detection result of the seating detection unit 12.
  • the SU control processing unit 13a includes, for example, a central processing unit (CPU) and peripheral circuits thereof.
  • the SU control processing unit 13a functionally includes a sensor-side control unit (SU control unit) 131, an action detection processing unit 132, and a detection result processing unit 133a by executing the control processing program.
  • The SU control unit 131 controls the respective units 11, 12, 14, and 15 of the sensor device SUa according to their functions and performs the overall control of the sensor device SUa.
  • The action detection processing unit 132 detects a predetermined action, set in advance, of the monitored person Ob. More specifically, in the present embodiment, the predetermined actions are, for example, three actions: bed departure, in which the monitored person Ob leaves the bedding; a fall, in which the monitored person Ob falls from the bedding; and a tumble, in which the monitored person Ob falls down.
  • In the present embodiment, the action detection processing unit 132 detects, for example, bed departure, falls, and tumbles of the monitored person Ob based on the target image captured by the imaging unit 11.
  • For this purpose, the location area of the bedding BD, a first threshold Th1 for determining the presence or absence of bed departure, and a second threshold Th2 for determining the presence or absence of a tumble or a fall are stored in advance in the SU storage unit 15 as part of the various predetermined data.
  • The location area of the bedding BD is set, for example, by displaying the setting screen shown in FIG. 3 on the terminal devices SPa and TAa, for example on the fixed terminal device SPa, inputting on the displayed setting screen the coordinates, on the target image, of the four vertex positions of the location area of the bedding BD, and storing them in the sensor device SUa from the fixed terminal device SPa via the management server device SVa.
  • At this time, the target image generated by the imaging unit 11 is displayed on the fixed terminal device SPa via the management server device SVa.
  • the seating detection unit 12 is a non-contact seating sensor
  • As shown in FIG. 4, the presence or absence of bed departure is determined based on the size of the region (overhang region) HAout of the person region HA, extracted from the target image, that lies outside the location area of the bedding BD.
  • The first threshold Th1 is a value for distinguishing the size of the overhang region before bed departure from the size of the overhang region after bed departure, and is appropriately set in advance from a plurality of samples. As shown in FIGS. 5B and 5D, the presence or absence of a tumble or a fall is determined based on the temporal change of the position and the size of the head region HD of the monitored person Ob. The second threshold Th2 is therefore a value for distinguishing the size of the head region HD in a standing posture from the size of the head region HD in a sitting posture (or lying posture), and is appropriately set in advance from a plurality of samples.
  • The action detection processing unit 132 extracts a moving body region from the target image as the person region HA of the monitored person Ob, for example by the background subtraction method or the frame subtraction method.
  • The action detection processing unit 132 then extracts the head region HD of the monitored person Ob from the extracted person region (moving body region) HA, for example by a circular or elliptical Hough transform, by pattern matching using a head model prepared in advance, or by a neural network trained for head detection.
  • When the current state of the monitored person Ob is "before bed departure" and the ratio of the overhang region HAout to the person region HA exceeds the first threshold Th1 ((HAout / HA) > Th1), the action detection processing unit 132 determines that the monitored person has left the bed and detects bed departure.
  • When the position of the extracted head region HD (for example, the center position of the head region) changes from inside the location area of the bedding BD to outside the location area of the bedding BD and the size of the extracted head region HD is equal to or less than the second threshold Th2, the action detection processing unit 132 determines that a fall has occurred and detects the fall.
  • When the position of the extracted head region HD is in the living room RM excluding the location area of the bedding BD and the size of the extracted head region HD is equal to or less than the second threshold Th2, the action detection processing unit 132 determines that a tumble has occurred and detects the tumble. (A minimal sketch of these decision rules is given below.)
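  • The decision rules above (the overhang ratio compared with the first threshold Th1 for bed departure, and the head-region position and size compared with the second threshold Th2 for a fall or a tumble) can be summarised in the following illustrative sketch. It assumes that the person region HA, the head region HD, and the bedding location area have already been extracted from the target image; the region representation and the threshold values are hypothetical, not values taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative threshold values; in the embodiment Th1 and Th2 are set in advance from samples.
TH1 = 0.3    # first threshold: overhang-area ratio distinguishing bed departure
TH2 = 900.0  # second threshold: head-region size at or below which the posture is not standing

@dataclass
class Regions:
    person_area: float             # size of the person region HA
    overhang_area: float           # size HAout of the part of HA outside the bedding area
    head_size: float               # size of the head region HD
    head_inside_bedding: bool      # head-region position lies inside the bedding area BD
    head_was_inside_bedding: bool  # head-region position in the previous frame

def detect_action(r: Regions, before_bed_departure: bool) -> Optional[str]:
    """Returns 'bed departure', 'fall', 'tumble', or None for one target image."""
    # Fall: the head moved from inside to outside the bedding area and is low (sitting/lying size).
    if r.head_was_inside_bedding and not r.head_inside_bedding and r.head_size <= TH2:
        return "fall"
    # Tumble: the head is in the room away from the bedding area and is low.
    if not r.head_inside_bedding and r.head_size <= TH2:
        return "tumble"
    # Bed departure: before departure, the part of the person region outside the bedding grows.
    if before_bed_departure and r.person_area > 0 and (r.overhang_area / r.person_area) > TH1:
        return "bed departure"
    return None
```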
  • The detection result processing unit 133a processes the detection result of the action detection processing unit 132 by different processing methods based on the seating detection result of the seating detection unit 12. More specifically, in the present embodiment, when the seating detection result does not indicate seating of the monitored person Ob, the detection result processing unit 133a processes the detection result by a first processing method, in which the detection result is displayed on the terminal devices SPa and TAa; when the seating detection result indicates seating of the monitored person Ob, it processes the detection result by a second processing method, in which the detection result is not displayed on the terminal devices SPa and TAa. More specifically, the second processing method is a processing method in which the detection result is not transmitted to the management server device SVa.
  • In the first processing method, the SU communication IF unit 14 transmits to the management server device SVa a communication signal (first action detection notification communication signal) for notifying the detection, containing information indicating the content of the detected predetermined action of the monitored person Ob.
  • The first action detection notification communication signal contains, for example, the sensor ID of the own device, the information indicating the content of the detected predetermined action of the monitored person Ob (in the present embodiment, one of bed departure, a fall, and a tumble), and the target image used for detecting the predetermined action.
  • In the second processing method, that is, when the seating detection result indicates that the monitored person Ob is seated, the detection result processing unit 133a does not transmit the first action detection notification communication signal.
  • The transmission processing of the first action detection notification communication signal may instead be executed by the action detection processing unit 132. In this case, the transmission processing of the first action detection notification communication signal by the action detection processing unit 132 is permitted when the seating detection result does not indicate seating of the monitored person Ob, and is prohibited when the seating detection result indicates that the monitored person Ob is seated. (A sketch of this suppression logic follows below.)
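  • The first and second processing methods described above (transmit the first action detection notification communication signal only when no seating is detected, otherwise cancel and delete the detection result) can be sketched as follows; the dataclass fields and the send_to_server callback are assumptions for illustration, not the patent's actual message format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FirstActionDetectionNotification:
    sensor_id: str        # sensor ID of the own device
    action: str           # one of "bed departure", "fall", "tumble"
    target_image: bytes   # target image used for the detection

def process_detection_result(detection: Optional[str],
                             seated: bool,
                             sensor_id: str,
                             target_image: bytes,
                             send_to_server) -> None:
    """Detection result processing unit 133a (illustrative sketch).

    First processing method: when the monitored person is not seated, the
    detection result is confirmed and transmitted to the management server
    device so that it will eventually be displayed on the terminal devices.
    Second processing method: when the monitored person is seated, the
    detection result is regarded as a likely false detection and discarded,
    i.e. no first action detection notification communication signal is sent."""
    if detection is None:
        return                      # nothing detected in this frame
    if seated:
        return                      # second processing method: cancel and delete
    send_to_server(FirstActionDetectionNotification(sensor_id, detection, target_image))

# Usage example with a stubbed transport.
if __name__ == "__main__":
    process_detection_result("fall", seated=False, sensor_id="SUa-1",
                             target_image=b"", send_to_server=print)
```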
  • In the example shown in FIG. 1, first to fourth sensor devices SUa-1 to SUa-4 are shown. The first sensor device SUa-1 is disposed in the living room RM-1 (not shown) of Mr. A (Ob-1), who is one of the monitored persons Ob.
  • The second sensor device SUa-2 is disposed in the living room RM-2 (not shown) of Mr. B (Ob-2), who is one of the monitored persons Ob.
  • The third sensor device SUa-3 is disposed in the living room RM-3 (not shown) of Mr. C (Ob-3), who is one of the monitored persons Ob.
  • The fourth sensor device SUa-4 is disposed in the living room RM-4 (not shown) of Mr. D (Ob-4), who is one of the monitored persons Ob.
  • The management server device SVa has a communication function for communicating with the other devices SUa, TAa, and SPa via the network NW. When it receives a notification of the predetermined action from the sensor device SUa, it manages information related to the monitoring of the monitored person Ob (monitoring information; in the present embodiment, for example, the type of predetermined action detected by the sensor device SUa, an image of the monitored person Ob, the time at which the notification was received, and the like), notifies (re-notifies, re-reports, transmits) the predetermined action to the predetermined terminal devices SPa and TAa, and provides data to a client (the terminal devices SPa, TAa, and the like) in response to a request from the client.
  • Such a management server device SVa includes a server-side communication interface unit (SV communication IF unit) 21, a server-side control processing unit (SV control processing unit) 22, and a server-side storage unit (SV storage unit) 23.
  • the SV communication IF unit 21 is a communication circuit connected to the SV control processing unit 22 and performing communication in accordance with the control of the SV control processing unit 22.
  • the SV communication IF unit 21 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
  • the SV storage unit 23 is a circuit which is connected to the SV control processing unit 22 and stores various predetermined programs and various predetermined data according to the control of the SV control processing unit 22.
  • The various predetermined programs include control processing programs such as an SV control program for controlling each unit of the management server device SVa according to the function of each unit and an SV monitoring processing program for executing predetermined information processing related to the monitoring of the monitored person Ob.
  • The various predetermined data include data necessary for executing each of the above programs, such as a server identifier (server ID), which is an identifier for identifying the management server device SVa of the own device, and the monitoring information of the monitored person Ob.
  • the SV storage unit 23 includes, for example, a ROM, an EEPROM, a RAM, and peripheral circuits thereof.
  • The SV control processing unit 22 is a circuit that controls the respective units of the management server device SVa according to their functions, manages the monitoring information related to the monitoring of the monitored person Ob when a notification of the predetermined action is received from the sensor device SUa, notifies (re-notifies, re-reports, transmits) the predetermined action to the predetermined terminal devices SPa and TAa, provides data to a client in response to the client's request, and manages the entire monitored person monitoring support system MSa. The SV control processing unit 22 includes, for example, a CPU and its peripheral circuits.
  • the SV control processing unit 22 functionally includes a server-side control unit (SV control unit) 221 and a server-side monitoring processing unit (SV monitoring processing unit) 222 by executing the control processing program.
  • The SV control unit 221 controls the respective units 21 and 23 of the management server device SVa according to their functions and performs the overall control of the management server device SVa.
  • When the SV monitoring processing unit 222 receives a notification of the predetermined action from the sensor device SUa, it manages the monitoring information related to the monitoring of the monitored person Ob and notifies the predetermined action to the predetermined terminal devices SPa and TAa.
  • More specifically, when the first action detection notification communication signal is received, the SV monitoring processing unit 222 stores (records) in the SV storage unit 23 the monitoring information of the monitored person Ob contained in the received first action detection notification communication signal. Then, in order to notify (re-notify, re-report, transmit) the detection result notified by the first action detection notification communication signal, the SV monitoring processing unit 222 transmits, by the SV communication IF unit 21, a communication signal (second action detection notification communication signal) containing the information indicating the content of the predetermined action of the monitored person Ob contained in the received first action detection notification communication signal to the predetermined terminal devices SPa and TAa.
  • The second action detection notification communication signal contains, for example, the sensor ID, the information indicating the content of the predetermined action of the monitored person Ob (in the present embodiment, one of bed departure, a fall, and a tumble), and the target image, which are contained in the received first action detection notification communication signal.
  • The second action detection notification communication signal may be transmitted to the terminal devices SPa and TAa by broadcast communication, for example, or may be transmitted only to the terminal devices SPa and TAa associated with the sensor device SUa that transmitted the first action detection notification communication signal.
  • In the latter case, the correspondence between the sensor device SUa (sensor ID) and the terminal devices SPa and TAa (terminal IDs), associated one-to-one or one-to-many, is stored in advance in the SV storage unit 23. (A sketch of this storage and re-notification flow is given below.)
  • The terminal ID (terminal identifier) is an identifier for identifying the terminal devices SPa and TAa.
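  • Under the same caveat (an illustrative sketch, not the disclosed implementation), the storage and re-notification flow described above, in which the management server device stores the monitoring information and transmits the second action detection notification communication signal to the terminal devices associated with the originating sensor ID, might look like this; the in-memory record list and the association table are simple stand-ins.

```python
import time
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MonitoringRecord:
    sensor_id: str
    action: str
    received_at: float
    target_image: bytes

@dataclass
class ManagementServer:
    """Illustrative SV monitoring processing: store the monitoring information, then re-notify."""
    # Correspondence between a sensor device (sensor ID) and the terminal IDs
    # associated with it one-to-one or one-to-many (stored in advance).
    sensor_to_terminals: Dict[str, List[str]]
    records: List[MonitoringRecord] = field(default_factory=list)

    def on_first_notification(self, sensor_id: str, action: str,
                              target_image: bytes, send_to_terminal) -> None:
        # Store (record) the monitoring information in the SV storage unit.
        self.records.append(MonitoringRecord(sensor_id, action, time.time(), target_image))
        # Transmit the second action detection notification communication signal
        # only to the terminal devices associated with the originating sensor.
        second_signal = {"sensor_id": sensor_id, "action": action, "target_image": target_image}
        for terminal_id in self.sensor_to_terminals.get(sensor_id, []):
            send_to_terminal(terminal_id, second_signal)

# Usage example with a stubbed transport.
if __name__ == "__main__":
    server = ManagementServer({"SUa-1": ["TAa-1", "TAa-2"]})
    server.on_first_notification("SUa-1", "bed departure", b"",
                                 lambda tid, sig: print(tid, sig["action"]))
```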
  • The management server device SVa may further include, as necessary, a server-side input unit (SV input unit) 24 connected to the SV control processing unit 22 for inputting various commands, various data, and the like, a server-side output unit (SV output unit) 25 that outputs the various commands and various data input from the SV input unit 24, the monitoring information related to the monitoring of the monitored person Ob, and the like, and a server-side interface unit (SVIF unit) 26 that inputs and outputs data to and from external devices.
  • Such a management server device SVa can be configured, for example, by a computer with a communication function.
  • the management server device SVa corresponds to an example of a central processing unit.
  • The fixed terminal device SPa has a communication function for communicating with the other devices SUa, SVa, and TAa via the network NW, a display function for displaying predetermined information, and an input function for inputting predetermined instructions and data.
  • The fixed terminal device SPa functions as a user interface (UI) of the monitored person monitoring support system MSa by inputting predetermined instructions and data to be given to the sensor device SUa, the management server device SVa, and the portable terminal device TAa, and by displaying the monitoring information obtained by the sensor device SUa. Such a fixed terminal device SPa can be configured, for example, by a computer with a communication function.
  • The portable terminal device TAa has a communication function for communicating with the other devices SVa, SPa, and SUa via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, and a call function for performing voice calls. It is a device for inputting predetermined instructions and data to be given to the management server device SVa and the sensor device SUa, and for displaying, by notification from the management server device SVa, the monitoring information obtained by the sensor device SUa.
  • Such a portable terminal device TAa can be configured, for example, by a portable communication terminal device such as a so-called tablet computer, a smartphone, or a mobile phone.
  • FIG. 7 is a flowchart showing the operation of the sensor device.
  • FIG. 8 is a flowchart showing an operation in the action detection process shown in FIG.
  • FIG. 9 is a flow chart showing the operation in the fall detection processing shown in FIG.
  • FIG. 10 is a flowchart showing an operation in the process of detecting bed departure shown in FIG.
  • FIG. 11 is a flowchart showing an operation in the process of the detection result shown in FIG. 7 in the first embodiment.
  • FIG. 12 is a sequence diagram showing the operation of the monitored person monitoring system in the first embodiment.
  • FIG. 13 is a diagram showing an example of a monitoring information display screen displayed on the mobile terminal device in the monitored person monitoring support system in the first embodiment.
  • each of the devices SUa, SVa, SPa, and TAa performs initialization of necessary parts when the power is turned on, and starts its operation.
  • an SU control unit 131, an action detection processing unit 132, and a detection result processing unit 133a are functionally configured in the SU control processing unit 13a by execution of the control processing program.
  • In the SV control processing unit 22, an SV control unit 221 and an SV monitoring processing unit 222 are functionally configured by execution of the control processing program. The sensor device SUa then detects a predetermined action of the monitored person Ob by operating as follows for each frame or every several frames.
  • First, the sensor device SUa acquires an image (image data) for one frame from the imaging unit 11 as the target image by the SU control unit 131 of the SU control processing unit 13a (S1).
  • Next, the sensor device SUa causes the action detection processing unit 132 of the SU control processing unit 13a to extract the person region HA in order to detect a predetermined action of the monitored person Ob based on the target image, and further causes the action detection processing unit 132 to extract the head region HD from the extracted person region HA (S2).
  • Next, the sensor device SUa causes the action detection processing unit 132 to execute the action detection process of detecting a predetermined action of the monitored person Ob based on the person region HA and the head region HD extracted in the process S2 (S3).
  • In this action detection process S3, the action detection processing unit 132 executes a tumble/fall detection process S31 for detecting a tumble or a fall, and then executes a bed departure detection process S32.
  • In the tumble/fall detection process S31, as shown in FIG. 9, the action detection processing unit 132 first determines whether or not the conditions for a tumble or a fall are satisfied (S311). More specifically, when the position of the head region HD extracted in the process S2 has changed from inside the location area of the bedding BD to outside the location area of the bedding BD and the size of the head region HD extracted in the process S2 is equal to or less than the second threshold Th2, the action detection processing unit 132 determines that a fall has occurred (Yes), detects the fall (S312), and ends the process S31.
  • Likewise, when the position of the head region HD extracted in the process S2 is in the living room RM excluding the location area of the bedding BD and the size of the head region HD extracted in the process S2 is equal to or less than the second threshold Th2, the action detection processing unit 132 determines that a tumble has occurred (Yes), detects the tumble (S312), and ends the process S31. Otherwise, the action detection processing unit 132 determines that neither a tumble nor a fall has occurred (No) and ends the process S31.
  • In the bed departure detection process S32, as shown in FIG. 10, the action detection processing unit 132 first determines whether or not the state of the monitored person Ob is "before bed departure" (S321).
  • the state of the monitored person Ob is stored in the SV storage unit 23 by a state variable or the like.
  • When the result of the determination is that the state is "before bed departure" (Yes), the action detection processing unit 132 sequentially executes the processes S322, S323, and S324 described below and ends the process S32.
  • Otherwise (No), the action detection processing unit 132 ends the process S32.
  • In the process S322, the action detection processing unit 132 determines whether or not the condition for bed departure is satisfied. More specifically, when the ratio of the overhang region HAout to the person region HA exceeds the first threshold Th1 ((HAout / HA) > Th1), the action detection processing unit 132 determines that the monitored person has left the bed (Yes), detects bed departure (S323), updates (changes) the state of the monitored person Ob to "after bed departure" (S324, state variable ← "after bed departure"), and ends the process S32. Otherwise, the action detection processing unit 132 determines that bed departure has not occurred (No) and ends the process S32.
  • In addition, the state of the monitored person Ob is updated (changed) to "before bed departure" (state variable ← "before bed departure") where appropriate.
  • Next, the sensor device SUa causes the detection result processing unit 133a of the SU control processing unit 13a to execute the detection result processing, in which the detection result of the action detection process S3 is processed by different processing methods based on the seating detection result of the seating detection unit 12 (S4), and then ends the present processing.
  • In this detection result processing S4, the detection result processing unit 133a first obtains the seating detection result of the seating detection unit 12 (S41).
  • Next, the detection result processing unit 133a determines whether or not the monitored person Ob is seated (S42). When, as a result of the determination, the seating detection result acquired in the process S41 indicates seating of the monitored person Ob (Yes), the process S43 is executed and the process S4 is ended. On the other hand, when the seating detection result acquired in the process S41 indicates no seating of the monitored person Ob (No), the processes S44 and S45 are sequentially executed and the process S4 is ended.
  • In the process S43, in order to process the detection result by the second processing method, in which the detection result is not displayed on the terminal devices SPa and TAa, the detection result processing unit 133a cancels and deletes the detection result detected in the process S3. Accordingly, the detection result processing unit 133a does not transmit the detection result detected in the process S3 to the management server device SVa.
  • That is, when the seating detection unit 12 detects the seating of the monitored person Ob at the time the sensor device SUa detects the predetermined action, the monitored person Ob is seated, and the detection result of the predetermined action such as bed departure, a fall, or a tumble is therefore highly likely to be a false detection; the detection result can thus be corrected by the seating detection result of the seating detection unit 12.
  • In particular, since the predetermined action is detected based on the image and the seating detection result is considered to have higher detection reliability than a detection result based on the image, the detection result can be corrected by the seating detection result with greater certainty.
  • In the process S44, the detection result processing unit 133a determines (confirms) the detection result detected in the process S3 as the final detection result. Then, in the process S45, in order to process the detection result by the first processing method, in which the detection result is displayed on the terminal devices SPa and TAa, the detection result processing unit 133a transmits the detection result determined in the process S44 to the management server device SVa. That is, in the present embodiment, the detection result processing unit 133a transmits the first action detection notification communication signal to the management server device SVa by the SU communication IF unit 14. (The overall per-frame flow S1 to S4 is sketched below.)
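  • Tying the processes S1 to S4 together, the per-frame operation of the sensor device can be summarised as the following loop; the helper callables (grab_frame, extract_regions, detect_action, process_detection_result, seating_detector) correspond to the earlier sketches and are likewise hypothetical.

```python
def sensor_frame_step(grab_frame, extract_regions, detect_action,
                      process_detection_result, seating_detector,
                      sensor_id, send_to_server, state):
    """One iteration of the per-frame processing S1 to S4 (illustrative only)."""
    image = grab_frame()                      # S1: acquire one frame as the target image
    regions = extract_regions(image)          # S2: extract person region HA and head region HD
    detection = detect_action(regions, state["before_bed_departure"])   # S3: action detection
    if detection == "bed departure":
        state["before_bed_departure"] = False   # S324: state variable <- "after bed departure"
    seated = seating_detector.sample() == "seating present"             # S41: seating result
    process_detection_result(detection, seated, sensor_id, image, send_to_server)  # S42 to S45
```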
  • Therefore, in the monitored person monitoring support system MSa, as shown in FIG. 12, when the action detection processing unit 132 detects a predetermined action and the seating detection unit 12 does not detect seating, the sensor device SUa transmits the first action detection notification communication signal to the management server device SVa (C2).
  • When the management server device SVa receives the first action detection notification communication signal, it stores the monitoring information contained in the received first action detection notification communication signal (C3) and transmits a second action detection notification communication signal based on the received first action detection notification communication signal to the predetermined terminal devices SPa and TAa (C4).
  • The terminal devices SPa and TAa display the monitoring information contained in the received second action detection notification communication signal on the monitoring information display screen (C5).
  • This monitoring information display screen is a screen for displaying monitoring information, and an example thereof is shown in FIG. In the example shown in FIG. 13, the monitoring information display screen 51 is displayed on the portable terminal device TAa.
  • the monitoring information display screen 51 displays the room name of the room RM in which a predetermined action is detected, the name of the detected predetermined action, and a target image when the predetermined action is detected.
  • For this purpose, the correspondence between each sensor ID and the name of the room RM in which the sensor device SUa having that sensor ID is disposed is stored in advance in the portable terminal device TAa. From this correspondence, the portable terminal device TAa retrieves the room name of the room RM corresponding to the sensor ID contained in the second action detection notification communication signal and displays the retrieved room name on the monitoring information display screen 51. The portable terminal device TAa also displays the name of the predetermined action and the target image contained in the second action detection notification communication signal on the monitoring information display screen 51. Thus, by referring to the monitoring information display screen 51, the supervisor (user) handling the portable terminal device TAa can recognize, from the room name, the monitored person Ob for whom the predetermined action was reported and where that person is, and can see the person's condition from the target image. (A sketch of this lookup and display is given below.)
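  • A compact sketch of the terminal-side handling described above, looking up the room name from the sensor ID and displaying the action name and target image; the correspondence table and the display callback are placeholders, not part of the disclosure.

```python
# Correspondence, stored in advance in the portable terminal device, between each
# sensor ID and the room name of the room RM in which that sensor device is disposed.
SENSOR_ID_TO_ROOM = {"SUa-1": "Room 101 (Mr. A)", "SUa-2": "Room 102 (Mr. B)"}

def show_monitoring_information(second_signal: dict, display) -> None:
    """Render the monitoring information display screen 51 (illustrative)."""
    room_name = SENSOR_ID_TO_ROOM.get(second_signal["sensor_id"], "unknown room")
    display(room_name, second_signal["action"], second_signal["target_image"])

# Usage example with a text-only stand-in for the screen.
if __name__ == "__main__":
    show_monitoring_information(
        {"sensor_id": "SUa-1", "action": "fall", "target_image": b""},
        lambda room, action, image: print(f"{room}: {action}"))
```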
  • On the other hand, when the action detection processing unit 132 detects a predetermined action and the seating detection unit 12 detects seating, the sensor device SUa cancels and deletes the detection of the predetermined action and therefore does not transmit the first action detection notification communication signal to the management server device SVa.
  • Accordingly, the management server device SVa does not transmit a second action detection notification communication signal based on a first action detection notification communication signal to the predetermined terminal devices SPa and TAa, and the terminal devices SPa and TAa do not display a monitoring information display screen 51 based on such a signal. As a result, the monitored person monitoring support system MSa can issue notifications more appropriately.
  • As described above, the monitored person monitoring support system MSa in the first embodiment and the monitored person monitoring support method implemented in it process the detection result based on the seating detection result, so that the reliability of the detection result can be increased and notifications can consequently be issued more appropriately.
  • More specifically, the monitored person monitoring support system MSa and the monitored person monitoring support method display the detection result on the terminal devices SPa and TAa when the seating detection result does not indicate seating of the monitored person Ob, and do not display the detection result on the terminal devices SPa and TAa when the seating detection result indicates seating of the monitored person Ob; the reliability of the displayed detection results is therefore enhanced, and notifications can be issued more appropriately.
  • On receiving the notification, the supervisor goes to the monitored person Ob for the purpose of assisting with, for example, excretion, washing, or getting dressed, or for the purpose of preventing a fall.
  • Alternatively, on receiving the notification, the supervisor visits the monitored person Ob for the purpose of rescuing the monitored person Ob.
  • In the present embodiment, the predetermined action includes at least one of bed departure, a fall, and a tumble, so suppressing false alarms reduces such unnecessary work and can effectively reduce the burden on the supervisor.
  • In the above description, the detection result processing unit 133a is provided in the sensor device SUa, but it may instead be provided in the management server device SVa, or may be provided in the terminal devices SPa and TAa.
  • FIG. 14 is a sequence diagram showing the operation of the monitored person monitoring system in the first modification in the first embodiment.
  • FIG. 15 is a sequence diagram showing an operation of the monitored person monitoring system in the second modified embodiment in the first embodiment.
  • In a modification in which the management server device SVa includes the detection result processing unit, the detection result processing unit 133a is omitted from the sensor device SUa, and the action detection processing unit 132, instead of the detection result processing unit 133a, transmits the first action detection notification communication signal to the management server device SVa.
  • In this case, the first action detection notification communication signal contains not only the sensor ID of the own device, the information indicating the content of the detected predetermined action of the monitored person Ob (in the present embodiment, one of bed departure, a fall, and a tumble), and the target image used for detecting the predetermined action, but also the seating detection result (in the present embodiment, either seating present or seating absent).
  • the management server SVa not only functionally includes the SV control unit 221 and the SV monitoring processing unit 222 in the SV control processing unit 22 but also functionally includes a detection result processing unit.
  • The detection result processing unit in the management server device SVa processes the detection result, which is contained in the first action detection notification communication signal received from the sensor device SUa as the information representing the content of the predetermined action of the monitored person Ob, by different processing methods based on the seating detection result of the seating detection unit 12 contained in that first action detection notification communication signal. More specifically, the detection result processing unit in the management server device SVa processes by the first processing method, which displays the detection result on the terminal devices SPa and TAa, when the seating detection result is not the seating of the monitored person Ob (when there is no seating), and processes by the second processing method, which does not display the detection result on the terminal devices SPa and TAa, when the seating detection result is the seating of the monitored person Ob (when there is seating). In this case, the second processing method is a processing method that does not transmit the detection result to the terminal devices SPa and TAa. That is, since the SV monitoring processing unit 222 transmits the second action detection notification communication signal based on the received first action detection notification communication signal to the predetermined terminal devices SPa and TAa as described above, the detection result processing unit in the management server device SVa permits the SV monitoring processing unit 222 to transmit the second action detection notification communication signal when the seating detection result is not the seating of the monitored person Ob (when there is no seating), and prohibits the SV monitoring processing unit 222 from transmitting the second action detection notification communication signal when the seating detection result is the seating of the monitored person Ob (when there is seating).
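  • As a minimal sketch (assumed names, not the actual implementation), the server-side permit/prohibit behaviour of this first modification can be illustrated as follows.

      # Hypothetical sketch: the detection result processing unit in the management
      # server device stores the monitoring information and forwards the second
      # signal only when the received seating detection result is "no seating".
      def handle_first_signal(signal: dict, store_monitoring_info, forward_to_terminals) -> None:
          # The monitoring information is always stored, so a suppressed (false)
          # notification can later be verified from the stored content.
          store_monitoring_info(signal)
          if signal.get("seating_detected"):
              # Seating present: prohibit transmission of the second signal.
              return
          # No seating: permit transmission to the predetermined terminal devices.
          forward_to_terminals({
              "sensor_id": signal["sensor_id"],
              "action": signal["action"],
              "target_image": signal["target_image"],
          })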
  • In the operation of this modification, when a predetermined action is detected by the action detection processing unit 132 and seating is detected by the seating detection unit 12 (C11), the sensor device SUa transmits to the management server device SVa a first action detection notification communication signal that further contains seating presence as the seating detection result (C12).
  • When the management server device SVa receives the first action detection notification communication signal, it stores the monitoring information contained in the received first action detection notification communication signal (C13); since the seating detection result indicates that seating is present, transmission of a second action detection notification communication signal based on the received first action detection notification communication signal to the predetermined terminal devices SPa and TAa is prohibited. Therefore, the management server device SVa does not transmit the second action detection notification communication signal based on the first action detection notification communication signal to the predetermined terminal devices SPa and TAa, and the terminal devices SPa and TAa do not display the monitoring information display screen 51 based on such a signal. As a result, the monitored person monitoring support system MSa can issue notifications more appropriately. Moreover, since the management server device SVa stores the monitoring information contained in the first action detection notification communication signal received in process C13, the monitor (user) can verify, by checking the stored content of the management server device SVa, that the false notification was stopped by the management server device SVa.
  • In the above description, the detection result processing unit in the management server device SVa causes the SV monitoring processing unit 222 to transmit or not to transmit the second action detection notification communication signal according to the absence or presence of seating in the seating detection result. Instead, the detection result processing unit in the management server device SVa may itself execute the transmission processing of the second action detection notification communication signal in place of the SV monitoring processing unit 222, as with the detection result processing unit 133a described above. In this case, the detection result processing unit in the management server device SVa executes or does not execute the transmission of the second action detection notification communication signal according to the absence or presence of seating in the seating detection result.
  • In the second modification, the terminal devices SPa and TAa include a detection result processing unit. In this case, as in the case where the management server device SVa includes the detection result processing unit as described above, the detection result processing unit 133a is omitted from the sensor device SUa, and the action detection processing unit 132 transmits to the management server device SVa a first action detection notification communication signal that further contains the seating detection result. The SV monitoring processing unit 222 accommodates in the second action detection notification communication signal not only the information contained in the received first action detection notification communication signal, namely the sensor ID, the information indicating the content of the predetermined action of the monitored person Ob (in this embodiment, one of bed leaving, falling, and falling down), and the target image, but also the seating detection result contained in the received first action detection notification communication signal.
  • The detection result processing unit in each of the terminal devices SPa and TAa processes the detection result, which is contained in the second action detection notification communication signal received from the management server device SVa as the information representing the content of the predetermined action of the monitored person Ob, by different processing methods based on the seating detection result of the seating detection unit 12 contained in the received second action detection notification communication signal. More specifically, the detection result processing unit in the terminal devices SPa and TAa processes by the first processing method, which displays the detection result on the terminal devices SPa and TAa, when the seating detection result is not the seating of the monitored person Ob (when there is no seating), and processes by the second processing method, which does not display the detection result on the terminal devices SPa and TAa, when the seating detection result is the seating of the monitored person Ob (when there is seating). In this case, the second processing method is simply a processing method that does not display the detection result. For example, the detection result processing unit in the terminal devices SPa and TAa displays the detection result on the monitoring information display screen 51 when the seating detection result is not the seating of the monitored person Ob (no seating), and does not display the detection result when the seating detection result is the seating of the monitored person Ob (seating present).
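  • As a minimal sketch (assumed names, not the actual implementation), the terminal-side decision in this second modification can be illustrated as follows.

      # Hypothetical sketch: a terminal device decides from the seating detection
      # result in the second signal whether to show the monitoring screen at all.
      def on_second_signal_received(signal: dict, show_monitoring_screen) -> None:
          if signal.get("seating_detected"):
              # Second processing method: the monitored person is seated, so the
              # detection result is not displayed (the screen is simply not shown).
              return
          # First processing method: display the detection result.
          show_monitoring_screen(sensor_id=signal["sensor_id"],
                                 action=signal["action"],
                                 image=signal["target_image"])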
  • In the operation of this second modification, when the action detection processing unit 132 detects a predetermined action and the seating detection unit 12 detects no seating, the processes C1 to C5 shown in FIG. 12 described above are executed, except that the seating detection result is accommodated in each of the first and second action detection notification communication signals.
  • On the other hand, when a predetermined action is detected by the action detection processing unit 132 and seating is detected by the seating detection unit 12 (C21), the sensor device SUa transmits to the management server device SVa a first action detection notification communication signal that further contains seating presence as the seating detection result (C22).
  • When the management server device SVa receives the first action detection notification communication signal, it stores the monitoring information contained in the received first action detection notification communication signal (C23) and transmits to the predetermined terminal devices SPa and TAa a second action detection notification communication signal that is based on the received first action detection notification communication signal and further contains the seating detection result (C24).
  • When the terminal devices SPa and TAa receive the second action detection notification communication signal, the seating detection result indicates that seating is present, so the terminal devices SPa and TAa do not display the monitoring information display screen 51 based on the received second action detection notification communication signal. As a result, the monitored person monitoring support system MSa can issue notifications more appropriately. Further, since the management server device SVa stores the monitoring information contained in the first action detection notification communication signal received in process C23, the monitor (user) can verify, by checking the stored content of the management server device SVa, that the false notification was stopped by the terminal devices SPa and TAa.
  • FIG. 16 is a flowchart showing an operation in the processing of the detection result shown in FIG. 7 in the second embodiment.
  • FIG. 17 is a flow chart showing the operation of the mobile terminal device regarding the display of the monitoring information display screen in the second embodiment.
  • FIG. 18 is a view showing another example of the monitoring information display screen displayed on the mobile terminal device in the monitored person monitoring support system in the second embodiment.
  • The monitored person monitoring support system MSa in the first embodiment processes the detection result by different processing methods that determine, based on the seating detection result, whether or not to display the detection result on the terminal device, whereas the monitored person monitoring support system in the second embodiment processes the detection result by different processing methods that cause the detection result to be displayed on the terminal device in different display modes based on the seating detection result.
  • The monitored person monitoring support system MSb in the second embodiment includes one or more sensor devices SUb (SUb-1 to SUb-4), the management server device SVa, the fixed terminal device SPb, one or more mobile terminal devices TAb (TAb-1, TAb-2), and a private branch exchange CX, which are communicably connected via a network NW.
  • the management server device SVa in the monitored person monitoring support system MSb of the second embodiment is the same as the management server device SVa in the monitored person monitoring support system MSa of the first embodiment, and thus the description thereof will be omitted.
  • the sensor device SUb includes an imaging unit 11, a seating detection unit 12, an SU control processing unit 13b, an SU communication IF unit 14, and an SU storage unit 15.
  • The imaging unit 11, the seating detection unit 12, the SU communication IF unit 14, and the SU storage unit 15 in the sensor device SUb according to the second embodiment are respectively the same as the imaging unit 11, the seating detection unit 12, the SU communication IF unit 14, and the SU storage unit 15 in the sensor device SUa according to the first embodiment, so their description is omitted. The SU control processing unit 13b is a circuit for controlling each unit of the sensor device SUb according to the function of each unit, detecting a preset predetermined action related to the monitored person Ob, and transmitting the detection result together with an image to the management server device SVa. In the present embodiment, the SU control processing unit 13b further detects seating of the monitored person Ob and, when transmitting the detection result to the management server device SVa, processes the detection result by different processing methods based on the seating detection result of the seating detection unit 12.
  • the SU control processing unit 13b functionally includes an SU control unit 131, an action detection processing unit 132, and a detection result processing unit 133b by executing a control processing program.
  • The SU control unit 131 and the action detection processing unit 132 in the SU control processing unit 13b of the second embodiment are respectively the same as the SU control unit 131 and the action detection processing unit 132 in the SU control processing unit 13a of the first embodiment, so their description is omitted.
  • The detection result processing unit 133b processes the detection result of the action detection processing unit 132 by different processing methods based on the seating detection result of the seating detection unit 12. More specifically, in the present embodiment, the detection result processing unit 133b processes by a third processing method, which displays the detection result on the terminal devices SPb and TAb in a predetermined first display mode, when the seating detection result is not the seating of the monitored person Ob, and processes by a fourth processing method, which displays the detection result on the terminal devices SPb and TAb in a predetermined second display mode different from the first display mode, when the seating detection result is the seating of the monitored person Ob.
  • The first and second display modes are display modes indicating the degree of reliability of the detection result, and the second display mode indicates a lower degree of reliability than the first display mode. More specifically, in the present embodiment the predetermined actions include not only bed leaving, falling, and falling down but also suspected bed leaving (a state in which the monitored person Ob may have left the bedding), suspected falling (a state in which the monitored person Ob may have fallen from the bedding), and suspected falling down (a state in which the monitored person Ob may have fallen down). Suspected bed leaving is a detection result with lower reliability than bed leaving, suspected falling is a detection result with lower reliability than falling, and suspected falling down is a detection result with lower reliability than falling down.
  • In the third processing method, when the seating detection result is not the seating of the monitored person Ob, the detection result processing unit 133b finalizes the detection result as it is and causes it to be displayed as it is on the terminal devices SPb and TAb. That is, in the third processing method, when the action detection processing unit 132 detects the predetermined action and the seating detection result is not the seating of the monitored person Ob, the detection result processing unit 133b transmits the first action detection notification communication signal to the management server device SVa by the SU communication IF unit 14. In the fourth processing method, when the seating detection result is the seating of the monitored person Ob, the detection result processing unit 133b changes the detection result into a suspected detection result and causes the suspected detection result to be displayed. More specifically, when the detection result is bed leaving, the detection result processing unit 133b changes the bed leaving into suspected bed leaving if the seating detection result is the seating of the monitored person Ob. When the detection result is falling, the detection result processing unit 133b changes the falling into suspected falling if the seating detection result is the seating of the monitored person Ob. When the detection result is falling down, the detection result processing unit 133b changes the falling down into suspected falling down if the seating detection result is the seating of the monitored person Ob. That is, in the fourth processing method as well, when the action detection processing unit 132 detects the predetermined action and the seating detection result is the seating of the monitored person Ob, the detection result processing unit 133b transmits the first action detection notification communication signal to the management server device SVa by the SU communication IF unit 14.
  • In the second display mode, the detected predetermined action of the monitored person Ob is displayed, for example, on the monitoring information display screens 52 and 53 shown in FIG. 18 described later, and the monitoring information display screens 52 and 53 indicate, by a display mode different from the first display mode, that the detection result has relatively low reliability. In the present embodiment, the first action detection notification communication signal contains, for example, the sensor ID of the own device, information indicating the content of the detected predetermined action of the monitored person Ob (in this embodiment, one of bed leaving, falling, falling down, suspected bed leaving, suspected falling, and suspected falling down), and the target image used for detecting the predetermined action.
  • That is, in the third processing method, one of bed leaving, falling, and falling down is accommodated in the first action detection notification communication signal, whereas in the fourth processing method, one of suspected bed leaving, suspected falling, and suspected falling down is accommodated in the first action detection notification communication signal.
  • The fixed terminal device SPb and the mobile terminal device TAb are respectively the same as the fixed terminal device SPa and the mobile terminal device TAa of the first embodiment, except that they display, in the second display mode, the suspected detection result (in the present embodiment, any of suspected bed leaving, suspected falling, and suspected falling down) contained in the second action detection notification communication signal received from the management server device SVa.
  • The sensor device SUb, like the sensor device SUa according to the first embodiment, executes the operation shown in FIG. 7 every frame or every several frames. In the processing of the detection result, the detection result processing unit 133b first acquires the seating detection result of the seating detection unit 12 (S51).
  • Next, the detection result processing unit 133b determines whether or not the monitored person Ob is seated (S52). As a result of this determination, when the seating detection result acquired in process S51 indicates that the monitored person Ob is seated (Yes), process S53 is performed, then process S55 is performed, and this process S5 ends. On the other hand, when the seating detection result acquired in process S51 indicates that the monitored person Ob is not seated (No), process S54 is performed, then process S55 is performed, and this process S5 ends. In process S53, the detection result processing unit 133b changes the detection result detected in process S3 into a suspected detection result. That is, when the detection result of process S3 is bed leaving, falling, or falling down, the detection result processing unit 133b changes it into suspected bed leaving, suspected falling, or suspected falling down, respectively. If the seating detection unit 12 detects the seating of the monitored person Ob when the sensor device SUb detects the predetermined action, the monitored person Ob is seated, so the detection result of the predetermined action, such as bed leaving, falling, or falling down, has lower reliability than when the seating detection unit 12 does not detect the seating of the monitored person Ob, and the detection result can be corrected by the seating detection result of the seating detection unit 12. In particular, since the predetermined action is detected based on the image, and the seating detection result is considered to have higher detection reliability than the detection result based on the image, the detection result can be corrected by the seating detection result all the more.
  • In process S54, the detection result processing unit 133b finalizes the detection result detected in process S3 as it is (determines it as the final detection result) so that the detection result is displayed on the terminal devices SPb and TAb in the first display mode, which indicates that the detection result has relatively high reliability. In process S55, the detection result processing unit 133b transmits a first action detection notification communication signal to the management server device SVa by the SU communication IF unit 14. That is, in process S55 after process S53 has been executed, the first action detection notification communication signal contains the sensor ID of the own device, any one of suspected bed leaving, suspected falling, and suspected falling down, and the target image; on the other hand, in process S55 after process S54 has been executed, the first action detection notification communication signal contains the sensor ID of the own device, any one of bed leaving, falling, and falling down, and the target image.
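  • As a minimal sketch (assumed names and payload keys, not the actual implementation), processes S51 to S55 on the sensor device side can be illustrated as follows: when seating is detected, the detection result is downgraded to its suspected counterpart before the first signal is sent.

      # Hypothetical sketch of the second-embodiment sensor-side flow (S51-S55).
      SUSPECTED = {
          "bed leaving": "suspected bed leaving",
          "falling": "suspected falling",
          "falling down": "suspected falling down",
      }

      def process_s5(action: str, seated: bool, sensor_id: str, target_image: bytes, send) -> None:
          if seated:                                  # S52 -> Yes
              action = SUSPECTED.get(action, action)  # S53: change to the suspected result
          # (S54: otherwise the detection result is finalized as it is)
          send({"sensor_id": sensor_id,               # S55: transmit the first signal
                "action": action,
                "target_image": target_image})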
  • When receiving the first action detection notification communication signal from the sensor device SUb, the management server device SVa, substantially as in the first embodiment, stores the monitoring information contained in the received first action detection notification communication signal and transmits a second action detection notification communication signal based on the received first action detection notification communication signal to the predetermined terminal devices SPb and TAb. In the present embodiment, since the predetermined actions include not only bed leaving, falling, and falling down but also suspected bed leaving, suspected falling, and suspected falling down, the second action detection notification communication signal accommodates any one of bed leaving, falling, falling down, suspected bed leaving, suspected falling, and suspected falling down.
  • When the terminal devices SPb and TAb receive the second action detection notification communication signal, they display the monitoring information contained in the received second action detection notification communication signal on the monitoring information display screen. More specifically, as shown in FIG. 17, the terminal devices SPb and TAb first determine what the detection result detected by the sensor device SUb is (S61); that is, the terminal devices SPb and TAb determine what the information representing the content of the predetermined action contained in the received second action detection notification communication signal is. As a result of this determination, when the information indicating the content of the predetermined action is any one of bed leaving, falling, and falling down, the terminal devices SPb and TAb execute process S63 of displaying the detection result in the first display mode and then end this processing. On the other hand, when the information indicating the content of the predetermined action is any one of suspected bed leaving, suspected falling, and suspected falling down, the terminal devices SPb and TAb execute process S62 of displaying the suspected detection result in the second display mode and then end this processing.
  • More specifically, in process S63, the terminal devices SPb and TAb display the monitoring information contained in the received second action detection notification communication signal, for example, on the monitoring information display screen 51 shown in FIG. 13. This monitoring information display screen 51 displays bed leaving, falling, or falling down as text as it is, thereby indicating that the detection result has relatively high reliability compared with the monitoring information display screens 52 and 53 shown in FIG. 18.
  • On the other hand, in process S62, the terminal devices SPb and TAb display the monitoring information contained in the received second action detection notification communication signal, for example, on the monitoring information display screen 52 shown in FIG. 18A or the monitoring information display screen 53 shown in FIG. 18B. In this example, the monitoring information display screens 52 and 53 are displayed on the mobile terminal device TAb. On the monitoring information display screen 52 shown in FIG. 18A, the room name of the room RM in which the predetermined action was detected, the name of the detected predetermined action, and an exclamation mark (!) are displayed. By changing the character attributes (type of font, font size, character color, and display density (display luminance)) of each character in the room name of the room RM and in the name of the action, and by adding a symbol expressing relatively low reliability, such as the exclamation mark (!), this screen indicates to the monitor that the detection result has relatively low reliability compared with the monitoring information display screen 51 shown in FIG. 13. In this example, each character displayed on the monitoring information display screen 52 shown in FIG. 18A is displayed with a lighter display density than on the monitoring information display screen 51 shown in FIG. 13; in other respects, the screen is similar to the monitoring information display screen 51 shown in FIG. 13.
  • On the monitoring information display screen 53 shown in FIG. 18B, as on the monitoring information display screen 51 shown in FIG. 13, the room name of the room RM in which the predetermined action was detected, the name of the detected predetermined action, and the target image at the time the predetermined action was detected are displayed. However, on this monitoring information display screen 53, the character attributes of each character are changed from the character attributes of each character shown on the monitoring information display screen 51 of the first display mode, and the display size (resolution) of the target image is also changed, thereby indicating that the detection result has relatively low reliability compared with the monitoring information display screen 51 shown in FIG. 13. In this example, each character displayed on the monitoring information display screen 53 shown in FIG. 18B is displayed in a smaller point size than on the monitoring information display screen 51 shown in FIG. 13, and the target image is displayed in a smaller display size (lower resolution) than on the monitoring information display screen 51 shown in FIG. 13. Displaying the text "suspected" also indicates that the detection result has relatively low reliability compared with the monitoring information display screen 51 shown in FIG. 13.
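  • As a minimal sketch (the attribute values and names below are illustrative assumptions, not the disclosed screen design), the terminal-side classification of processes S61 to S63 and the associated display attributes can be illustrated as follows.

      # Hypothetical sketch: suspected detection results are rendered in the second
      # display mode (smaller, lighter characters, reduced image size), others in the
      # first display mode.
      FIRST_DISPLAY_MODE = {"font_pt": 18, "density": 1.0, "image_scale": 1.0}
      SECOND_DISPLAY_MODE = {"font_pt": 12, "density": 0.5, "image_scale": 0.5}

      def choose_display_mode(action: str) -> dict:
          """S61: classify the received detection result and pick the display mode."""
          if action.startswith("suspected"):
              return SECOND_DISPLAY_MODE   # S62: relatively low reliability
          return FIRST_DISPLAY_MODE        # S63: relatively high reliability

      print(choose_display_mode("suspected falling"))
      # -> {'font_pt': 12, 'density': 0.5, 'image_scale': 0.5}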
  • As described above, the monitored person monitoring support system MSb in the second embodiment and the monitored person monitoring support method implemented therein change the display mode of the detection result based on the seating detection result, so the reliability of the detection result based on the seating detection result can be recognized from the display mode, and notifications can therefore be issued more appropriately together with the reliability of the detection result.
  • In the above description, the terminal devices SPb and TAb display the detection result in the first display mode or the second display mode according to the absence or presence of seating in the seating detection result, but instead of or in addition to this, they may output a relatively loud notification sound or a relatively quiet notification sound. Furthermore, instead of or in addition to these, the terminal devices SPb and TAb may output a notification sound with a relatively long duration or a notification sound with a relatively short duration.
  • FIG. 19 is a view for explaining a modification of the second embodiment.
  • FIG. 19A shows the case where the seating detection result shows no seating
  • FIG. 19B shows the case where the seating detection result shows seating.
  • In this modification, when the seating detection result shows no seating, the terminal devices SPb and TAb display the monitoring information contained in the received second action detection notification communication signal, for example, on the monitoring information display screen 51, as illustrated in FIG. 19A, and output a relatively loud notification sound with a relatively long duration (for example, a ringing tone). On the other hand, when the seating detection result shows seating, the terminal devices SPb and TAb display the monitoring information contained in the received second action detection notification communication signal, for example, on the monitoring information display screen 53 shown in FIG. 18B, as illustrated in FIG. 19B, and output a relatively quiet notification sound with a relatively short duration (for example, a short beep such as "pipi").
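  • As a minimal sketch (the volume, duration, and tone values below are illustrative assumptions), the notification sound selection in this modification can be illustrated as follows.

      # Hypothetical sketch: varying the notification sound with the seating
      # detection result, in addition to (or instead of) the display mode.
      def choose_notification_sound(seating_detected: bool) -> dict:
          if seating_detected:
              # Relatively quiet and short sound, e.g. a brief "pipi" beep.
              return {"volume": 0.3, "duration_s": 0.5, "tone": "beep"}
          # Relatively loud and long sound, e.g. a ringing tone.
          return {"volume": 1.0, "duration_s": 5.0, "tone": "ring"}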
  • The monitored person monitoring support system MSa in the first embodiment processes the detection result by different processing methods that determine, based on the seating detection result, whether or not to display the detection result on the terminal device, whereas the monitored person monitoring support system in the third embodiment processes the detection result by different processing methods that determine, based on the seating detection result, whether or not the management server device stores and manages the detection result.
  • For example, as shown in FIG. 1, the monitored person monitoring support system MSc in the third embodiment includes one or more sensor devices SUc (SUc-1 to SUc-4), a management server device SVc, the fixed terminal device SPa, one or more mobile terminal devices TAa (TAa-1, TAa-2), and a private branch exchange CX, which are communicably connected via a network NW.
  • The fixed terminal device SPa and the mobile terminal device TAa in the monitored person monitoring support system MSc of the third embodiment are respectively the same as the fixed terminal device SPa and the mobile terminal device TAa in the monitored person monitoring support system MSa of the first embodiment, so their description is omitted.
  • For example, as shown in FIG. 2, the sensor device SUc in the monitored person monitoring support system MSc of the third embodiment includes an imaging unit 11, a seating detection unit 12, an SU control processing unit 13c, an SU communication IF unit 14, and an SU storage unit 15. The imaging unit 11, the seating detection unit 12, the SU communication IF unit 14, and the SU storage unit 15 in the sensor device SUc according to the third embodiment are respectively the same as the imaging unit 11, the seating detection unit 12, the SU communication IF unit 14, and the SU storage unit 15 in the sensor device SUa according to the first embodiment, so their description is omitted. The SU control processing unit 13c is a circuit for controlling each unit of the sensor device SUc according to the function of each unit, detecting a preset predetermined action related to the monitored person Ob, and transmitting the detection result together with an image to the management server device SVc. In the present embodiment, the SU control processing unit 13c further detects seating of the monitored person Ob and, when transmitting the detection result to the management server device SVc, processes the detection result by different processing methods based on the seating detection result of the seating detection unit 12.
  • the SU control processing unit 13c functionally includes an SU control unit 131, an action detection processing unit 132, and a detection result processing unit 133c by executing a control processing program.
  • The SU control unit 131 and the action detection processing unit 132 in the SU control processing unit 13c of the third embodiment are respectively the same as the SU control unit 131 and the action detection processing unit 132 in the SU control processing unit 13a of the first embodiment, so their description is omitted.
  • The detection result processing unit 133c processes the detection result of the action detection processing unit 132 by different processing methods based on the seating detection result of the seating detection unit 12. More specifically, in the present embodiment, the detection result processing unit 133c processes by a fifth processing method, which causes the management server device SVc to store and manage the detection result, when the seating detection result is not the seating of the monitored person Ob, and processes by a sixth processing method, which does not cause the management server device SVc to store the detection result, when the seating detection result is the seating of the monitored person Ob. More specifically, the sixth processing method is a processing method that does not transmit the detection result to the management server device SVc. That is, in the fifth processing method, the detection result processing unit 133c transmits the first action detection notification communication signal to the management server device SVc by the SU communication IF unit 14 when the seating detection result is not the seating of the monitored person Ob, whereas in the sixth processing method, the detection result processing unit 133c does not transmit the first action detection notification communication signal when the seating detection result is the seating of the monitored person Ob.
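  • As a minimal sketch (assumed names, not the actual implementation), the fifth and sixth processing methods on the sensor device side can be illustrated as follows: the first signal is simply not transmitted when seating is detected, so the management server device never stores that detection result.

      # Hypothetical sketch of the third-embodiment sensor-side decision.
      def process_for_storage(signal: dict, seated: bool, send_to_server) -> bool:
          """Return True if the detection result will be stored and managed by the server."""
          if seated:
              return False          # Sixth processing method: do not transmit, nothing is stored.
          send_to_server(signal)    # Fifth processing method: the server stores and manages it.
          return True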
  • As described above, the monitored person monitoring support system MSc in the third embodiment and the monitored person monitoring support method implemented therein control execution or non-execution of storage of the detection result based on the seating detection result, so only detection results with higher reliability can be stored and managed.
  • the detection result processing unit 133c is provided in the sensor device SUc, but may be provided in the management server device SVc instead of the sensor device SUc.
  • FIG. 20 is a sequence diagram showing the operation of the monitored person monitoring system of the modification in the third embodiment.
  • In this modification, the management server device SVc includes the detection result processing unit, the detection result processing unit 133c is omitted from the sensor device SUc, and the action detection processing unit 132, instead of the detection result processing unit 133c, transmits the first action detection notification communication signal to the management server device SVc. In this case, the first action detection notification communication signal contains not only the sensor ID of the own device, the information indicating the content of the detected predetermined action of the monitored person Ob (in this embodiment, one of bed leaving, falling, and falling down), and the target image used for detecting the predetermined action, but also the seating detection result (in this embodiment, either seating present or no seating).
  • the management server SVc not only functionally includes the SV control unit 221 and the SV monitoring processing unit 222 in the SV control processing unit 22 but also functionally includes a detection result processing unit.
  • The detection result processing unit in the management server device SVc processes the detection result, which is contained in the first action detection notification communication signal received from the sensor device SUc as the information representing the content of the predetermined action of the monitored person Ob, by different processing methods based on the seating detection result of the seating detection unit 12 contained in the received first action detection notification communication signal. More specifically, the detection result processing unit in the management server device SVc processes by the fifth processing method, which stores and manages the detection result in the management server device SVc, when the seating detection result is not the seating of the monitored person Ob (when there is no seating), and processes by the sixth processing method, which does not store and manage the detection result in the management server device SVc, when the seating detection result is the seating of the monitored person Ob (when there is seating). That is, since the SV monitoring processing unit 222 stores the monitoring information contained in the received first action detection notification communication signal in the SV storage unit 23 and manages it as described above, the detection result processing unit in the management server device SVc permits the SV monitoring processing unit 222 to store and manage the monitoring information when the seating detection result is not the seating of the monitored person Ob (when there is no seating), and prohibits the SV monitoring processing unit 222 from storing and managing the monitoring information and discards the monitoring information when the seating detection result is the seating of the monitored person Ob (when there is seating).
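  • As a minimal sketch (assumed names; the list below merely stands in for the SV storage unit 23), the server-side store-or-discard decision of this modification can be illustrated as follows.

      # Hypothetical sketch: the management server device stores the monitoring
      # information only when the received seating detection result is "no seating".
      monitoring_log: list[dict] = []   # stand-in for the SV storage unit 23

      def handle_first_signal_for_storage(signal: dict) -> None:
          if signal.get("seating_detected"):
              # Seating present (C33): discard the monitoring information,
              # do not store or manage it.
              return
          monitoring_log.append({
              "sensor_id": signal["sensor_id"],
              "action": signal["action"],
              "target_image": signal["target_image"],
          })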
  • In the operation of this modification, when the action detection processing unit 132 detects a predetermined action and the seating detection unit 12 detects no seating, the processes C1 to C5 shown in FIG. 12 described above are executed, except that the seating detection result is further accommodated in the first action detection notification communication signal.
  • On the other hand, when a predetermined action is detected by the action detection processing unit 132 and seating is detected by the seating detection unit 12 (C31), the sensor device SUc transmits to the management server device SVc a first action detection notification communication signal that further contains seating presence as the seating detection result (C32).
  • When the management server device SVc receives the first action detection notification communication signal, it discards the monitoring information in accordance with the seating detection result of seating presence accommodated in the received first action detection notification communication signal, and neither stores nor manages it (C33).
  • A monitored person monitoring support system according to one aspect includes: a sensor device that is provided corresponding to a monitored person to be monitored and detects a predetermined action related to the monitored person; a central processing unit that is communicably connected to the sensor device and manages the detection result received from the sensor device; and a terminal device that is communicably connected to the central processing unit and receives and displays the detection result via the central processing unit. This monitored person monitoring support system for supporting the monitoring of the monitored person further includes a seating detection unit that detects seating of the monitored person, and a detection result processing unit that processes the detection result by different processing methods based on the seating detection result of the seating detection unit.
  • Preferably, in the above monitored person monitoring support system, the sensor device generates an image in which the figure of the monitored person is captured and detects the predetermined action based on the generated image, and the seating detection unit detects the presence or absence of seating of the monitored person. For this reason, when the seating detection unit detects seating of the monitored person at the time the sensor device detects the predetermined action, the monitored person is seated, so there is a high possibility that the detection result of the predetermined action is a false detection, and the detection result can be corrected by the seating detection result of the seating detection unit. Since the above monitored person monitoring support system processes the detection result based on the seating detection result of the seating detection unit, the reliability of the detection result can be enhanced, and notifications can therefore be issued more appropriately.
  • Preferably, in the above monitored person monitoring support system, the detection result processing unit processes by a first processing method, which displays the detection result on the terminal device, when the seating detection result is not the seating of the monitored person, and processes by a second processing method, which does not display the detection result on the terminal device, when the seating detection result is the seating of the monitored person. Such a monitored person monitoring support system displays the detection result on the terminal device when the seating detection result is not the seating of the monitored person and does not display the detection result on the terminal device when the seating detection result is the seating of the monitored person, so the reliability of the displayed detection results can be enhanced, and notifications can therefore be issued more appropriately.
  • Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the sensor device, and the second processing method is a processing method that does not transmit the detection result to the management server device. Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the management server device, and the second processing method is a processing method that does not transmit the detection result to the terminal device. Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the terminal device, and the second processing method is a processing method that does not display the detection result.
  • Preferably, in the above monitored person monitoring support system, the detection result processing unit processes by a third processing method, which displays the detection result on the terminal device in a predetermined first display mode, when the seating detection result is not the seating of the monitored person, and processes by a fourth processing method, which displays the detection result on the terminal device in a predetermined second display mode different from the first display mode, when the seating detection result is the seating of the monitored person. Preferably, in the above monitored person monitoring support system, the first and second display modes are display modes indicating the degree of reliability of the detection result, and the second display mode is a display mode indicating a lower degree of reliability than the first display mode.
  • When the seating detection unit detects seating of the monitored person at the time the sensor device detects the predetermined action, the monitored person is seated, so the detection result of the predetermined action has lower reliability than when the seating detection unit does not detect the seating of the monitored person, and the detection result can be corrected by the seating detection result of the seating detection unit. The above monitored person monitoring support system changes the display mode of the detection result based on the seating detection result of the seating detection unit, so the reliability of the detection result based on the seating detection result of the seating detection unit can be recognized from the display mode, and notifications can therefore be issued more appropriately together with the reliability of the detection result.
  • Preferably, in the above monitored person monitoring support system, the detection result processing unit processes by a fifth processing method, which stores and manages the detection result in the management server device, when the seating detection result is not the seating of the monitored person, and processes by a sixth processing method, which does not store and manage the detection result in the management server device, when the seating detection result is the seating of the monitored person. Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the sensor device, and the sixth processing method is a processing method that does not transmit the detection result to the management server device. Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the management server device, and the sixth processing method is a processing method that does not store and manage the detection result. Such a monitored person monitoring support system controls execution or non-execution of storage based on the seating detection result of the seating detection unit, so only detection results with higher reliability can be stored and managed.
  • Preferably, in the above monitored person monitoring support system, the predetermined action is set in advance and includes at least one of bed leaving in which the monitored person leaves the bed, falling in which the monitored person falls from the bed, and falling down in which the monitored person falls down. Generally, in the case of bed leaving, the monitor who receives the notification goes to the monitored person, for example, for the purpose of assisting with excretion, washing, or getting ready, or for the purpose of preventing a fall. In the case of falling or falling down, the monitor who receives the notification goes to the monitored person for the purpose of rescuing the monitored person. Therefore, if the predetermined action includes at least one of bed leaving, falling, and falling down, being notified more appropriately reduces unnecessary work caused by false notifications, and the burden on the monitor can be effectively reduced.
  • Preferably, in the above monitored person monitoring support system, the seating detection unit includes a contact-type seating sensor. Preferably, in the above monitored person monitoring support system, the seating detection unit includes a non-contact-type seating sensor.
  • Preferably, in the above monitored person monitoring support system, the sensor device further includes an imaging unit that generates an image, and the terminal device displays the image received from the sensor device via the management server device. Such a monitored person monitoring support system makes it possible to recognize the situation of the monitored person on the terminal device from the image.
  • Preferably, in the above monitored person monitoring support system, the terminal device is a mobile terminal device. In such a monitored person monitoring support system, the monitor can carry the terminal device.
  • A monitored person monitoring support method according to another aspect is a monitored person monitoring support method of a monitored person monitoring support system that includes: a sensor device that is provided corresponding to a monitored person to be monitored and detects a predetermined action related to the monitored person; a central processing unit that is communicably connected to the sensor device and manages the detection result received from the sensor device; and a terminal device that is communicably connected to the central processing unit and receives and displays the detection result via the central processing unit, the system supporting the monitoring of the monitored person. The method includes a seating detection step of detecting seating of the monitored person, and a detection result processing step of processing the detection result by different processing methods based on the seating detection result of the seating detection step.
  • Since the above monitored person monitoring support method processes the detection result based on the seating detection result of the seating detection step, the reliability of the detection result can be enhanced, and notifications can therefore be issued more appropriately.
  • According to the present invention, it is possible to provide a monitored person monitoring support system and a monitored person monitoring support method for supporting the monitoring of a monitored person.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nursing (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Business, Economics & Management (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)
  • Accommodation For Nursing Or Treatment Tables (AREA)
  • Telephonic Communication Services (AREA)

Abstract

This monitored person monitoring assistance system and monitored person monitoring assistance method comprise: a sensor device that is provided in correspondence with a monitored person and that detects a prescribed behavior associated with the monitored person; a central processing device that is communicably connected to the sensor device, and that manages a detection result received from the sensor device; and a terminal device that is communicably connected to the central processing device, and that receives the detection result via the central processing device and displays the detection result. Seating of the monitored person is detected, and the detection result is processed with a different processing method on the basis of this seating detection result.

Description

Monitored person monitoring support system and monitored person monitoring support method
 The present invention relates to a monitored person monitoring support system and a monitored person monitoring support method for supporting the monitoring of a monitored person.
 In recent years, with the arrival of a super-aging society, techniques that supplement work in fields such as the nursing and caregiving industries have been researched and developed, and are disclosed, for example, in Patent Documents 1 to 3.
 The fall detection device disclosed in Patent Document 1 includes: a height detection unit that, based on a human body image including a partial image representing a human body to be detected, position coordinate information representing the position of the partial image on the coordinates, and size information representing the size of the human body represented in the partial image, detects whether or not the height of the human body has decreased by comparing height information included in the size information with height information included in size information acquired in the past; an appearance feature quantity extraction unit that, when the height has decreased, extracts an appearance feature quantity based on the appearance of the human body image; an appearance detection unit that identifies whether or not the human body is in a fallen state based on the result of referring to a first appearance dictionary using the appearance feature quantity; and a movement distance detection unit that, when the fallen state is identified, calculates the movement distance of the partial image included in the human body image based on the position coordinate information and position coordinate information acquired in the past, compares the movement distance with a threshold, detects based on the comparison result whether or not the movement distance of the partial image representing the human body is smaller than the threshold, and detects that the human body included in the human body image has fallen when the movement distance is small.
 The wheelchair with a seating sensor disclosed in Patent Document 2 is a wheelchair equipped with position detection means for detecting the position of the wheelchair and communication means for transmitting the detected position of the wheelchair to a management center, and further includes a seating sensor that detects that a person is sitting in the wheelchair.
 The call system disclosed in Patent Document 3 includes: a floor detection sensor that detects an object present on the floor near the bed used by the monitored person; a human body detection sensor that detects an object near the bed so as to detect the monitored person when the monitored person stands up on the floor near the bed and not to detect the monitored person when the monitored person falls from the bed; a notification device that notifies that the monitored person has left the bed; a storage device that stores notification reason information indicating the reason for the call while distinguishing between the monitored person moving away from the bed and the monitored person falling from the bed; a timer unit that measures time; and a control device having a control unit that operates the notification device and the timer unit when the floor detection sensor detects an object, stores in the storage device notification reason information indicating that the monitored person has moved away from the bed when the human body detection sensor detects an object before the time measured by the timer unit reaches a predetermined time, and stores in the storage device notification reason information indicating that the monitored person has fallen from the bed when the human body detection sensor does not detect an object even after the time measured by the timer unit reaches the predetermined time. The floor detection sensor 1 is configured by an infrared sensor or the like, and is installed at a position where it can detect the foot of the monitored person when the monitored person using the bed tries to leave the bed, and the body of the monitored person when the monitored person falls from the bed (paragraph [0015] of Patent Document 3, etc.). The human body detection sensor 2 is configured by an infrared sensor or the like, and is installed at a position where it can detect the body of the monitored person (for example, the head of the monitored person) when the monitored person using the bed stands up on the floor near the bed where the floor detection sensor 1 is installed, and cannot detect the body of the monitored person when the monitored person using the bed falls from the bed (paragraph [0018] of Patent Document 3, etc.). Therefore, the call system disclosed in Patent Document 3 distinguishes between the monitored person moving away from the bed and the monitored person falling from the bed by making the installation height of the infrared sensor serving as the floor detection sensor different from the installation height of the infrared sensor serving as the human body detection sensor.
 When a monitored person such as a care receiver or nursing-care receiver falls, and a monitor such as a nurse or caregiver receives a notification of the fall or the like from the system, the monitor usually takes action such as going to the monitored person who is the subject of the notification. However, if the notification is a false notification, in which the system judged a state that is not actually a fall or the like of the monitored person to be a fall or the like, the monitor's response is wasted and labor is lost. On the other hand, if a missed notification occurs, in which the system cannot detect the state of a fall or the like and does not issue a notification, the fall or the like is left unattended, which may develop into a more serious situation. Therefore, it is desirable that errors such as false notifications and missed notifications be further suppressed so that notifications are issued more appropriately.
 When a fall is detected on the basis of an image, as in the fall detection device disclosed in Patent Document 1, it is difficult to distinguish, for example, a state in which a person has fallen onto their buttocks from a state in which the person is sitting on a chair and leaning back or bending forward, because the two look similar in the image. Likewise, a state of sitting on a chair or wheelchair and a state of having slid off the chair or wheelchair look similar in the image and are hard to tell apart. As a result, false alarms may occur. In particular, to prevent missed alarms, all of these hard-to-distinguish cases must be reported, which increases the number of false alarms.
 In addition, depending on the positional relationship between the location of the bed and the placement of the sensor, the monitored person, the supervisor, and the bed may overlap as seen from the sensor while the supervisor attends to the monitored person on the bed. It can then be difficult to determine from the image whether the monitored person has been transferred from the bed to a wheelchair or is still lying in the bed, so there is also a risk of erroneously judging whether the monitored person is lying on the bed after the supervisor leaves.
 The wheelchair with a seating sensor disclosed in Patent Document 2 merely reports the current position of the wheelchair and the presence or absence of seating (paragraph [0012] of Patent Document 2, etc.). Therefore, when no seating is detected, the wheelchair with a seating sensor disclosed in Patent Document 2 cannot distinguish whether the person is not seated because they have moved from the wheelchair to another place or because they have fallen from the wheelchair, and consequently it cannot detect a fall. If the absence of seating detected by the seating sensor were simply judged to be a fall and reported, false alarms would occur frequently.
 Since the call system disclosed in Patent Document 3 determines that the monitored person has fallen using infrared sensors installed at different heights, any posture that is not a standing posture, for example squatting to pick up a dropped object, is judged to be a fall, so false alarms may occur frequently.
International Publication No. WO 2014/010203
JP 2000-279450 A
JP 2012-071065 A
 The present invention has been made in view of the circumstances described above, and an object thereof is to provide a monitored person monitoring assistance system and a monitored person monitoring assistance method capable of issuing alerts more appropriately.
 To achieve the object described above, a monitored person monitoring assistance system and a monitored person monitoring assistance method reflecting one aspect of the present invention include: a sensor device that is provided corresponding to a monitored person and detects a predetermined action related to the monitored person; a central processing unit that is communicably connected to the sensor device and manages a detection result received from the sensor device; and a terminal device that is communicably connected to the central processing unit and receives and displays the detection result via the central processing unit. Seating of the monitored person is detected, and the detection result is processed by a processing method that differs depending on this seating detection result.
 The advantages and features provided by one or more embodiments of the invention will be fully understood from the detailed description given below and the accompanying drawings. The detailed description and the accompanying drawings are given by way of example only and are not intended as a definition of the limits of the present invention.
FIG. 1 is a diagram showing the configuration of a monitored person monitoring assistance system according to an embodiment. FIG. 2 is a diagram showing the configuration of a sensor device in the monitored person monitoring assistance system. FIG. 3 is a diagram showing an example of a setting screen for setting the bedding location area stored in the sensor device. FIG. 4 is a diagram for explaining the method of detecting bed departure. FIG. 5 is a diagram for explaining the method of detecting a tumble or fall. FIG. 6 is a diagram showing the configuration of a management server device in the monitored person monitoring assistance system. FIG. 7 is a flowchart showing the operation of the sensor device. FIG. 8 is a flowchart showing the operation of the action detection process shown in FIG. 7. FIG. 9 is a flowchart showing the operation of the tumble/fall detection process shown in FIG. 8. FIG. 10 is a flowchart showing the operation of the bed departure detection process shown in FIG. 8. FIG. 11 is a flowchart showing the operation of the detection result process shown in FIG. 7 in a first embodiment. FIG. 12 is a sequence diagram showing the operation of the monitored person monitoring system in the first embodiment. FIG. 13 is a diagram showing an example of a monitoring information display screen displayed on a mobile terminal device in the monitored person monitoring assistance system in the first embodiment. FIG. 14 is a sequence diagram showing the operation of the monitored person monitoring system according to a first modification of the first embodiment. FIG. 15 is a sequence diagram showing the operation of the monitored person monitoring system according to a second modification of the first embodiment. FIG. 16 is a flowchart showing the operation of the detection result process shown in FIG. 7 in a second embodiment. FIG. 17 is a flowchart showing the operation of the mobile terminal device relating to display of the monitoring information display screen in the second embodiment. FIG. 18 is a diagram showing another example of the monitoring information display screen displayed on the mobile terminal device in the monitored person monitoring assistance system in the second embodiment. FIG. 19 is a diagram for explaining a modification of the second embodiment. FIG. 20 is a sequence diagram showing the operation of a modified monitored person monitoring system in a third embodiment.
 Hereinafter, one or more embodiments of the present invention will be described with reference to the drawings. However, the scope of the invention is not limited to the disclosed embodiments. In the drawings, components denoted by the same reference signs are the same components, and their description is omitted where appropriate. In this specification, components referred to collectively are denoted by reference signs without suffixes, and individual components are denoted by reference signs with suffixes.
 The monitored person monitoring assistance system in the present embodiment is a system that assists the monitoring of a monitored person (watched person) Ob who is the monitoring target (watching target) to be monitored (watched over). In the present embodiment, the monitored person monitoring assistance system includes: a sensor device that is provided corresponding to the monitored person Ob and detects a predetermined action related to the monitored person Ob; a central processing unit that is communicably connected to the sensor device and manages detection results received from the sensor device; and a terminal device that is communicably connected to the central processing unit and receives and displays the detection results via the central processing unit. In the present embodiment, the monitored person monitoring assistance system further includes a seating detection unit that detects seating of the monitored person, and a detection result processing unit that processes the detection result by a processing method that differs depending on the seating detection result of the seating detection unit. The terminal device may be a single type of device, but in the present embodiment the terminal device comprises two types of devices: a fixed terminal device and a mobile terminal device. The main difference between the fixed terminal device and the mobile terminal device is that the fixed terminal device is operated in a fixed location, while the mobile terminal device is carried and operated by a supervisor (service provider, user) such as a nurse or caregiver; otherwise, the fixed terminal device and the mobile terminal device are substantially the same. Such a monitored person monitoring assistance system will be described more specifically below.
 (First Embodiment)
 FIG. 1 is a diagram showing the configuration of the monitored person monitoring assistance system according to the embodiment. FIG. 2 is a diagram showing the configuration of a sensor device in the monitored person monitoring assistance system. FIG. 3 is a diagram showing an example of a setting screen for setting the bedding location area stored in the sensor device. FIG. 4 is a diagram for explaining the method of detecting bed departure: FIG. 4A shows the monitored person Ob sitting on one side edge of the bedding BD, and FIG. 4B shows a target image obtained by imaging the situation of FIG. 4A vertically downward from the ceiling. FIG. 5 is a diagram for explaining the method of detecting a tumble or fall: FIG. 5A shows the monitored person Ob in a standing posture, FIG. 5B shows a target image obtained by imaging the situation of FIG. 5A vertically downward from the ceiling, FIG. 5C shows the monitored person Ob in a sitting posture resulting from a tumble or fall, and FIG. 5D shows a target image obtained by imaging the situation of FIG. 5C vertically downward from the ceiling. FIG. 6 is a diagram showing the configuration of a management server device in the monitored person monitoring assistance system.
 The monitored person monitoring assistance system MSa in the embodiment is a system that assists the monitoring of monitored persons Ob. For example, as shown in FIG. 1, it includes one or more sensor devices SUa (SUa-1 to SUa-4), a management server device SVa, a fixed terminal device SPa, one or more mobile terminal devices TAa (TAa-1, TAa-2), and a private branch exchange (PBX) CX, which are communicably connected by wire or wirelessly via a network (communication line) NW such as a LAN (Local Area Network). The network NW may include relay devices that relay communication signals, such as repeaters, bridges, and routers. In the example shown in FIG. 1, the plurality of sensor devices SUa-1 to SUa-4, the management server device SVa, the fixed terminal device SPa, the plurality of mobile terminal devices TAa-1 and TAa-2, and the private branch exchange CX are communicably connected to one another by a mixed wired and wireless LAN NW (for example, a LAN conforming to the IEEE 802.11 standard) including an L2-switch concentrator (hub, HUB) LS and an access point AP. More specifically, the plurality of sensor devices SUa-1 to SUa-4, the management server device SVa, the fixed terminal device SPa, and the private branch exchange CX are connected to the concentrator LS, and the plurality of mobile terminal devices TAa-1 and TAa-2 are connected to the concentrator LS via the access point AP. The network NW constitutes a so-called intranet by using an Internet protocol suite such as TCP (Transmission Control Protocol) and IP (Internet Protocol).
 The private branch exchange (line switching device) CX is connected to the network NW. It controls extension calls between the mobile terminal devices TAa, such as call origination, call reception, and conversation, thereby providing extension telephony between the mobile terminal devices TAa. It is also connected to an external telephone TL, such as a fixed telephone or mobile telephone, via a public telephone network PN such as a fixed telephone network or mobile telephone network, and controls outside-line calls between the external telephone TL and the mobile terminal devices TAa, such as call origination, call reception, and conversation, thereby providing outside-line telephony between the external telephone TL and the mobile terminal devices TAa. The private branch exchange CX is, for example, a digital exchange or an IP-PBX (Internet Protocol Private Branch eXchange).
 The monitored person monitoring assistance system MSa is installed in a place appropriate to the monitored persons Ob. A monitored person (watched person) Ob is, for example, a person who needs nursing due to illness or injury, a person who needs care due to a decline in physical ability or the like, or a person living alone. In particular, from the viewpoint of enabling early detection and early response, the monitored person Ob is preferably a person for whom detection is needed when a predetermined adverse event, such as an abnormal condition, occurs. For this reason, the monitored person monitoring assistance system MSa is suitably installed in a building such as a hospital, a welfare facility for the elderly, or a dwelling, depending on the type of monitored person Ob. In the example shown in FIG. 1, the monitored person monitoring assistance system MSa is installed in the building of a care facility that includes a plurality of rooms RM in which a plurality of monitored persons Ob reside and a plurality of rooms such as a nurse station.
 The sensor device SUa has a communication function for communicating with the other devices SVa, SPa, TAa via the network NW, detects a predetermined action of the monitored person Ob, and transmits the detection result together with an image to the management server device SVa. In the present embodiment, the sensor device SUa also detects the seating of the monitored person Ob, and when transmitting the detection result to the management server device SVa, processes the detection result by a processing method that differs depending on the seating detection result. As shown in FIG. 2, for example, such a sensor device SUa includes an imaging unit 11, a seating detection unit 12, a sensor-side control processing unit (SU control processing unit) 13a, a sensor-side communication interface unit (SU communication IF unit) 14, and a sensor-side storage unit (SU storage unit) 15.
 The imaging unit 11 is connected to the SU control processing unit 13a and, under the control of the SU control processing unit 13a, generates an image (image data) capturing the form of the monitored person Ob in order to detect a predetermined action of the monitored person Ob. The imaging unit 11 is arranged so as to be able to monitor the space where the monitored person Ob to be monitored is expected to be located (the location space; in the example shown in FIG. 1, the room RM where it is installed), images the location space from above as the imaging target, generates a bird's-eye-view image (image data) of the imaging target, and outputs the image of the imaging target (target image) to the SU control processing unit 13a. Preferably, since this increases the probability of being able to image the entire monitored person Ob, the imaging unit 11 is arranged so that it can image the imaging target from directly above a preset planned head position (normally, the position where a pillow is placed) where the head of the monitored person Ob is expected to be located in the bedding (for example, a bed) on which the monitored person Ob lies. With this imaging unit 11, the sensor device SUa acquires an image of the monitored person Ob captured from above the monitored person Ob, preferably an image captured from directly above the planned head position.
 Such an imaging unit 11 may be a device that generates a visible-light image, but in the present embodiment it is a device that generates an infrared image so that the monitored person Ob can be monitored even in relative darkness. In the present embodiment, such an imaging unit 11 is, for example, a digital infrared camera including: an imaging optical system that forms an infrared optical image of the imaging target on a predetermined imaging plane; an area image sensor whose light receiving surface is arranged to coincide with the imaging plane and which converts the infrared optical image of the imaging target into an electrical signal; and an image processing unit that generates image data representing an infrared image of the imaging target by performing image processing on the output of the area image sensor. In the present embodiment, the imaging optical system of the imaging unit 11 is preferably a wide-angle optical system (a so-called wide-angle lens, including a fisheye lens) with an angle of view capable of imaging the entire room RM in which it is installed. Alternatively, the imaging unit 11 may be a thermography camera that generates a heat distribution image of the imaging target.
 The seating detection unit 12 is connected to the SU control processing unit 13a, is installed in equipment on which the monitored person Ob is expected (assumed) to sit, such as a chair, a wheelchair, or toilet equipment, and detects the seating of the monitored person Ob. The seating detection unit 12 outputs its seating detection result to the SU control processing unit 13a. For example, the seating detection unit 12 may determine the presence or absence of seating of the monitored person Ob at predetermined time intervals (sampling intervals) and, for each determination, output "seated" or "not seated" as the seating detection result in accordance with the determination. Alternatively, for example, the seating detection unit 12 may determine the presence or absence of seating of the monitored person Ob at predetermined time intervals (sampling intervals) and output "seated" as the seating detection result only when seating of the monitored person Ob is detected. In this case, the SU control processing unit 13a can recognize the absence of seating from the fact that "seated" is not input. The seating detection unit 12 may be connected to the SU control processing unit 13a by wire, or may be connected to the SU control processing unit 13a by short-range wireless communication such as the Bluetooth (registered trademark) standard.
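As a rough illustration only of the two reporting modes described above, the following Python sketch shows a periodic polling loop for a generic seating sensor. The callables read_sensor and report, the enum names, and the sampling interval are hypothetical placeholders introduced for this sketch and are not part of the embodiment.

```python
import time
from enum import Enum

class SeatingResult(Enum):
    SEATED = "seated"
    NOT_SEATED = "not_seated"

def poll_seating(read_sensor, report, sampling_interval_s=1.0, report_only_seated=False):
    """Poll a seating sensor at a fixed sampling interval and report the result.

    read_sensor(): hypothetical callable returning True when the sensor detects seating.
    report(result): hypothetical callable passing the result to the control processing side.
    report_only_seated: when True, only "seated" results are reported; the receiver infers
    the absence of seating from the lack of reports (the second mode described above).
    """
    while True:
        seated = read_sensor()
        if seated:
            report(SeatingResult.SEATED)
        elif not report_only_seated:
            report(SeatingResult.NOT_SEATED)
        time.sleep(sampling_interval_s)
```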
 Such a seating detection unit 12 is configured to include a contact-type seating sensor such as a pressure sensor or a vibration sensor. The pressure sensor is arranged, for example, on the seat surface of a chair, wheelchair, toilet equipment, or the like so as to detect the pressure increase caused by the monitored person Ob sitting down. The vibration sensor is arranged, for example, on the seat surface of a chair, wheelchair, toilet equipment, or the like so as to detect vibration caused by the monitored person Ob sitting down.
 Alternatively, for example, the seating detection unit 12 is configured to include a non-contact seating sensor such as a heat sensor. The heat sensor is arranged so as to detect the heat (infrared radiation) emitted by the monitored person Ob within the placement area of, for example, a chair or toilet equipment.
 The SU communication IF unit 14 is a communication circuit that is connected to the SU control processing unit 13a and performs communication under the control of the SU control processing unit 13a. The SU communication IF unit 14 generates a communication signal containing the data to be transferred that is input from the SU control processing unit 13a, in accordance with the communication protocol used in the network NW of the monitored person monitoring assistance system MSa, and transmits the generated communication signal to the other devices SVa, SPa, TAa via the network NW. The SU communication IF unit 14 also receives communication signals from the other devices SVa, SPa, TAa via the network NW, extracts the data from the received communication signals, converts the extracted data into a format that the SU control processing unit 13a can process, and outputs it to the SU control processing unit 13a. The SU communication IF unit 14 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
 The SU storage unit 15 is a circuit that is connected to the SU control processing unit 13a and stores various predetermined programs and various predetermined data under the control of the SU control processing unit 13a. The various predetermined programs include control processing programs such as an SU control program that controls each unit of the sensor device SUa in accordance with the function of that unit, and an SU monitoring processing program that executes predetermined information processing related to the monitoring of the monitored person Ob. The SU monitoring processing program includes an action detection processing program that detects a predetermined action of the monitored person Ob, and a detection result processing program that processes the detection result of the action detection processing program by a processing method that differs depending on the seating detection result of the seating detection unit 12. The various predetermined data include data needed to execute each program, such as a sensor identifier (sensor ID), which is an identifier for specifying and identifying the sensor device SUa itself, and the communication address of the management server device SVa. The SU storage unit 15 includes, for example, a ROM (Read Only Memory), which is a non-volatile storage element, and an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable non-volatile storage element. The SU storage unit 15 also includes a RAM (Random Access Memory) or the like serving as the working memory of the SU control processing unit 13a, which stores data generated during execution of the predetermined programs.
 The SU control processing unit 13a is a circuit for controlling each of the units 11, 12, 14, 15 of the sensor device SUa in accordance with the function of that unit, detecting a preset predetermined action related to the monitored person Ob, and transmitting the detection result together with an image to the management server device SVa. In the present embodiment, the SU control processing unit 13a detects the seating of the monitored person Ob, and when transmitting the detection result to the management server device SVa, processes the detection result by a processing method that differs depending on the seating detection result of the seating detection unit 12. The SU control processing unit 13a includes, for example, a CPU (Central Processing Unit) and its peripheral circuits. By executing the control processing programs, the SU control processing unit 13a functionally includes a sensor-side control unit (SU control unit) 131, an action detection processing unit 132, and a detection result processing unit 133a.
 The SU control unit 131 controls each of the units 11, 12, 14, 15 of the sensor device SUa in accordance with the function of that unit, and governs the overall control of the sensor device SUa.
 The action detection processing unit 132 detects a preset predetermined action of the monitored person Ob. More specifically, in the present embodiment, the predetermined action consists of three actions: bed departure, in which the monitored person Ob leaves the bedding; a fall, in which the monitored person Ob falls from the bedding; and a tumble, in which the monitored person Ob falls over. The action detection processing unit 132 detects bed departure, a tumble, and a fall of the monitored person Ob based on, for example, the target image captured by the imaging unit 11.
 More specifically, first, the location area of the bedding BD, a first threshold Th1 for determining the presence or absence of bed departure, and a second threshold Th2 for determining the presence or absence of a fall or tumble are stored in advance in the SU storage unit 15 as part of the various predetermined data. For the location area of the bedding BD, the setting screen shown in FIG. 3 is displayed on a terminal device SPa, TAa, for example on the fixed terminal device SPa; the four vertex positions (coordinates on the target image) of the location area of the bedding BD are input on the displayed setting screen and are stored in the sensor device SUa from the fixed terminal device SPa via the management server device SVa. On the setting screen, the target image generated by the imaging unit 11 is displayed on the fixed terminal device SPa via the management server device SVa. If the seating detection unit 12 is a non-contact seating sensor, the four vertex positions (coordinates on the target image) of the placement area of, for example, a chair or toilet equipment are additionally input on the setting screen and stored in the sensor device SUa from the fixed terminal device SPa via the management server device SVa. In the present embodiment, the presence or absence of bed departure is determined based on the current state of the monitored person Ob and on the size of the area of the person region HA extracted from the target image that protrudes beyond the location area of the bedding BD (the protruding area HAout), as shown in FIG. 4B. The first threshold Th1 is therefore a value for distinguishing the size of the protruding area before bed departure from the size of the protruding area after bed departure, and is set appropriately in advance from a plurality of samples. The presence or absence of a fall or tumble is determined based on the change over time of the position and size of the head region HD of the monitored person Ob, as shown in FIGS. 5B and 5D. The second threshold Th2 is therefore a value for distinguishing whether the size of the head region HD corresponds to a standing posture or to a sitting posture (or lying posture), and is set appropriately in advance from a plurality of samples.
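For illustration only, the configuration data described above (the four vertices of the bedding location area, the thresholds Th1 and Th2, and, for a non-contact seating sensor, the placement areas of a chair or toilet equipment) might be held as a structure like the following minimal sketch. The field names and the example values are assumptions introduced here, not values taken from the embodiment.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[int, int]  # (x, y) coordinates on the target image

@dataclass
class DetectionConfig:
    # Four vertices of the bedding (BD) location area, entered on the setting screen (FIG. 3).
    bedding_area: List[Point]
    # First threshold Th1: ratio of the protruding area HAout to the person region HA for bed departure.
    th1_bed_departure_ratio: float
    # Second threshold Th2: maximum head-region size regarded as a sitting/lying posture.
    th2_head_size: float
    # Placement areas (e.g. chair, toilet equipment), used when the seating sensor is non-contact.
    seating_areas: List[List[Point]] = field(default_factory=list)

# Hypothetical example values; in the embodiment the thresholds are set in advance from multiple samples.
config = DetectionConfig(
    bedding_area=[(120, 80), (420, 80), (420, 380), (120, 380)],
    th1_bed_departure_ratio=0.6,
    th2_head_size=900.0,
)
```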
 The action detection processing unit 132 then extracts a moving-object region from the target image as the region of the person of the monitored person Ob (person region) HA, for example by a background subtraction method or a frame subtraction method. From the extracted person region (moving-object region) HA, the action detection processing unit 132 extracts the head region HD of the monitored person Ob, for example by a circular or elliptical Hough transform, by pattern matching using a head model prepared in advance, or by a neural network trained for head detection. For bed departure detection, the action detection processing unit 132 determines bed departure and detects the bed departure when the current state of the monitored person Ob is before bed departure and the ratio of the protruding area HAout to the person region HA exceeds the first threshold Th1 ((HAout / HA) > Th1). For fall detection, the action detection processing unit 132 determines a fall and detects the fall when the position of the extracted head region HD (for example, the center position of the head region) has changed over time from inside the location area of the bedding BD to outside the location area of the bedding BD, and the size of the extracted head region HD is equal to or smaller than the second threshold Th2. For tumble detection, the action detection processing unit 132 determines a tumble and detects the tumble when the position of the extracted head region HD is within the room RM excluding the location area of the bedding BD, and the size of the extracted head region HD is equal to or smaller than the second threshold Th2.
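As a minimal sketch of the decision rules just described (not the actual implementation of the action detection processing unit 132), the following Python code assumes that the person region HA, the protruding area HAout, and the head region HD have already been extracted from the target image. The helper types and the point-in-polygon test are simplified assumptions introduced for this sketch.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(p: Point, polygon: List[Point]) -> bool:
    """Ray-casting test for whether point p lies inside the polygon (e.g. the bedding area BD)."""
    x, y = p
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = (x2 - x1) * (y - y1) / (y2 - y1) + x1
            if x < x_cross:
                inside = not inside
    return inside

def detect_bed_departure(ha_area: float, ha_out_area: float, before_departure: bool, th1: float) -> bool:
    """Bed departure: state is 'before bed departure' and (HAout / HA) > Th1."""
    return before_departure and ha_area > 0 and (ha_out_area / ha_area) > th1

def detect_fall(prev_head_center: Point, head_center: Point, head_size: float,
                bedding_area: List[Point], th2: float) -> bool:
    """Fall: the head center moved from inside to outside the bedding area and the head size is <= Th2."""
    return (point_in_polygon(prev_head_center, bedding_area)
            and not point_in_polygon(head_center, bedding_area)
            and head_size <= th2)

def detect_tumble(head_center: Point, head_size: float,
                  bedding_area: List[Point], th2: float) -> bool:
    """Tumble: the head center is in the room but outside the bedding area and the head size is <= Th2."""
    return (not point_in_polygon(head_center, bedding_area)) and head_size <= th2
```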
 The detection result processing unit 133a processes the detection result of the action detection processing unit 132 by a processing method that differs depending on the seating detection result of the seating detection unit 12. More specifically, in the present embodiment, the detection result processing unit 133a processes the detection result by a first processing method, in which the detection result is displayed on the terminal devices SPa, TAa, when the seating detection result does not indicate seating of the monitored person Ob, and processes the detection result by a second processing method, in which the detection result is not displayed on the terminal devices SPa, TAa, when the seating detection result indicates seating of the monitored person Ob. More specifically, the second processing method is a processing method in which the detection result is not transmitted to the management server device SVa. That is, in the first processing method, as described above, when the action detection processing unit 132 detects the predetermined action and the seating detection result does not indicate seating of the monitored person Ob, the detection result processing unit 133a transmits, via the SU communication IF unit 14 to the management server device SVa, a communication signal for reporting (notifying) the detection (a first action detection notification communication signal) containing information indicating the content of the detected predetermined action of the monitored person Ob. The first action detection notification communication signal contains, for example, the sensor ID of the sensor device itself, information indicating the content of the detected predetermined action of the monitored person Ob (in the present embodiment, one of bed departure, fall, and tumble), and the target image used for the detection of the predetermined action. In the second processing method, as described above, even if the action detection processing unit 132 detects the predetermined action, the detection result processing unit 133a does not transmit the first action detection notification communication signal when the seating detection result indicates seating of the monitored person Ob. The transmission processing of the first action detection notification communication signal may instead be executed by the action detection processing unit 132; in that case, the transmission processing of the first action detection notification communication signal by the action detection processing unit 132 is permitted when the seating detection result does not indicate seating of the monitored person Ob, and the transmission processing of the first action detection notification communication signal by the action detection processing unit 132 is prohibited when the seating detection result indicates seating of the monitored person Ob.
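A minimal sketch of the gating behavior described above, assuming a dictionary-based message and an injected send function; this is not the system's actual communication protocol, and the field names are placeholders for illustration.

```python
from typing import Callable, Optional

def process_detection_result(action: Optional[str], seated: bool, sensor_id: str,
                             target_image: bytes, send_to_server: Callable[[dict], None]) -> None:
    """First processing method: when not seated, send a first action detection notification
    (sensor ID, action content, target image) toward the management server so the result can
    be displayed on the terminal devices. Second processing method: when seated, suppress
    transmission, so the detection result is never displayed on the terminal devices."""
    if action is None:
        return  # no predetermined action (bed departure / fall / tumble) was detected
    if seated:
        return  # second processing method: seated, so do not transmit
    # first processing method: not seated, so notify the management server
    send_to_server({
        "sensor_id": sensor_id,
        "action": action,          # one of "bed_departure", "fall", "tumble"
        "image": target_image,
    })
```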
 As an example, FIG. 1 shows first to fourth sensor devices SUa-1 to SUa-4: the first sensor device SUa-1 is installed in the room RM-1 (not shown) of Mr. A (Ob-1), one of the monitored persons Ob; the second sensor device SUa-2 is installed in the room RM-2 (not shown) of Mr. B (Ob-2), one of the monitored persons Ob; the third sensor device SUa-3 is installed in the room RM-3 (not shown) of Mr. C (Ob-3), one of the monitored persons Ob; and the fourth sensor device SUa-4 is installed in the room RM-4 (not shown) of Mr. D (Ob-4), one of the monitored persons Ob.
 The management server device SVa has a communication function for communicating with the other devices SUa, TAa, SPa via the network NW. Upon receiving notification of the predetermined action from a sensor device SUa, it manages information related to the monitoring of the monitored person Ob (monitoring information; in the present embodiment, for example, the type of predetermined action detected by the sensor device SUa, the image of the monitored person Ob, and the time at which the notification was received), reports (re-reports, re-notifies, transmits) the predetermined action to the predetermined terminal devices SPa, TAa, provides data requested by a client (in the present embodiment, the terminal devices SPa, TAa, etc.) to that client, and manages the monitored person monitoring system MSa as a whole. As shown in FIG. 6, for example, such a management server device SVa includes a server-side communication interface unit (SV communication IF unit) 21, a server-side control processing unit (SV control processing unit) 22, and a server-side storage unit (SV storage unit) 23.
 Like the SU communication IF unit 14, the SV communication IF unit 21 is a communication circuit that is connected to the SV control processing unit 22 and performs communication under the control of the SV control processing unit 22. The SV communication IF unit 21 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
 The SV storage unit 23 is a circuit that is connected to the SV control processing unit 22 and stores various predetermined programs and various predetermined data under the control of the SV control processing unit 22. The various predetermined programs include control processing programs such as an SV control program that controls each unit of the management server device SVa in accordance with the function of that unit, and an SV monitoring processing program that executes predetermined information processing related to the monitoring of the monitored person Ob. The various predetermined data include data needed to execute each program, such as a server identifier (server ID), which is an identifier for specifying and identifying the management server device SVa itself, and the monitoring information of the monitored persons Ob. The SV storage unit 23 includes, for example, a ROM, an EEPROM, a RAM, and their peripheral circuits.
 The SV control processing unit 22 is a circuit for controlling each unit of the management server device SVa in accordance with the function of that unit; upon receiving notification of the predetermined action from a sensor device SUa, it manages the monitoring information related to the monitoring of the monitored person Ob, reports (re-reports, re-notifies, transmits) the predetermined action to the predetermined terminal devices SPa, TAa, provides data requested by a client to that client, and manages the monitored person monitoring system MSa as a whole. The SV control processing unit 22 includes, for example, a CPU and its peripheral circuits. By executing the control processing programs, the SV control processing unit 22 functionally includes a server-side control unit (SV control unit) 221 and a server-side monitoring processing unit (SV monitoring processing unit) 222.
 The SV control unit 221 controls each of the units 21, 23 of the management server device SVa in accordance with the function of that unit, and governs the overall control of the management server device SVa.
 Upon receiving notification of the predetermined action from a sensor device SUa, the SV monitoring processing unit 222 manages the monitoring information related to the monitoring of the monitored person Ob and notifies the predetermined terminal devices SPa, TAa of the predetermined action.
 More specifically, upon receiving the first action detection notification communication signal from a sensor device SUa, the SV monitoring processing unit 222 stores (records) in the SV storage unit 23 the monitoring information related to the monitoring of the monitored person Ob contained in the received first action detection notification communication signal. Then, in order to report (re-report, re-notify, transmit) the detection result notified by the first action detection notification communication signal, the SV monitoring processing unit 222 transmits, via the SV communication IF unit 21 to the predetermined terminal devices SPa, TAa, a communication signal (second action detection notification communication signal) containing the information indicating the content of the predetermined action of the monitored person Ob contained in the received first action detection notification communication signal. The second action detection notification communication signal contains, for example, the sensor ID, the information indicating the content of the predetermined action of the monitored person Ob (in the present embodiment, one of bed departure, fall, and tumble), and the target image contained in the received first action detection notification communication signal. The second action detection notification communication signal may be transmitted to the terminal devices SPa, TAa by broadcast, for example. Alternatively, for example, the second action detection notification communication signal may be transmitted only to the terminal devices SPa, TAa associated with the sensor device SUa that transmitted the first action detection notification communication signal. In this case, the SV storage unit 23 stores in advance the correspondence between sensor devices SUa (sensor IDs) and terminal devices SPa, TAa (terminal IDs), associated one-to-one or one-to-many. The terminal ID (terminal identifier) is an identifier for specifying and identifying a terminal device SPa, TAa.
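The relay behavior of the SV monitoring processing unit 222 might be sketched roughly as follows; the storage callable, the sensor-to-terminal mapping values, and the send function are hypothetical stand-ins for this sketch, not the actual server implementation.

```python
import time
from typing import Dict, List, Callable

# Hypothetical one-to-many mapping of sensor IDs to terminal IDs, held in advance
# in the SV storage unit (used only when notifications are not broadcast).
SENSOR_TO_TERMINALS: Dict[str, List[str]] = {
    "SUa-1": ["TAa-1", "SPa"],
    "SUa-2": ["TAa-2", "SPa"],
}

def handle_first_notification(notification: dict, store: Callable[[dict], None],
                              send_to_terminal: Callable[[str, dict], None],
                              broadcast: bool = False) -> None:
    """Store the monitoring information and re-notify the terminal devices."""
    monitoring_info = {
        "sensor_id": notification["sensor_id"],
        "action": notification["action"],      # bed departure / fall / tumble
        "image": notification["image"],
        "received_at": time.time(),            # time the notification was received
    }
    store(monitoring_info)                     # record in the SV storage unit

    second_notification = {k: notification[k] for k in ("sensor_id", "action", "image")}
    if broadcast:
        send_to_terminal("*", second_notification)  # broadcast to all terminal devices
    else:
        for terminal_id in SENSOR_TO_TERMINALS.get(notification["sensor_id"], []):
            send_to_terminal(terminal_id, second_notification)
```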
 As indicated by the broken lines in FIG. 6, the management server device SVa may further include, as necessary: a server-side input unit (SV input unit) 24 that is connected to the SV control processing unit 22 and is used to input, for example, various commands and various data; a server-side output unit (SV output unit) 25 that outputs the various commands and various data input via the SV input unit 24, the monitoring information related to the monitoring of the monitored persons Ob, and the like; and a server-side interface unit (SVIF unit) 26 that inputs and outputs data to and from external devices.
 Such a management server device SVa can be configured, for example, by a computer with a communication function.
 This management server device SVa corresponds to an example of the central processing unit.
 The fixed terminal device SPa is a device that has a communication function for communicating with the other devices SUa, SVa, TAa via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, and so on, and that functions as a user interface (UI) of the monitored person monitoring assistance system MSa by inputting predetermined instructions and data to be given to the sensor devices SUa, the management server device SVa, and the mobile terminal devices TAa, displaying the monitoring information obtained by the sensor devices SUa, and the like. Such a fixed terminal device SPa can be configured, for example, by a computer with a communication function.
 The mobile terminal device TAa is a device that has a communication function for communicating with the other devices SVa, SPa, SUa via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, a call function for voice calls, and so on, and that is used to input predetermined instructions and data to be given to the management server device SVa and the sensor devices SUa, to display the monitoring information obtained by the sensor devices SUa in response to notifications from the management server device SVa, and the like. Such a mobile terminal device TAa can be configured by a portable communication terminal device such as a so-called tablet computer, smartphone, or mobile phone.
 Next, the operation of the present embodiment will be described. FIG. 7 is a flowchart showing the operation of the sensor device. FIG. 8 is a flowchart showing the operation of the action detection process shown in FIG. 7. FIG. 9 is a flowchart showing the operation of the tumble/fall detection process shown in FIG. 8. FIG. 10 is a flowchart showing the operation of the bed departure detection process shown in FIG. 8. FIG. 11 is a flowchart showing the operation of the detection result process shown in FIG. 7 in the first embodiment. FIG. 12 is a sequence diagram showing the operation of the monitored person monitoring system in the first embodiment. FIG. 13 is a diagram showing an example of the monitoring information display screen displayed on the mobile terminal device in the monitored person monitoring assistance system in the first embodiment.
 In the monitored person monitoring assistance system MSa configured as described above, each of the devices SUa, SVa, SPa, TAa initializes its necessary units and starts operating when the power is turned on. In the sensor device SUa, execution of its control processing program functionally configures the SU control unit 131, the action detection processing unit 132, and the detection result processing unit 133a in the SU control processing unit 13a. In the management server device SVa, execution of its control processing program functionally configures the SV control unit 221 and the SV monitoring processing unit 222 in the SV control processing unit 22. The sensor device SUa then detects the predetermined action of the monitored person Ob by operating as follows for each frame, or every several frames.
 In FIG. 7, the sensor device SUa, by means of the SU control unit 131 of the SU control processing unit 13a, acquires one frame of image (image data) from the imaging unit 11 as the target image (S1).
 Next, the sensor device SUa, by means of the action detection processing unit 132 of the SU control processing unit 13a, extracts the person region HA in order to detect the predetermined action of the monitored person Ob based on the target image. In the present embodiment, the person region HA is sufficient for detecting bed departure, but the head region HD is used for detecting a fall or tumble, so the sensor device SUa, by means of the action detection processing unit 132, further extracts the head region HD from the extracted person region HA (S2).
Next, the sensor device SUa executes, through the behavior detection processing unit 132, the behavior detection processing that detects a predetermined behavior of the monitored person Ob on the basis of the person area HA and the head region HD extracted in the processing S2 (S3).
More specifically, in this behavior detection processing S3, as shown in FIG. 8, the behavior detection processing unit 132 executes the fall detection processing S31, which detects a fall from the bed or a fall-down, and then executes the bed-departure detection processing S32, which detects bed departure.
More specifically, in the fall detection processing S31, as shown in FIG. 9, the behavior detection processing unit 132 first determines whether the conditions for a fall from the bed or a fall-down are satisfied (S311). More precisely, when the position of the head region HD extracted in the processing S2 has changed over time from inside the area of the bedding BD to outside the area of the bedding BD, and the size of the head region HD extracted in the processing S2 is equal to or smaller than the second threshold Th2, the behavior detection processing unit 132 determines that a fall from the bed has occurred (Yes), detects the fall from the bed (S312), and ends the processing S31. When the position of the head region HD extracted in the processing S2 is within the room RM excluding the area of the bedding BD, and the size of the head region HD extracted in the processing S2 is equal to or smaller than the second threshold Th2, the behavior detection processing unit 132 determines that a fall-down has occurred (Yes), detects the fall-down (S312), and ends the processing S31. In the other cases, the behavior detection processing unit 132 determines that neither a fall from the bed nor a fall-down has occurred (No) and ends the processing S31.
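For reference, the condition logic of S311 and S312 described above can be outlined as in the following minimal sketch. This is not the patented implementation: the Rect helper, the coordinate representation of the regions, and the value chosen for the threshold Th2 are assumptions introduced purely for illustration.

```python
# A minimal sketch of the S311/S312 decision, under assumed data structures.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

TH2 = 1500.0  # assumed second threshold Th2 for the head-region size (pixels^2)

def detect_fall(prev_head_center, head_center, head_area, bedding: Rect, room: Rect):
    """Return 'fall_from_bed', 'fall_down', or None for one frame pair."""
    was_on_bedding = bedding.contains(*prev_head_center)
    on_bedding = bedding.contains(*head_center)
    small_head = head_area <= TH2  # a small head region suggests the head is low (near the floor)

    # Fall from the bed: head moved from inside to outside the bedding area and is low.
    if was_on_bedding and not on_bedding and small_head:
        return "fall_from_bed"
    # Fall-down: head is inside the room but outside the bedding area and is low.
    if room.contains(*head_center) and not on_bedding and small_head:
        return "fall_down"
    return None

# Example: the head drops beside the bed between two frames.
bed = Rect(100, 100, 200, 300)
room = Rect(0, 0, 640, 480)
print(detect_fall((150, 200), (330, 220), 1200.0, bed, room))  # -> 'fall_from_bed'
```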
More specifically, in the bed-departure detection processing S32, as shown in FIG. 10, the behavior detection processing unit 132 first determines whether the state of the monitored person Ob is the pre-departure state (S321). The state of the monitored person Ob is held in the SV storage unit 23 as a state variable or the like. When the result of this determination is that the state of the monitored person Ob is the pre-departure state (Yes), the behavior detection processing unit 132 sequentially executes the processings S322, S323, and S324 and then ends the processing S32. On the other hand, when the state of the monitored person Ob is not the pre-departure state (No), the behavior detection processing unit 132 ends the processing S32.
In the processing S322, the behavior detection processing unit 132 determines whether the bed-departure condition is satisfied. More precisely, when the ratio of the protruding area HAout to the person area HA exceeds the first threshold Th1 ((HAout / HA) > Th1), the behavior detection processing unit 132 determines that bed departure has occurred (Yes), detects the bed departure (S323), updates (changes) the state of the monitored person Ob to the post-departure state (S324, state variable ← "after bed departure"), and ends the processing S32. Otherwise, the behavior detection processing unit 132 determines that bed departure has not occurred (No) and ends the processing S32. Note that when the monitored person Ob is detected rising from the lying state, or when the monitored person Ob is detected entering the bed, the state of the monitored person Ob is updated (changed) to the pre-departure state (state variable ← "before bed departure").
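The check of S321 to S324 can be summarized as in the following minimal sketch, assuming a simple string-valued state variable and an illustrative value for the threshold Th1; the actual system keeps the state in storage and derives the areas from the image.

```python
# A minimal sketch of the bed-departure check S321-S324, under assumed names.
TH1 = 0.8  # assumed first threshold Th1 for the protruding-area ratio

def detect_bed_departure(state: str, area_ha: float, area_ha_out: float):
    """Return (detected, new_state) for one frame."""
    if state != "before_departure":                     # S321: only evaluate before bed departure
        return False, state
    if area_ha > 0 and (area_ha_out / area_ha) > TH1:   # S322: HAout / HA > Th1
        return True, "after_departure"                  # S323/S324: detect and update the state
    return False, state

# The state returns to "before_departure" when rising or entering the bed is detected.
print(detect_bed_departure("before_departure", 5000.0, 4500.0))  # -> (True, 'after_departure')
```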
When the behavior detection processing S3 has been executed in this way, the sensor device SUa executes, through the detection result processing unit 133a of the SU control processing unit 13a, the detection result processing that processes the detection result of the behavior detection processing S3 by different processing methods depending on the seating detection result of the seating detection unit 12 (S4), and then ends the current iteration of this processing.
More specifically, in the detection result processing, as shown in FIG. 11, the detection result processing unit 133a first acquires the seating detection result of the seating detection unit 12 (S41).
Next, the detection result processing unit 133a determines whether the monitored person Ob is seated (S42). When the result of this determination is that the seating detection result acquired in the processing S41 indicates that the monitored person Ob is seated (Yes), the detection result processing unit 133a executes the processing S43 and ends the processing S4. On the other hand, when the seating detection result acquired in the processing S41 indicates that the monitored person Ob is not seated (No), the detection result processing unit 133a sequentially executes the processings S44 and S45 and ends the processing S4.
In the processing S43, in order to process the detection result by the second processing method, which does not display the detection result on the terminal devices SPa and TAa, the detection result processing unit 133a cancels and deletes the detection result detected in the processing S3. Accordingly, the detection result processing unit 133a does not transmit the detection result detected in the processing S3 to the management server device SVa. If the seating detection unit 12 has detected the seating of the monitored person Ob at the time the sensor device SUa detects the predetermined behavior, the monitored person Ob is seated, so the detection result of the predetermined behavior, such as bed departure, a fall from the bed, or a fall-down, is highly likely to be a false detection, and the detection result can be corrected by the seating detection result of the seating detection unit 12. In particular, when the predetermined behavior is detected on the basis of an image, the seating detection result is considered more reliable than the image-based detection result, so the detection result can be corrected more appropriately by the seating detection result.
In the processing S44, the detection result processing unit 133a confirms the detection result detected in the processing S3 (determines it as the final detection result). Then, in the processing S45, in order to process the detection result by the first processing method, which displays the detection result on the terminal devices SPa and TAa, the detection result processing unit 133a transmits the detection result confirmed in the processing S44 to the management server device SVa. That is, in this embodiment, the detection result processing unit 133a transmits the first behavior detection notification communication signal to the management server device SVa through the SU communication IF unit 14.
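The branching of S41 to S45 in the first embodiment can be outlined as in the following minimal sketch. The function send_first_notification() merely stands in for transmission of the first behavior detection notification communication signal, and its name and parameters are assumptions for illustration.

```python
# A minimal sketch of the first embodiment's result processing S41-S45.
def send_first_notification(sensor_id: str, behavior: str, image: bytes) -> None:
    # Stand-in for transmitting the first behavior detection notification signal.
    print(f"notify server: sensor={sensor_id} behavior={behavior} image={len(image)} bytes")

def process_detection_result(behavior, seated: bool, sensor_id="SU-1", image=b""):
    if behavior is None:
        return None
    if seated:                       # S42 Yes -> S43: cancel and delete, do not notify
        return None
    send_first_notification(sensor_id, behavior, image)   # S44/S45: confirm and transmit
    return behavior

process_detection_result("bed_departure", seated=True)    # suppressed as a likely false detection
process_detection_result("bed_departure", seated=False)   # forwarded to the management server
```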
With the sensor device SUa operating as described above, in the monitored person monitoring support system MSa, as shown in FIG. 12 for example, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects that the monitored person is not seated (C1), the sensor device SUa transmits the first behavior detection notification communication signal to the management server device SVa (C2). Upon receiving this first behavior detection notification communication signal, the management server device SVa stores the monitoring information contained in the received signal (C3) and transmits a second behavior detection notification communication signal based on the received first signal to the predetermined terminal devices SPa and TAa (C4). Upon receiving this second behavior detection notification communication signal, the terminal devices SPa and TAa display the monitoring information contained in the received signal on the monitoring information display screen (C5). The monitoring information display screen is a screen for displaying the monitoring information, and an example is shown in FIG. 13. In the example shown in FIG. 13, the monitoring information display screen 51 is displayed on the mobile terminal device TAa. The monitoring information display screen 51 displays the room name of the room RM in which the predetermined behavior was detected, the name of the detected predetermined behavior, and the target image captured when the predetermined behavior was detected. In order to display the room name of the room RM on the monitoring information display screen 51, the mobile terminal device TAa stores in advance the correspondence between each sensor ID and the room name of the room RM in which the sensor device SUa having that sensor ID is installed. From this correspondence, the mobile terminal device TAa retrieves the room name of the room RM corresponding to the sensor ID contained in the second behavior detection notification communication signal and displays the retrieved room name on the monitoring information display screen 51. The mobile terminal device TAa also displays, on the monitoring information display screen 51, the name of the predetermined behavior and the target image contained in the second behavior detection notification communication signal. By referring to the monitoring information display screen 51, the monitoring person (user) handling the mobile terminal device TAa can thus recognize the report of the predetermined behavior detected for the monitored person Ob in the room RM with that room name, and can grasp the situation from the target image.
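The terminal-side step of C5, in which the sensor ID carried in the second notification is mapped to a pre-stored room name before display, can be sketched as follows. The mapping table, dictionary keys, and printed layout are assumptions invented for illustration only.

```python
# A minimal sketch of the terminal-side lookup and display in C5.
SENSOR_TO_ROOM = {"SU-1": "Room 101", "SU-2": "Room 102"}  # assumed pre-stored correspondence

def show_monitoring_screen(notification: dict) -> str:
    room = SENSOR_TO_ROOM.get(notification["sensor_id"], "unknown room")
    line = f"{room}: {notification['behavior']} (image attached: {notification['has_image']})"
    print(line)   # stands in for rendering the monitoring information display screen 51
    return line

show_monitoring_screen({"sensor_id": "SU-1", "behavior": "fall_down", "has_image": True})
```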
On the other hand, in the monitored person monitoring support system MSa, when the sensor device SUa operates as described above and the behavior detection processing unit 132 detects a predetermined behavior while the seating detection unit 12 detects that the monitored person is seated (C6), the sensor device SUa cancels and deletes the detection of the predetermined behavior and therefore does not transmit the first behavior detection notification communication signal to the management server device SVa. Accordingly, the management server device SVa does not transmit the second behavior detection notification communication signal based on the first signal to the predetermined terminal devices SPa and TAa, and the terminal devices SPa and TAa do not display the monitoring information display screen 51 based on the second signal. The monitored person monitoring support system MSa can thereby issue reports more appropriately.
As described above, the monitored person monitoring support system MSa according to the first embodiment and the monitored person monitoring support method implemented therein process the detection result on the basis of the seating detection result, so the reliability of the detection result can be increased and reports can therefore be issued more appropriately.
The above monitored person monitoring support system MSa and monitored person monitoring support method display the detection result on the terminal devices SPa and TAa when the seating detection result indicates that the monitored person Ob is not seated, and do not display the detection result on the terminal devices SPa and TAa when the seating detection result indicates that the monitored person Ob is seated, so the reliability of the detection result can be increased and reports can therefore be issued more appropriately.
Depending on the nursing level or care level, when the monitored person Ob leaves the bed, the monitoring person, upon receiving the report, goes to the monitored person Ob, for example to assist with excretion, washing, or dressing, or to prevent a fall. When the monitored person Ob falls from the bed or falls down, the monitoring person, upon receiving the report, goes to the monitored person Ob to provide aid. In the above monitored person monitoring support system MSa and monitored person monitoring support method, the predetermined behavior includes at least one of bed departure, a fall from the bed, and a fall-down, so more appropriate reporting reduces the wasted effort caused by false reports and effectively lightens the burden on the monitoring person.
In the embodiment described above, the detection result processing unit 133a is provided in the sensor device SUa, but it may instead be provided in the management server device SVa, or in the terminal devices SPa and TAa.
FIG. 14 is a sequence diagram showing the operation of the monitored person monitoring system according to a first modification of the first embodiment. FIG. 15 is a sequence diagram showing the operation of the monitored person monitoring system according to a second modification of the first embodiment.
When the management server device SVa includes the detection result processing unit, the detection result processing unit 133a is first omitted from the sensor device SUa, and the behavior detection processing unit 132, instead of the detection result processing unit 133a, transmits the first detection result notification communication signal to the management server device SVa. In this case, the first detection result notification communication signal contains not only the sensor ID of the transmitting device, information representing the content of the detected predetermined behavior of the monitored person Ob (in this embodiment, one of bed departure, a fall from the bed, and a fall-down), and the target image used for detecting the predetermined behavior, but also the seating detection result (in this embodiment, either seated or not seated). The management server device SVa then functionally includes in the SV control processing unit 22 not only the SV control unit 221 and the SV monitoring processing unit 222 but also a detection result processing unit. The detection result processing unit in the management server device SVa processes the detection result, contained in the first detection result notification communication signal received from the sensor device SUa as the information representing the content of the predetermined behavior of the monitored person Ob, by different processing methods depending on the seating detection result of the seating detection unit 12 contained in the received first signal. More specifically, the detection result processing unit in the management server device SVa processes the detection result by the first processing method, which displays the detection result on the terminal devices SPa and TAa, when the seating detection result indicates that the monitored person Ob is not seated, and processes the detection result by the second processing method, which does not display the detection result on the terminal devices SPa and TAa, when the seating detection result indicates that the monitored person Ob is seated. More precisely, the second processing method is a processing method that does not transmit the detection result to the terminal devices SPa and TAa. That is, since the SV monitoring processing unit 222 transmits the second behavior detection notification communication signal based on the received first behavior detection notification communication signal to the predetermined terminal devices SPa and TAa as described above, the detection result processing unit in the management server device SVa permits the SV monitoring processing unit 222 to transmit the second behavior detection notification communication signal when the seating detection result indicates that the monitored person Ob is not seated, and prohibits the SV monitoring processing unit 222 from transmitting the second behavior detection notification communication signal when the seating detection result indicates that the monitored person Ob is seated.
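The gating performed on the server side in this first modification can be sketched as follows. The signal is modeled as a plain dictionary, the storage as a list, and the forwarding step as a callback; all of these names are assumptions for illustration, not the actual interfaces.

```python
# A minimal sketch of the first modification: the server always stores the
# monitoring information and forwards the second notification only when the
# seating result carried in the received signal is "not seated".
stored_log = []

def server_handle_first_notification(signal: dict, forward) -> bool:
    stored_log.append(signal)            # C13: always record the monitoring information
    if signal["seated"]:                 # seating detected -> prohibit forwarding
        return False
    forward(signal)                      # not seated -> permit the second notification
    return True

server_handle_first_notification(
    {"sensor_id": "SU-1", "behavior": "bed_departure", "seated": True},
    forward=lambda s: print("forward to terminals:", s["behavior"]),
)
print(len(stored_log), "record(s) kept for later verification of suppressed reports")
```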
In the monitored person monitoring support system MSa according to this first modification, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects that the monitored person is not seated, the processes C1 to C5 shown in FIG. 12 and described above are executed, except that the seating detection result is additionally contained in the first detection result notification communication signal.
On the other hand, in FIG. 14, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects that the monitored person is seated (C11), the sensor device SUa transmits to the management server device SVa the first behavior detection notification communication signal additionally containing "seated" as the seating detection result (C12). Upon receiving this first behavior detection notification communication signal, the management server device SVa stores the monitoring information contained in the received signal (C13) and, because the seating detection result is "seated", prohibits the transmission of the second behavior detection notification communication signal based on the received first signal to the predetermined terminal devices SPa and TAa. Accordingly, the management server device SVa does not transmit the second behavior detection notification communication signal based on the first signal to the predetermined terminal devices SPa and TAa, and the terminal devices SPa and TAa do not display the monitoring information display screen 51 based on the second signal. The monitored person monitoring support system MSa can thereby issue reports more appropriately. Furthermore, because the management server device SVa stores the monitoring information contained in the received first behavior detection notification communication signal in the process C13, the monitoring person (user) can verify the fact that the management server device SVa stopped a false report by checking the stored content of the management server device SVa.
In the above description, the detection result processing unit in the management server device SVa permits or prohibits the transmission of the second behavior detection notification communication signal by the SV monitoring processing unit 222 depending on whether the seating detection result is "not seated" or "seated". Alternatively, like the detection result processing unit 133a described above, the detection result processing unit in the management server device SVa may itself execute the transmission of the second behavior detection notification communication signal in place of the SV monitoring processing unit 222. In that case, like the detection result processing unit 133a described above, the detection result processing unit in the management server device SVa executes or does not execute the transmission of the second behavior detection notification communication signal depending on whether the seating detection result is "not seated" or "seated".
When the terminal devices SPa and TAa include the detection result processing unit, as in the case where the management server device SVa includes the detection result processing unit described above, the detection result processing unit 133a is first omitted from the sensor device SUa, and the behavior detection processing unit 132, instead of the detection result processing unit 133a, transmits to the management server device SVa the first detection result notification communication signal additionally containing the seating detection result. In the management server device SVa, the SV monitoring processing unit 222 stores in the second detection result notification communication signal not only the sensor ID, the information representing the content of the predetermined behavior of the monitored person Ob (in this embodiment, one of bed departure, a fall from the bed, and a fall-down), and the target image contained in the received first behavior detection notification communication signal, but also the seating detection result contained in the received first signal (in this embodiment, either seated or not seated). The terminal devices SPa and TAa then include a detection result processing unit. The detection result processing unit in the terminal devices SPa and TAa processes the detection result, contained in the second detection result notification communication signal received from the management server device SVa as the information representing the content of the predetermined behavior of the monitored person Ob, by different processing methods depending on the seating detection result of the seating detection unit 12 contained in the received second signal. More specifically, the detection result processing unit in the terminal devices SPa and TAa processes the detection result by the first processing method, which displays the detection result on the terminal devices SPa and TAa, when the seating detection result indicates that the monitored person Ob is not seated, and processes the detection result by the second processing method, which does not display the detection result on the terminal devices SPa and TAa, when the seating detection result indicates that the monitored person Ob is seated. More precisely, the second processing method is a processing method that does not display the detection result. That is, the detection result processing unit in the terminal devices SPa and TAa displays the detection result, for example on the monitoring information display screen 51, when the seating detection result indicates that the monitored person Ob is not seated, and does not display the detection result when the seating detection result indicates that the monitored person Ob is seated.
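In this second modification the suppression decision moves to the terminal, which can be sketched as follows. The dictionary fields and print statement are assumptions standing in for the received second notification and the rendering of the monitoring information display screen.

```python
# A minimal sketch of the second modification: the terminal receives the seating
# result inside the second notification and decides locally whether to render
# the monitoring information display screen.
def terminal_handle_second_notification(signal: dict) -> bool:
    if signal["seated"]:        # seating detected -> suppress the screen
        return False
    print(f"display screen 51: {signal['behavior']} in {signal['room']}")
    return True

terminal_handle_second_notification({"behavior": "fall_down", "room": "Room 101", "seated": False})
terminal_handle_second_notification({"behavior": "fall_down", "room": "Room 101", "seated": True})
```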
In the monitored person monitoring support system MSa according to this second modification, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects that the monitored person is not seated, the processes C1 to C5 shown in FIG. 12 and described above are executed, except that the seating detection result is additionally contained in each of the first and second detection result notification communication signals.
On the other hand, in FIG. 15, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects that the monitored person is seated (C21), the sensor device SUa transmits to the management server device SVa the first behavior detection notification communication signal additionally containing "seated" as the seating detection result (C22). Upon receiving this first behavior detection notification communication signal, the management server device SVa stores the monitoring information contained in the received signal (C23) and transmits to the predetermined terminal devices SPa and TAa the second behavior detection notification communication signal additionally containing "seated" as the seating detection result (C24). Upon receiving this second behavior detection notification communication signal, the terminal devices SPa and TAa, because the seating detection result is "seated", do not display the monitoring information display screen 51 based on the received second signal. The monitored person monitoring support system MSa can thereby issue reports more appropriately. Furthermore, because the management server device SVa stores the monitoring information contained in the received first behavior detection notification communication signal in the process C23, the monitoring person (user) can verify the fact that the terminal devices SPa and TAa stopped a false report by checking the stored content of the management server device SVa.
Next, another embodiment will be described.
(Second Embodiment)
FIG. 16 is a flowchart showing the operation of the detection result processing shown in FIG. 7 in the second embodiment. FIG. 17 is a flowchart showing the operation of the mobile terminal device regarding display of the monitoring information display screen in the second embodiment. FIG. 18 is a diagram showing another example of the monitoring information display screen displayed on the mobile terminal device in the monitored person monitoring support system in the second embodiment.
The monitored person monitoring support system MSa according to the first embodiment processes the detection result by different processing methods that determine, on the basis of the seating detection result, whether or not to display the detection result on the terminal devices. In contrast, the monitored person monitoring support system according to the second embodiment processes the detection result by different processing methods that display the detection result on the terminal devices in different display modes on the basis of the seating detection result.
As shown in FIG. 1, for example, the monitored person monitoring support system MSb according to the second embodiment includes one or more sensor devices SUb (SUb-1 to SUb-4), a management server device SVa, a fixed terminal device SPb, one or more mobile terminal devices TAb (TAb-1, TAb-2), and a private branch exchange CX, which are communicably connected via the network NW. The management server device SVa in the monitored person monitoring support system MSb of the second embodiment is the same as the management server device SVa in the monitored person monitoring support system MSa of the first embodiment, and its description is therefore omitted.
As shown in FIG. 2, for example, the sensor device SUb includes an imaging unit 11, a seating detection unit 12, an SU control processing unit 13b, an SU communication IF unit 14, and an SU storage unit 15. The imaging unit 11, seating detection unit 12, SU communication IF unit 14, and SU storage unit 15 in the sensor device SUb of the second embodiment are the same as the imaging unit 11, seating detection unit 12, SU communication IF unit 14, and SU storage unit 15 in the sensor device SUa of the first embodiment, respectively, and their descriptions are therefore omitted.
The SU control processing unit 13b is a circuit for controlling each part of the sensor device SUb according to the function of that part, detecting a preset predetermined behavior related to the monitored person Ob, and transmitting the detection result together with an image to the management server device SVa. In this embodiment, the SU control processing unit 13b detects the seating of the monitored person Ob and, when transmitting the detection result to the management server device SVa, processes the detection result by different processing methods depending on the seating detection result of the seating detection unit 12. By executing the control processing program, the SU control processing unit 13b functionally includes an SU control unit 131, a behavior detection processing unit 132, and a detection result processing unit 133b. The SU control unit 131 and the behavior detection processing unit 132 in the SU control processing unit 13b of the second embodiment are the same as the SU control unit 131 and the behavior detection processing unit 132 in the SU control processing unit 13a of the first embodiment, respectively, and their descriptions are therefore omitted.
The detection result processing unit 133b processes the detection result of the behavior detection processing unit 132 by different processing methods depending on the seating detection result of the seating detection unit 12. More specifically, in this embodiment, the detection result processing unit 133b processes the detection result by the third processing method, which displays the detection result on the terminal devices SPb and TAb in a predetermined first display mode, when the seating detection result indicates that the monitored person Ob is not seated, and processes the detection result by the fourth processing method, which displays the detection result on the terminal devices SPb and TAb in a predetermined second display mode different from the first display mode, when the seating detection result indicates that the monitored person Ob is seated. The first and second display modes represent the degree of reliability of the detection result, and the second display mode represents a lower degree of reliability than the first display mode. More precisely, in this embodiment, the predetermined behavior includes not only bed departure, a fall from the bed, and a fall-down, but also suspected bed departure, in which the monitored person Ob is suspected of having left the bedding (the monitored person Ob may have left the bedding), a suspected fall from the bed, in which the monitored person Ob is suspected of having fallen from the bedding (the monitored person Ob may have fallen from the bedding), and a suspected fall-down, in which the monitored person Ob is suspected of having fallen down (the monitored person Ob may have fallen down). Suspected bed departure is a less reliable detection result than bed departure, a suspected fall from the bed is a less reliable detection result than a fall from the bed, and a suspected fall-down is a less reliable detection result than a fall-down. In the third processing method, when the seating detection result indicates that the monitored person Ob is not seated, the detection result processing unit 133b confirms the detection result as it is and causes the terminal devices SPb and TAb to display it in the first display mode, which displays the detection result as it is. That is, in the third processing method, when the behavior detection processing unit 132 detects the predetermined behavior and the seating detection result indicates that the monitored person Ob is not seated, the detection result processing unit 133b transmits the first behavior detection notification communication signal to the management server device SVa through the SU communication IF unit 14. In this embodiment, in the first display mode, the detected predetermined behavior of the monitored person Ob is displayed, for example, on the monitoring information display screen 51 shown in FIG. 13 described above. In the fourth processing method, when the seating detection result indicates that the monitored person Ob is seated, the detection result processing unit 133b changes the detection result into a suspected detection result and causes the terminal devices SPb and TAb to display it in the second display mode, which displays the suspected detection result. In this embodiment, when the detection result is bed departure and the seating detection result indicates that the monitored person Ob is seated, the detection result processing unit 133b changes the bed departure into suspected bed departure; when the detection result is a fall from the bed, it changes the fall from the bed into a suspected fall from the bed; and when the detection result is a fall-down, it changes the fall-down into a suspected fall-down. That is, in the fourth processing method, when the behavior detection processing unit 132 detects the predetermined behavior and the seating detection result indicates that the monitored person Ob is seated, the detection result processing unit 133b transmits the first behavior detection notification communication signal to the management server device SVa through the SU communication IF unit 14. In the second display mode, the detected predetermined behavior of the monitored person Ob is displayed, for example, on the monitoring information display screens 52 and 53 shown in FIG. 18 described later, and by using a display mode different from the first display mode, the monitoring information display screens 52 and 53 indicate that the detection result is a relatively less reliable detection result. The first behavior detection notification communication signal contains, for example, the sensor ID of the transmitting device, information representing the content of the detected predetermined behavior of the monitored person Ob (in this embodiment, one of bed departure, a fall from the bed, a fall-down, suspected bed departure, a suspected fall from the bed, and a suspected fall-down), and the target image used for detecting the predetermined behavior. In the third processing method, the first behavior detection notification communication signal contains one of bed departure, a fall from the bed, and a fall-down, and in the fourth processing method, it contains one of suspected bed departure, a suspected fall from the bed, and a suspected fall-down.
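The essential difference between the third and fourth processing methods is that the detection result is kept as-is or downgraded to its "suspected" counterpart. This can be sketched as follows; the label strings are assumptions chosen only to make the mapping concrete.

```python
# A minimal sketch of how the detection result processing unit 133b could
# downgrade a result to its "suspected" counterpart when the monitored person
# is seated, instead of discarding it as in the first embodiment.
SUSPECTED = {
    "bed_departure": "suspected_bed_departure",
    "fall_from_bed": "suspected_fall_from_bed",
    "fall_down": "suspected_fall_down",
}

def resolve_result(behavior: str, seated: bool) -> str:
    # Seated -> lower confidence: report the suspected variant rather than dropping it.
    return SUSPECTED[behavior] if seated else behavior

print(resolve_result("fall_down", seated=True))    # -> 'suspected_fall_down'
print(resolve_result("fall_down", seated=False))   # -> 'fall_down'
```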
The fixed terminal device SPb and the mobile terminal device TAb are the same as the fixed terminal device SPa and the mobile terminal device TAa in the monitored person monitoring support system MSa of the first embodiment, respectively, except that when the second behavior detection notification communication signal received from the management server device SVa contains a suspected detection result (in this embodiment, one of suspected bed departure, a suspected fall from the bed, and a suspected fall-down), they display the suspected detection result in the second display mode, which displays the suspected detection result. Their descriptions are therefore omitted.
In the monitored person monitoring support system MSb of the second embodiment, the sensor device SUb, like the sensor device SUa of the first embodiment, detects the predetermined behavior of the monitored person Ob by sequentially executing, for each frame or every several frames, the image acquisition processing S1, the extraction processing S2 for the person area and the like, the detection processing S3 for the predetermined behavior, and the detection result processing S5 executed as described below, all shown in FIG. 7.
In the detection result processing S5 of this second embodiment, as shown in FIG. 16, the detection result processing unit 133b first acquires the seating detection result of the seating detection unit 12 (S51).
Next, the detection result processing unit 133b determines whether the monitored person Ob is seated (S52). When the result of this determination is that the seating detection result acquired in the processing S51 indicates that the monitored person Ob is seated (Yes), the detection result processing unit 133b executes the processing S53, then the processing S55, and ends the processing S5. On the other hand, when the seating detection result acquired in the processing S51 indicates that the monitored person Ob is not seated (No), the detection result processing unit 133b executes the processing S54, then the processing S55, and ends the processing S5.
In the processing S53, in order to process the detection result by the fourth processing method, which causes the terminal devices SPb and TAb to display the detection result in the second display mode indicating a relatively less reliable detection result, the detection result processing unit 133b changes the detection result detected in the processing S3 into a suspected detection result. That is, when the detection result of the processing S3 is bed departure, a fall from the bed, or a fall-down, the detection result processing unit 133b changes it into suspected bed departure, a suspected fall from the bed, or a suspected fall-down, respectively. If the seating detection unit 12 has detected the seating of the monitored person Ob at the time the sensor device SUb detects the predetermined behavior, the monitored person Ob is seated, so the detection result of the predetermined behavior, such as bed departure, a fall from the bed, or a fall-down, is less reliable than when the seating detection unit 12 has not detected the seating of the monitored person Ob, and the detection result can be corrected by the seating detection result of the seating detection unit 12. In particular, when the predetermined behavior is detected on the basis of an image, the seating detection result is considered more reliable than the image-based detection result, so the detection result can be corrected more appropriately by the seating detection result.
In the processing S54, in order to process the detection result by the third processing method, which causes the terminal devices SPb and TAb to display the detection result in the first display mode indicating a relatively more reliable detection result, the detection result processing unit 133b confirms the detection result detected in the processing S3 as it is (determines it as the final detection result).
Then, following the processing S53 or the processing S54, in the processing S55 the detection result processing unit 133b transmits the first detection result notification communication signal to the management server device SVa through the SU communication IF unit 14. That is, in the processing S55 executed after the processing S53, the first detection result notification communication signal contains the sensor ID of the transmitting device, one of suspected bed departure, a suspected fall from the bed, and a suspected fall-down, and the target image. On the other hand, in the processing S55 executed after the processing S54, the first detection result notification communication signal contains the sensor ID of the transmitting device, one of bed departure, a fall from the bed, and a fall-down, and the target image.
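The overall flow of S51 to S55 can be summarized as in the following self-contained sketch; the dictionary layout stands in for the first detection result notification communication signal and is an assumption, not the actual signal format.

```python
# A minimal sketch of S51-S55: keep or downgrade the detection depending on the
# seating result, then package the first detection result notification signal.
def build_first_notification(sensor_id: str, behavior: str, seated: bool, image: bytes) -> dict:
    # S53: downgrade to the "suspected" variant when seated; S54: keep as-is otherwise.
    label = f"suspected_{behavior}" if seated else behavior
    # S55: package the sensor ID, the (possibly downgraded) behavior, and the target image.
    return {"sensor_id": sensor_id, "behavior": label, "image": image}

print(build_first_notification("SU-1", "bed_departure", seated=True, image=b"\x00" * 16))
```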
Upon receiving the first behavior detection notification communication signal from the sensor device SUb, the management server device SVa, substantially as in the first embodiment, stores the monitoring information contained in the received signal and transmits the second behavior detection notification communication signal based on the received first signal to the predetermined terminal devices SPb and TAb. Here, in the second embodiment, the predetermined behavior includes not only bed departure, a fall from the bed, and a fall-down but also suspected bed departure, a suspected fall from the bed, and a suspected fall-down, so the second behavior detection notification communication signal contains, as the information representing the content of the predetermined behavior of the monitored person Ob detected by the sensor device SUb, one of bed departure, a fall from the bed, a fall-down, suspected bed departure, a suspected fall from the bed, and a suspected fall-down.
Upon receiving this second behavior detection notification communication signal, the terminal devices SPb and TAb display the monitoring information contained in the received signal on the monitoring information display screen. More specifically, as shown in FIG. 17, the terminal devices SPb and TAb determine what the detection result detected by the sensor device SUb is (S61); that is, they determine what the information representing the content of the predetermined behavior contained in the received second behavior detection notification communication signal is. When the result of this determination is that the information representing the content of the predetermined behavior is one of bed departure, a fall from the bed, and a fall-down, the terminal devices SPb and TAb execute the processing S63, which displays the detection result in the first display mode, and then end this processing. On the other hand, when the information representing the content of the predetermined behavior is one of suspected bed departure, a suspected fall from the bed, and a suspected fall-down, the terminal devices SPb and TAb execute the processing S62, which displays the suspected detection result in the second display mode, and then end this processing.
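The branch of S61 on the terminal side reduces to selecting a display mode from the behavior label, which can be sketched as follows; the label and mode names are assumptions used for illustration.

```python
# A minimal sketch of the branch in S61: pick the display mode from the behavior
# label carried in the second notification.
SUSPECTED_LABELS = {"suspected_bed_departure", "suspected_fall_from_bed", "suspected_fall_down"}

def choose_display_mode(behavior: str) -> str:
    # Suspected results go to the second display mode (lower reliability), others to the first.
    return "second_display_mode" if behavior in SUSPECTED_LABELS else "first_display_mode"

print(choose_display_mode("fall_down"))               # -> 'first_display_mode'  (S63, screen 51)
print(choose_display_mode("suspected_fall_down"))     # -> 'second_display_mode' (S62, screens 52/53)
```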
In the processing S63, the terminal devices SPb and TAb display the monitoring information contained in the received second behavior detection notification communication signal, for example on the monitoring information display screen 51 shown in FIG. 13. By displaying bed departure, a fall from the bed, or a fall-down as plain text, the monitoring information display screen 51 indicates that the detection result is a relatively more reliable detection result compared with the monitoring information display screens 52 and 53 shown in FIG. 18.
 一方、前記処理S63では、端末装置SPb、TAbは、前記受信した第2行動検知通知通信信号に収容された監視情報を、例えば図18Aに示す監視情報表示画面52、あるいは、図18Bに示す監視情報表示画面53で表示する。この図18に示す例では、監視情報表示画面52、53は、携帯端末装置TAbに表示される。図18Aに示す監視情報表示画面52には、所定の行動を検知した居室RMの居室名、前記検知した所定の行動の名称、および、エクスクラメーションマーク(!)が表示されている。この監視情報表示画面52では、前記居室RMの居室名における各文字および前記行動の名称における各文字の、文字属性(フォントの種類、フォントの大きさ、文字色および表示濃度(表示輝度))を第1表示態様の監視情報表示画面51で表される各文字の文字属性と変えることで、さらに、エクスクラメーションマーク(!)等の、相対的に低い信頼性を表す、監視者(ユーザ)に注意を促す特定のアイコンを表示することで、図13に示す監視情報表示画面51と較べて、検知結果が相対的に信頼性の低い検知結果であることを表している。図18Aに示す監視情報表示画面52に表示される各文字は、図13に示す監視情報表示画面51と較べて、薄い表示濃度で表示されている。図18Bに示す監視情報表示画面53には、図13に示す監視情報表示画面51と同様に、所定の行動を検知した居室RMの居室名、前記検知した所定の行動の名称、および、前記所定の行動を検知した際の対象画像が表示されている。しかしながら、この監視情報表示画面53では、各文字の文字属性を第1表示態様の監視情報表示画面51で表される各文字の文字属性と変えることで、さらに、対象画像の表示サイズ(解像度)を変えることで、図13に示す監視情報表示画面51と較べて、検知結果が相対的に信頼性の低い検知結果であることを表している。図18Bに示す監視情報表示画面53に表示される各文字は、図13に示す監視情報表示画面51と較べて、ポイント数のより小さい文字で表示され、対象画像は、図13に示す監視情報表示画面51と較べて、小さい表示サイズ(低解像度)で表示されている。さらに、図18に示す監視情報表示画面52、53では、「疑い」というテキストを表示することでも、図13に示す監視情報表示画面51と較べて、検知結果が相対的に信頼性の低い検知結果であることを表している。このように表示態様を変えることで、監視者(ユーザ)は、検知結果の信頼性を視覚で認識可能となる。 On the other hand, in the process S63, the terminal devices SPb and TAb may instead display the monitoring information contained in the received second behavior detection notification communication signal on, for example, the monitoring information display screen 52 shown in FIG. 18A or the monitoring information display screen 53 shown in FIG. 18B. In the example shown in FIG. 18, the monitoring information display screens 52 and 53 are displayed on the mobile terminal device TAb. The monitoring information display screen 52 shown in FIG. 18A displays the room name of the room RM in which the predetermined behavior was detected, the name of the detected predetermined behavior, and an exclamation mark (!). On this monitoring information display screen 52, the character attributes (font type, font size, character color and display density (display luminance)) of each character in the room name of the room RM and in the name of the behavior are changed from the character attributes used on the monitoring information display screen 51 of the first display mode, and a specific icon, such as the exclamation mark (!), that indicates relatively low reliability and calls the attention of the observer (user) is additionally displayed; in this way the screen indicates that the detection result is relatively less reliable than on the monitoring information display screen 51 shown in FIG. 13. Each character displayed on the monitoring information display screen 52 shown in FIG. 18A is displayed with a lighter display density than on the monitoring information display screen 51 shown in FIG. 13. Like the monitoring information display screen 51 shown in FIG. 13, the monitoring information display screen 53 shown in FIG. 18B displays the room name of the room RM in which the predetermined behavior was detected, the name of the detected predetermined behavior, and the target image captured when the predetermined behavior was detected. However, on this monitoring information display screen 53, the character attributes of each character are changed from those of the monitoring information display screen 51 of the first display mode, and the display size (resolution) of the target image is also changed, thereby indicating that the detection result is relatively less reliable than on the monitoring information display screen 51 shown in FIG. 13. Each character displayed on the monitoring information display screen 53 shown in FIG. 18B is displayed in a smaller point size than on the monitoring information display screen 51 shown in FIG. 13, and the target image is displayed in a smaller display size (lower resolution) than on the monitoring information display screen 51 shown in FIG. 13. Furthermore, the monitoring information display screens 52 and 53 shown in FIG. 18 also indicate, by displaying the text "suspected", that the detection result is relatively less reliable than on the monitoring information display screen 51 shown in FIG. 13. By changing the display mode in this way, the observer (user) can visually recognize the reliability of the detection result.
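 As a minimal illustration of the display-mode switching described above, a terminal-side routine might choose the character attributes, icon, image size and "suspected" label from the seating detection result roughly as follows. The function and field names (DisplayStyle, choose_display_style, render) and the concrete attribute values are assumptions for illustration, not part of the embodiment.

```python
# Hypothetical sketch of the terminal-side display-mode selection described above.
from dataclasses import dataclass

@dataclass
class DisplayStyle:
    font_point: int          # character size
    density: float           # display density (1.0 = normal, lower = lighter)
    show_exclamation: bool   # "!" icon indicating lower reliability
    image_scale: float       # display size (resolution) of the target image
    suffix: str              # extra text appended to the behavior name

def choose_display_style(seated: bool) -> DisplayStyle:
    if not seated:
        # First display mode (screen 51): behavior name shown as-is -> higher reliability
        return DisplayStyle(font_point=24, density=1.0,
                            show_exclamation=False, image_scale=1.0, suffix="")
    # Second display mode (screens 52/53): smaller, lighter text, "!" icon,
    # reduced image size and a "suspected" label -> lower reliability
    return DisplayStyle(font_point=16, density=0.5,
                        show_exclamation=True, image_scale=0.5, suffix=" (suspected)")

def render(room_name: str, behavior: str, seated: bool) -> str:
    style = choose_display_style(seated)
    icon = "! " if style.show_exclamation else ""
    return f"{icon}{room_name}: {behavior}{style.suffix}"

print(render("Room 101", "fall", seated=True))   # "! Room 101: fall (suspected)"
```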
 第2実施形態における被監視者監視支援システムMSbおよびこれに実装された被監視者監視支援方法は、前記検知結果を、前記着座検知結果に基づいてその表示態様を変えるので、その表示態様から、前記着座検知結果に基づく前記検知結果の信頼性を認識でき、したがって、前記検知結果の信頼性を伴ってより適切に発報できる。 Because the monitored person monitoring support system MSb according to the second embodiment and the monitored person monitoring support method implemented therein change the display mode of the detection result based on the seating detection result, the reliability of the detection result based on the seating detection result can be recognized from the display mode, and therefore notification can be issued more appropriately together with the reliability of the detection result.
 なお、上述では、端末装置SPb、TAbは、前記着座検知結果の着座無しまたは着座有りに応じて第1表示態様または第2表示態様で表示したが、これに代え、あるいは、これに追加して、相対的に大きい音の通知音または相対的に小さい音の通知音を出力しても良い。さらに、これらに代え、あるいは、これらに追加して、端末装置SPb、TAbは、相対的に長い時間長の通知音または相対的に短い時間長の通知音を出力しても良い。 In the above description, the terminal devices SPb and TAb display the monitoring information in the first display mode or the second display mode depending on whether the seating detection result indicates no seating or seating; instead of, or in addition to, this, they may output a relatively loud notification sound or a relatively quiet notification sound. Furthermore, instead of, or in addition to, these, the terminal devices SPb and TAb may output a notification sound of relatively long duration or a notification sound of relatively short duration.
 図19は、第2実施形態における変形形態を説明するための図である。図19Aは、着座検知結果が着座無しである場合を示し、図19Bは、着座検知結果が着座ありである場合を示す。 FIG. 19 is a view for explaining a modification of the second embodiment. FIG. 19A shows the case where the seating detection result shows no seating, and FIG. 19B shows the case where the seating detection result shows seating.
 例えば、センサ装置SUbが着座検知部12で着座無しを検知したために、端末装置SPb、TAbが、前記受信した第2行動検知通知通信信号に収容された監視情報を、例えば図13に示す監視情報表示画面51で表示する場合では、図19Aに示すように、端末装置SPb、TAbは、この監視情報表示画面51を表示するとともに、相対的に大きく長い時間長の通知音(例えば着信メロディ等)を出力する。一方、センサ装置SUbが着座検知部12で着座有りを検知したために、端末装置SPb、TAbが、前記受信した第2行動検知通知通信信号に収容された監視情報を、例えば図18Bに示す監視情報表示画面53で表示する場合では、図19Bに示すように、端末装置SPb、TAbは、この監視情報表示画面53を表示するとともに、相対的に小さく短い時間長の通知音(例えば「ピピッ」等)を出力する。 For example, when the sensor device SUb has detected no seating with the seating detection unit 12 and the terminal devices SPb and TAb therefore display the monitoring information contained in the received second behavior detection notification communication signal on, for example, the monitoring information display screen 51 shown in FIG. 13, the terminal devices SPb and TAb display this monitoring information display screen 51 and also output a relatively loud notification sound of long duration (for example, a ringtone-like melody), as shown in FIG. 19A. On the other hand, when the sensor device SUb has detected seating with the seating detection unit 12 and the terminal devices SPb and TAb therefore display the monitoring information contained in the received second behavior detection notification communication signal on, for example, the monitoring information display screen 53 shown in FIG. 18B, the terminal devices SPb and TAb display this monitoring information display screen 53 and also output a relatively quiet notification sound of short duration (for example, a brief beep), as shown in FIG. 19B.
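 The corresponding sound variation could be sketched as follows. Only the loud/long versus quiet/short distinction comes from the text; the sound names, the volume scale and the durations below are assumptions.

```python
# Hypothetical sketch of the notification-sound variation of the second embodiment.
from dataclasses import dataclass

@dataclass
class NotificationSound:
    name: str
    volume: int          # arbitrary 0-100 scale (assumed)
    duration_sec: float

def choose_notification_sound(seated: bool) -> NotificationSound:
    if not seated:
        # No seating detected: relatively loud, long sound (e.g. a ringtone-like melody)
        return NotificationSound(name="melody", volume=90, duration_sec=10.0)
    # Seating detected: relatively quiet, short sound (e.g. a brief beep)
    return NotificationSound(name="beep", volume=40, duration_sec=0.5)

print(choose_notification_sound(seated=False))
print(choose_notification_sound(seated=True))
```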
 次に、別の実施形態について説明する。 Next, another embodiment will be described.
(第3実施形態)
 第1実施形態における被監視者監視支援システムMSaは、前記検知結果を、前記着座検知結果に基づいて端末装置に表示させるか否かの異なる処理方法で処理したが、第3実施形態における被監視者監視支援システムは、前記検知結果を、前記着座検知結果に基づいて管理サーバ装置に管理させるか否かの異なる処理方法で処理するものである。
Third Embodiment
The monitored person monitoring support system MSa in the first embodiment processes the detection result with different processing methods that determine, based on the seating detection result, whether or not to display the detection result on the terminal device, whereas the monitored person monitoring support system in the third embodiment processes the detection result with different processing methods that determine, based on the seating detection result, whether or not to have the management server device manage the detection result.
 このような第3実施形態における被監視者監視支援システムMScは、例えば、図1に示すように、1または複数のセンサ装置SUc(SUc-1~SUc-4)と、管理サーバ装置SVcと、固定端末装置SPaと、1または複数の携帯端末装置TAa(TAa-1、TAa-2)と、構内交換機CXとを備え、これらは、ネットワークNWを介して通信可能に接続される。この第3実施形態の被監視者監視支援システムMScにおける固定端末装置SPaおよび携帯端末装置TAaは、それぞれ、第1実施形態の被監視者監視支援システムMSaにおける固定端末装置SPaおよび携帯端末装置TAaと同様であるので、その説明を省略する。 As shown in FIG. 1, for example, the monitored person monitoring support system MSc according to the third embodiment includes one or more sensor devices SUc (SUc-1 to SUc-4), a management server device SVc, a fixed terminal device SPa, one or more mobile terminal devices TAa (TAa-1, TAa-2), and a private branch exchange CX, which are communicably connected via the network NW. The fixed terminal device SPa and the mobile terminal devices TAa in the monitored person monitoring support system MSc of the third embodiment are the same as the fixed terminal device SPa and the mobile terminal devices TAa in the monitored person monitoring support system MSa of the first embodiment, respectively, and therefore their description is omitted.
 この第3実施形態の被監視者監視支援システムMScにおけるセンサ装置SUcは、例えば、図2に示すように、撮像部11と、着座検知部12と、SU制御処理部13cと、SU通信IF部14と、SU記憶部15とを備える。これら第3実施形態のセンサ装置SUcにおける撮像部11、着座検知部12、SU通信IF部14およびSU記憶部15は、それぞれ、第1実施形態のセンサ装置SUaにおける撮像部11、着座検知部12、SU通信IF部14およびSU記憶部15と同様であるので、その説明を省略する。 The sensor device SUc in the monitored person monitoring support system MSc of the third embodiment is, for example, as shown in FIG. 2, an imaging unit 11, a seating detection unit 12, an SU control processing unit 13c, and an SU communication IF unit 14 and an SU storage unit 15. The imaging unit 11, the seating detection unit 12, the SU communication IF unit 14 and the SU storage unit 15 in the sensor device SUc according to the third embodiment respectively correspond to the imaging unit 11 and the seating detection unit 12 in the sensor device SUa according to the first embodiment. Since the SU communication IF unit 14 and the SU storage unit 15 are the same, the description thereof is omitted.
 SU制御処理部13cは、センサ装置SUcの各部を当該各部の機能に応じてそれぞれ制御し、予め設定された被監視者Obに関わる所定の行動を検知してその検知結果を画像とともに管理サーバ装置SVcへ送信するための回路である。そして、本実施形態では、SU制御処理部13cは、前記被監視者Obの着座を検知し、前記検知結果を管理サーバ装置SVcへ送信する際に、前記検知結果を、着座検知部12の着座検知結果に基づいて異なる処理方法で処理する。SU制御処理部13cは、制御処理プログラムが実行されることによって、SU制御部131、行動検知処理部132および検知結果処理部133cを機能的に備える。これら第3実施形態のSU制御処理部13cにおけるSU制御部131および行動検知処理部132は、それぞれ、第1実施形態のSU制御処理部13aにおけるSU制御部131および行動検知処理部132と同様であるので、その説明を省略する。 The SU control processing unit 13c is a circuit that controls each unit of the sensor device SUc in accordance with the function of that unit, detects a predetermined behavior related to the monitored person Ob set in advance, and transmits the detection result, together with an image, to the management server device SVc. In the present embodiment, the SU control processing unit 13c also detects seating of the monitored person Ob and, when transmitting the detection result to the management server device SVc, processes the detection result with different processing methods based on the seating detection result of the seating detection unit 12. By executing the control processing program, the SU control processing unit 13c functionally includes an SU control unit 131, a behavior detection processing unit 132 and a detection result processing unit 133c. The SU control unit 131 and the behavior detection processing unit 132 in the SU control processing unit 13c of the third embodiment are the same as the SU control unit 131 and the behavior detection processing unit 132 in the SU control processing unit 13a of the first embodiment, respectively, and therefore their description is omitted.
 検知結果処理部133cは、行動検知処理部132の検知結果を、着座検知部12の着座検知結果に基づいて異なる処理方法で処理するものである。より具体的には、本実施形態では、検知結果処理部133cは、前記着座検知結果が被監視者Obの着座ではない場合に検知結果を管理サーバ装置SVcに記憶して管理させる第5処理方法で処理し、前記着座検知結果が前記被監視者Obの着座である場合に前記検知結果を前記管理サーバ装置SVcに記憶させずに管理させない第6処理方法で処理する。より詳しくは、前記第6処理方法は、前記検知結果を管理サーバ装置SVcに送信しない処理方法である。すなわち、前記第5処理方法では、行動検知処理部132が前記所定の行動を検知すると、前記着座検知結果が被監視者Obの着座ではない場合に、検知結果処理部133cは、第1行動検知通知通信信号をSU通信IF部14で管理サーバ装置SVcへ送信する。前記第6処理方法では、行動検知処理部132が前記所定の行動を検知しても、前記着座検知結果が被監視者Obの着座である場合に、検知結果処理部133cは、前記第1行動検知通知通信信号を送信しない。 The detection result processing unit 133c processes the detection result of the behavior detection processing unit 132 with different processing methods based on the seating detection result of the seating detection unit 12. More specifically, in the present embodiment, the detection result processing unit 133c processes the detection result with a fifth processing method, which causes the management server device SVc to store and manage the detection result, when the seating detection result is not seating of the monitored person Ob, and processes the detection result with a sixth processing method, which does not cause the management server device SVc to store or manage the detection result, when the seating detection result is seating of the monitored person Ob. More specifically, the sixth processing method is a processing method that does not transmit the detection result to the management server device SVc. That is, in the fifth processing method, when the behavior detection processing unit 132 detects the predetermined behavior and the seating detection result is not seating of the monitored person Ob, the detection result processing unit 133c transmits the first behavior detection notification communication signal to the management server device SVc through the SU communication IF unit 14. In the sixth processing method, even when the behavior detection processing unit 132 detects the predetermined behavior, if the seating detection result is seating of the monitored person Ob, the detection result processing unit 133c does not transmit the first behavior detection notification communication signal.
 したがって、結果的に、検知結果処理部133cは、検知結果処理部133aで実現できている((検知結果処理部133c)=(検知結果処理部133a))。このため、第3実施形態における被監視者監視支援システムMScは、第1実施形態におけるセンサ装置SUaおよび管理サーバ装置SVaそれぞれをセンサ装置SUcおよび管理サーバ装置SVcそれぞれとして用いることで実現できる。このため、管理サーバ装置SVcの説明を省略する。 Therefore, as a result, the detection result processing unit 133c can be realized by the detection result processing unit 133a ((detection result processing unit 133c) = (detection result processing unit 133a)). Therefore, the monitored person monitoring support system MSc in the third embodiment can be realized by using each of the sensor device SUa and the management server device SVa in the first embodiment as the sensor device SUc and the management server device SVc. Therefore, the description of the management server device SVc is omitted.
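 A minimal sketch of the sensor-side branching described above (the fifth and sixth processing methods): only the send/suppress decision reflects the text; the payload layout and the function names (build_notification, process_detection) are assumptions for illustration.

```python
# Hypothetical sketch of the detection result processing unit 133c on the sensor device:
# transmit the first behavior detection notification only when no seating is detected.
from typing import Optional

def build_notification(sensor_id: str, behavior: str, image: bytes) -> dict:
    # Assumed payload layout; the actual communication signal format is not specified here.
    return {"sensor_id": sensor_id, "behavior": behavior, "image": image}

def process_detection(sensor_id: str, behavior: str, image: bytes,
                      seated: bool) -> Optional[dict]:
    if seated:
        # Sixth processing method: the monitored person is seated, so the detection is
        # likely a false positive -> do not transmit to the management server device.
        return None
    # Fifth processing method: transmit so the server stores and manages the result.
    return build_notification(sensor_id, behavior, image)

signal = process_detection("SU-1", "leaving bed", b"...", seated=True)
print(signal)  # None: nothing is sent while the person is seated
```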
 第3実施形態における被監視者監視支援システムMScおよびこれに実装された被監視者監視支援方法は、前記検知結果を、前記着座検知結果に基づいてその記憶の実行か不実行を制御するので、より信頼性の高い検知結果のみを記憶して管理できる。 Because the monitored person monitoring support system MSc according to the third embodiment and the monitored person monitoring support method implemented therein control, based on the seating detection result, whether or not the detection result is stored, only the more reliable detection results can be stored and managed.
 なお、上述の実施形態では、検知結果処理部133cは、センサ装置SUcに備えられたが、センサ装置SUcに代えて管理サーバ装置SVcに備えられても良い。 In the above embodiment, the detection result processing unit 133c is provided in the sensor device SUc, but may be provided in the management server device SVc instead of the sensor device SUc.
 図20は、第3実施形態において、変形形態の被監視者監視システムの動作を示すシーケンス図である。 FIG. 20 is a sequence diagram showing the operation of the monitored person monitoring system of the modification in the third embodiment.
 管理サーバ装置SVcが検知結果処理部を備える場合では、まず、センサ装置SUcから検知結果処理部133cが省略され、検知結果処理部133cに代え行動検知処理部132が第1検知結果通知通信信号を管理サーバ装置SVcへ送信する。この場合では、前記第1検知結果通知通信信号には、自機のセンサID、検知した被監視者Obにおける所定の行動の内容を表す情報(本実施形態では、離床、転落および転倒のうちのいずれか)、および、前記所定の行動の検知に用いられた対象画像が収容されるだけでなく、さらに、着座検知結果(本実施形態では着座有りおよび着座無しのうちのいずれか)も収容される。そして、管理サーバ装置SVcは、SV制御処理部22に、SV制御部221およびSV監視処理部222を機能的に備えるだけでなく、さらに、検知結果処理部を機能的に備える。この管理サーバ装置SVcにおける前記検知結果処理部は、センサ装置SUcから受信した第1検知結果通知通信信号に被監視者Obにおける所定の行動の内容を表す情報として収容された検知結果を、前記受信した第1検知結果通知通信信号に収容された着座検知部12の着座検知結果に基づいて異なる処理方法で処理するものである。より具体的には、管理サーバ装置SVcにおける前記検知結果処理部は、前記着座検知結果が被監視者Obの着座ではない場合(着座無しの場合)に検知結果を管理サーバ装置SVcに記憶して管理させる第5処理方法で処理し、前記着座検知結果が前記被監視者Obの着座である場合(着座有りの場合)に前記検知結果を前記管理サーバ装置SVcに記憶させずに管理させない第6処理方法で処理する。すなわち、上述では、SV監視処理部222が前記受信した第1行動検知通知通信信号に収容された監視情報をSV記憶部23に記憶して管理するので、管理サーバ装置SVcにおける前記検知結果処理部は、前記着座検知結果が被監視者Obの着座ではない場合(着座なしの場合)に、SV監視処理部222による監視情報の記憶および管理を許可し、前記着座検知結果が被監視者Obの着座である場合(着座有りの場合)に、SV監視処理部222による監視情報の記憶および管理を禁止し、前記監視情報を破棄する。 When the management server device SVc includes the detection result processing unit, the detection result processing unit 133c is first omitted from the sensor device SUc, and the behavior detection processing unit 132, in place of the detection result processing unit 133c, transmits the first detection result notification communication signal to the management server device SVc. In this case, the first detection result notification communication signal contains not only the sensor ID of the own device, information representing the content of the detected predetermined behavior of the monitored person Ob (in the present embodiment, one of leaving bed, falling from bed and falling down), and the target image used to detect the predetermined behavior, but also the seating detection result (in the present embodiment, either seating or no seating). The management server device SVc then functionally includes, in the SV control processing unit 22, not only the SV control unit 221 and the SV monitoring processing unit 222 but also a detection result processing unit. The detection result processing unit of the management server device SVc processes the detection result, contained in the first detection result notification communication signal received from the sensor device SUc as the information representing the content of the predetermined behavior of the monitored person Ob, with different processing methods based on the seating detection result of the seating detection unit 12 contained in the received first detection result notification communication signal. More specifically, the detection result processing unit of the management server device SVc processes the detection result with the fifth processing method, which causes the management server device SVc to store and manage the detection result, when the seating detection result is not seating of the monitored person Ob (no seating), and processes the detection result with the sixth processing method, which does not cause the management server device SVc to store or manage the detection result, when the seating detection result is seating of the monitored person Ob (seating). That is, because the SV monitoring processing unit 222 stores and manages, in the SV storage unit 23, the monitoring information contained in the received first behavior detection notification communication signal as described above, the detection result processing unit of the management server device SVc permits the storage and management of the monitoring information by the SV monitoring processing unit 222 when the seating detection result is not seating of the monitored person Ob (no seating), and prohibits the storage and management of the monitoring information by the SV monitoring processing unit 222 and discards the monitoring information when the seating detection result is seating of the monitored person Ob (seating).
 このような変形形態における被監視者監視支援システムMScでは、行動検知処理部132によって所定の行動が検知されるとともに着座検知部12によって着座なしが検知された場合、前記第1検知結果通知通信信号に、さらに、前記着座検知結果が収容される点を除き、図12に示す、上述した処理C1ないし処理C5の各処理が実行される。 In the monitored person monitoring support system MSc of this modification, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects no seating, the processes C1 to C5 described above and shown in FIG. 12 are executed, except that the first detection result notification communication signal additionally contains the seating detection result.
 一方、図20において、行動検知処理部132によって所定の行動が検知されるとともに着座検知部12によって着座有りが検知されると(C31)、センサ装置SUcは、前記着座検知結果として着座有りをさらに収容した第1行動検知通知通信信号を管理サーバ装置SVcへ送信する(C32)。管理サーバ装置SVcは、この第1行動検知通知通信信号を受信すると、この受信した第1行動検知通知通信信号に収容された着座有りの着座検知結果により、監視情報を破棄し、記憶せずに管理しない(C33)。 On the other hand, in FIG. 20, when the behavior detection processing unit 132 detects a predetermined behavior and the seating detection unit 12 detects seating (C31), the sensor device SUc transmits to the management server device SVc a first behavior detection notification communication signal that additionally contains "seating" as the seating detection result (C32). On receiving this first behavior detection notification communication signal, the management server device SVc discards the monitoring information on the basis of the "seating" seating detection result contained in the received first behavior detection notification communication signal, and neither stores nor manages it (C33).
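 A sketch of the server-side variant shown in FIG. 20: the notification now also carries the seating detection result, and the server decides whether to keep or discard the monitoring information. The field names, the in-memory list standing in for the SV storage unit, and the function name handle_notification are assumptions; only the keep-or-discard rule follows the text.

```python
# Hypothetical sketch of a detection result processing unit placed in the management server.
monitoring_db = []  # stands in for the SV storage unit 23

def handle_notification(signal: dict) -> bool:
    """Return True if the monitoring information was stored and will be managed."""
    if signal.get("seated"):
        # Seating detected: discard the monitoring information without storing it (C33).
        return False
    # No seating: store and manage, then the usual notification flow to terminals follows.
    monitoring_db.append(signal)
    return True

print(handle_notification({"sensor_id": "SU-1", "behavior": "fall", "seated": True}))   # False
print(handle_notification({"sensor_id": "SU-1", "behavior": "fall", "seated": False}))  # True
```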
 本明細書は、上記のように様々な態様の技術を開示しているが、そのうち主な技術を以下に纏める。 Although the present specification discloses various aspects of the technology as described above, the main technologies are summarized below.
 一態様にかかる被監視者監視支援システムは、監視対象である被監視者に対応して設けられ、前記被監視者に関わる所定の行動を検知するセンサ装置、前記センサ装置と通信可能に接続され前記センサ装置から受信した検知結果を管理する中央処理装置、および、前記中央処理装置と通信可能に接続され前記中央処理装置を介して前記検知結果を受信して表示する端末装置を備え、前記被監視者の監視を支援するための被監視者監視システムであって、前記被監視者の着座を検知する着座検知部と、前記検知結果を、前記着座検知部の着座検知結果に基づいて異なる処理方法で処理する検知結果処理部とを備える。好ましくは、上述の被監視者監視支援システムにおいて、前記センサ装置は、前記被監視者の形状を写した画像を生成し、前記生成した画像に基づいて前記所定の行動を検知する。 A monitored person monitoring support system according to one aspect is a monitored person monitoring system for supporting monitoring of a monitored person, including a sensor device that is provided for the monitored person to be monitored and detects a predetermined behavior related to the monitored person, a central processing device that is communicably connected to the sensor device and manages a detection result received from the sensor device, and a terminal device that is communicably connected to the central processing device and receives and displays the detection result via the central processing device, the system further including a seating detection unit that detects seating of the monitored person and a detection result processing unit that processes the detection result with different processing methods based on a seating detection result of the seating detection unit. Preferably, in the above monitored person monitoring support system, the sensor device generates an image capturing the shape of the monitored person and detects the predetermined behavior based on the generated image.
 着座検知部は、被監視者に対する着座の有無を検知する。このため、センサ装置が前記所定の行動を検知した際に、着座検知部が被監視者の着座を検知していれば、被監視者Obが着座しているので、前記所定の行動の検知結果は、誤検知の可能性が高く、前記検知結果を前記着座検知部の着座検知結果で修正できる。上記被監視者監視支援システムは、前記検知結果を、前記着座検知部の着座検知結果に基づいて処理するので、前記検知結果の信頼性を高めることができ、したがって、より適切に発報できる。 The seating detection unit detects whether or not the monitored person is seated. Therefore, if the seating detection unit detects seating of the monitored person at the time the sensor device detects the predetermined behavior, the monitored person Ob is seated, so the detection of the predetermined behavior is highly likely to be a false detection, and the detection result can be corrected by the seating detection result of the seating detection unit. Because the above monitored person monitoring support system processes the detection result based on the seating detection result of the seating detection unit, the reliability of the detection result can be enhanced, and therefore notification can be issued more appropriately.
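 Expressed as a single rule, a sketch under the assumption that reliability is reported as a simple label (the function name and labels are illustrative only):

```python
# Hypothetical sketch of the correction rule: a behavior detection that coincides with
# a positive seating detection is treated as less reliable (a suspected false detection).
def detection_reliability(behavior_detected: bool, seated: bool) -> str:
    if not behavior_detected:
        return "none"
    return "low (suspected false detection)" if seated else "high"

print(detection_reliability(True, True))   # low (suspected false detection)
print(detection_reliability(True, False))  # high
```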
 他の一態様では、上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を前記端末装置に表示させる第1処理方法で処理し、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記端末装置に表示させない第2処理方法で処理する。 In another aspect, in the above monitored person monitoring support system, the detection result processing unit processes the detection result with a first processing method, which displays the detection result on the terminal device, when the seating detection result is not seating of the monitored person, and processes the detection result with a second processing method, which does not display the detection result on the terminal device, when the seating detection result is seating of the monitored person.
 このような被監視者監視支援システムは、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を前記端末装置に表示させ、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記端末装置に表示させないので、前記検知結果の信頼性を高めることができ、したがって、より適切に発報できる。 Because such a monitored person monitoring support system displays the detection result on the terminal device when the seating detection result is not seating of the monitored person, and does not display the detection result on the terminal device when the seating detection result is seating of the monitored person, the reliability of the detection result can be enhanced, and therefore notification can be issued more appropriately.
 他の一態様では、上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記センサ装置に備えられ、前記第2処理方法は、前記検知結果を前記管理サーバ装置に送信しない処理方法である。 In another aspect, in the above monitored person monitoring support system, the detection result processing unit is provided in the sensor device, and the second processing method is a processing method that does not transmit the detection result to the management server device.
 これによれば、前記センサ装置が前記検知結果を前記管理サーバ装置に送信しない処理方法で前記第2処理方法を実現した被監視者監視支援システムが提供できる。 According to this, it is possible to provide a monitored person monitoring support system in which the second processing method is realized by the processing method in which the sensor device does not transmit the detection result to the management server device.
 他の一態様では、上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記管理サーバ装置に備えられ、前記第2処理方法は、前記検知結果を前記端末装置に送信しない処理方法である。 In another aspect, in the above monitored person monitoring support system, the detection result processing unit is provided in the management server device, and the second processing method is a processing method that does not transmit the detection result to the terminal device.
 これによれば、前記管理サーバ装置が前記検知結果を前記端末装置に送信しない処理方法で前記第2処理方法を実現した被監視者監視支援システムが提供できる。 According to this, it is possible to provide a monitored person monitoring support system in which the second processing method is realized by the processing method in which the management server device does not transmit the detection result to the terminal device.
 他の一態様では、上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記端末装置に備えられ、前記第2処理方法は、前記検知結果を表示しない処理方法である。 In another aspect, in the above-described monitored person monitoring support system, the detection result processing unit is provided in the terminal device, and the second processing method is a processing method which does not display the detection result.
 これによれば、前記端末装置が前記検知結果を表示しない処理方法で前記第2処理方法を実現した被監視者監視支援システムが提供できる。 According to this, it is possible to provide a monitored person monitoring support system in which the second processing method is realized by the processing method in which the terminal device does not display the detection result.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を所定の第1表示態様で前記端末装置に表示させる第3処理方法で処理し、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記第1表示態様と異なる所定の第2表示態様で前記端末装置に表示させる第4処理方法で処理する。好ましくは、上述の被監視者監視支援システムにおいて、前記第1および第2表示態様は、前記検知結果に対する信頼性の度合いを表す表示態様であり、前記第2表示態様は、前記第1表示態様より信頼性の度合いが低いことを表す表示態様である。 In another aspect, in the above monitored person monitoring support systems, the detection result processing unit processes the detection result with a third processing method, which displays the detection result on the terminal device in a predetermined first display mode, when the seating detection result is not seating of the monitored person, and processes the detection result with a fourth processing method, which displays the detection result on the terminal device in a predetermined second display mode different from the first display mode, when the seating detection result is seating of the monitored person. Preferably, in the above monitored person monitoring support system, the first and second display modes are display modes representing the degree of reliability of the detection result, and the second display mode is a display mode representing a lower degree of reliability than the first display mode.
 センサ装置が前記所定の行動を検知した際に、着座検知部が被監視者の着座を検知していれば、被監視者が着座しているので、前記所定の行動の検知結果は、着座検知部が被監視者の着座を検知していない場合に較べて信頼性が低く、前記検知結果を着座検知部の着座検知結果で修正できる。上記被監視者監視支援システムは、前記検知結果を、前記着座検知部の着座検知結果に基づいてその表示態様を変えるので、その表示態様から、前記着座検知部の着座検知結果に基づく前記検知結果の信頼性を認識でき、したがって、前記検知結果の信頼性を伴ってより適切に発報できる。 If the seating detection unit detects seating of the monitored person at the time the sensor device detects the predetermined behavior, the monitored person is seated, so the detection result of the predetermined behavior is less reliable than when the seating detection unit does not detect seating of the monitored person, and the detection result can be corrected by the seating detection result of the seating detection unit. Because the above monitored person monitoring support system changes the display mode of the detection result based on the seating detection result of the seating detection unit, the reliability of the detection result based on the seating detection result of the seating detection unit can be recognized from the display mode, and therefore notification can be issued more appropriately together with the reliability of the detection result.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を前記管理サーバ装置に記憶して管理させる第5処理方法で処理し、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記管理サーバ装置に記憶させずに管理させない第6処理方法で処理する。好ましくは、上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記センサ装置に備えられ、前記第6処理方法は、前記検知結果を前記管理サーバ装置に送信しない処理方法である。好ましくは、上述の被監視者監視支援システムにおいて、前記検知結果処理部は、前記管理サーバ装置に備えられ、前記第6処理方法は、前記検知結果を記憶せずに管理しない処理方法である。 In another aspect, in the above monitored person monitoring support systems, the detection result processing unit processes the detection result with a fifth processing method, which causes the management server device to store and manage the detection result, when the seating detection result is not seating of the monitored person, and processes the detection result with a sixth processing method, which does not cause the management server device to store or manage the detection result, when the seating detection result is seating of the monitored person. Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the sensor device, and the sixth processing method is a processing method that does not transmit the detection result to the management server device. Preferably, in the above monitored person monitoring support system, the detection result processing unit is provided in the management server device, and the sixth processing method is a processing method that neither stores nor manages the detection result.
 このような被監視者監視支援システムは、前記検知結果を、前記着座検知部の着座検知結果に基づいてその記憶の実行か不実行を制御するので、より信頼性の高い検知結果のみを記憶して管理できる。 Because such a monitored person monitoring support system controls, based on the seating detection result of the seating detection unit, whether or not the detection result is stored, only the more reliable detection results can be stored and managed.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記所定の行動は、予め設定され、前記被監視者が寝具から離れた離床、前記被監視者が寝具から落ちた転落、および、前記被監視者が転倒した転倒のうちの少なくとも1つを含む。 In another aspect, in the above monitored person monitoring support systems, the predetermined behavior is set in advance and includes at least one of leaving bed, in which the monitored person leaves the bedding, falling from bed, in which the monitored person falls from the bedding, and falling down, in which the monitored person falls over.
 看護レベルや介護レベルによって、被監視者が離床した場合、例えば排泄、洗面および身支度等の介助目的で、また例えばその転倒の防止目的で、監視者は、その発報を受けると、被監視者の処に赴く。被監視者が転落や転倒した場合、監視者は、被監視者の救護目的で、その発報を受けると、被監視者の処に赴く。したがって、前記所定の行動が「離床」、「転落」および「転倒」のうちの少なくとも1つを含む場合、より適切に発報されることによって、誤報による無駄な労力が低減され、監視者の負担を効果的に軽減できる。 Depending on the nursing level or care level, when the monitored person leaves the bed, the observer, on receiving the notification, goes to the monitored person, for example to assist with excretion, washing or dressing, or to prevent a fall. When the monitored person falls from the bed or falls down, the observer, on receiving the notification, goes to the monitored person in order to give aid. Therefore, when the predetermined behavior includes at least one of "leaving bed", "falling from bed" and "falling down", issuing notifications more appropriately reduces wasted effort caused by false alarms and effectively lightens the observer's burden.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記着座検知部は、接触式の着座センサを備える。 In another aspect, in the above-described monitored person monitoring support system, the seating detection unit includes a contact-type seating sensor.
 これによれば、前記着座検知部が接触式の着座センサを備える被監視者監視支援システムが提供できる。 According to this, it is possible to provide a monitored person monitoring support system in which the seating detection unit includes a contact type seating sensor.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記着座検知部は、非接触式の着座センサを備える。 In another aspect, in the above-described monitored person monitoring support system, the seating detection unit includes a non-contact seating sensor.
 これによれば、前記着座検知部が非接触式の着座センサを備える被監視者監視支援システムが提供できる。 According to this, it is possible to provide a monitored person monitoring support system in which the seating detection unit includes a noncontact seating sensor.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記センサ装置は、画像を生成する撮像部をさらに備え、前記端末装置は、前記センサ装置から前記管理サーバ装置を介して受信した前記画像を表示する。 In another aspect, in the above-described monitored person monitoring support system, the sensor device further includes an imaging unit configured to generate an image, and the terminal device is received from the sensor device via the management server device. Display the image.
 このような被監視者監視支援システムは、画像によって端末装置から被監視者の様子を認識できる。 Such a monitored person monitoring support system can recognize the situation of the monitored person from the terminal device by the image.
 他の一態様では、これら上述の被監視者監視支援システムにおいて、前記端末装置は、携帯端末装置である。 In another aspect, in the above-mentioned person-to-be-monitored person monitoring support system described above, the terminal device is a portable terminal device.
 このような被監視者監視支援システムでは、監視者は、端末装置を携帯できる。 In such a monitored person monitoring support system, the supervisor can carry the terminal device.
 他の一態様にかかる被監視者監視支援方法は、監視対象である被監視者に対応して設けられ、前記被監視者に関わる所定の行動を検知するセンサ装置、前記センサ装置と通信可能に接続され前記センサ装置から受信した検知結果を管理する中央処理装置、および、前記中央処理装置と通信可能に接続され前記中央処理装置を介して前記検知結果を受信して表示する端末装置を備え、前記被監視者の監視を支援するための被監視者監視システムの被監視者監視支援方法であって、前記被監視者の着座を検知する着座検知工程と、前記検知結果を、前記着座検知工程の着座検知結果に基づいて異なる処理方法で処理する検知結果処理工程とを備える。 A monitored person monitoring support method according to another aspect is a monitored person monitoring support method of a monitored person monitoring system for supporting monitoring of a monitored person, the system including a sensor device that is provided for the monitored person to be monitored and detects a predetermined behavior related to the monitored person, a central processing device that is communicably connected to the sensor device and manages a detection result received from the sensor device, and a terminal device that is communicably connected to the central processing device and receives and displays the detection result via the central processing device, the method including a seating detection step of detecting seating of the monitored person and a detection result processing step of processing the detection result with different processing methods based on a seating detection result of the seating detection step.
 上記被監視者監視支援方法は、前記検知結果を、前記着座検知工程の着座検知結果に基づいて処理するので、前記検知結果の信頼性を高めることができ、したがって、より適切に発報できる。 Since the above-mentioned person-to-be-monitored monitoring support method processes the detection result based on the seating detection result of the seating detection process, the reliability of the detection result can be enhanced, and therefore, notification can be issued more appropriately.
 この出願は、2017年10月11日に出願された日本国特許出願特願2017-197703を基礎とするものであり、その内容は、本願に含まれるものである。 This application is based on Japanese Patent Application No. 2017-197703 filed on Oct. 11, 2017, the contents of which are included in the present application.
 本発明の実施形態が詳細に図示され、かつ、説明されたが、それは単なる図例及び実例であって限定ではない。本発明の範囲は、添付されたクレームの文言によって解釈されるべきである。 Although embodiments of the present invention have been illustrated and described in detail, it is merely illustrative and not restrictive. The scope of the present invention should be interpreted by the terms of the appended claims.
 本発明を表現するために、上述において図面を参照しながら実施形態を通して本発明を適切且つ十分に説明したが、当業者であれば上述の実施形態を変更および/または改良することは容易に為し得ることであると認識すべきである。したがって、当業者が実施する変更形態または改良形態が、請求の範囲に記載された請求項の権利範囲を離脱するレベルのものでない限り、当該変更形態または当該改良形態は、当該請求項の権利範囲に包括されると解釈される。 In order to express the present invention, the present invention has been described above appropriately and sufficiently through the embodiments with reference to the drawings, but it should be recognized that those skilled in the art can easily modify and/or improve the embodiments described above. Therefore, unless a modification or improvement implemented by a person skilled in the art departs from the scope of the rights of the claims set forth in the appended claims, the modification or improvement is construed as being encompassed by the scope of those claims.
 本発明によれば、被監視者の監視を支援するための被監視者監視支援システムおよび被監視者監視支援方法が提供できる。 According to the present invention, it is possible to provide a monitored person monitoring support system and a monitored person monitoring support method for supporting monitoring of a monitored person.

Claims (13)

  1.  監視対象である被監視者に対応して設けられ、前記被監視者に関わる所定の行動を検知するセンサ装置、前記センサ装置と通信可能に接続され前記センサ装置から受信した検知結果を管理する中央処理装置、および、前記中央処理装置と通信可能に接続され前記中央処理装置を介して前記検知結果を受信して表示する端末装置を備え、前記被監視者の監視を支援するための被監視者監視システムであって、
     前記被監視者の着座を検知する着座検知部と、
     前記検知結果を、前記着座検知部の着座検知結果に基づいて異なる処理方法で処理する検知結果処理部とを備える、
     被監視者監視支援システム。
    A monitored person monitoring system for supporting monitoring of a monitored person to be monitored, comprising a sensor device that is provided for the monitored person and detects a predetermined action related to the monitored person, a central processing device that is communicably connected to the sensor device and manages a detection result received from the sensor device, and a terminal device that is communicably connected to the central processing device and receives and displays the detection result via the central processing device, the system further comprising:
    A seating detection unit that detects seating of the person to be monitored;
    And a detection result processing unit configured to process the detection result by a different processing method based on the seating detection result of the seating detection unit.
    Monitored person monitoring support system.
  2.  前記検知結果処理部は、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を前記端末装置に表示させる第1処理方法で処理し、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記端末装置に表示させない第2処理方法で処理する、
     請求項1に記載の被監視者監視支援システム。
    The detection result processing unit processes the detection result with a first processing method, which displays the detection result on the terminal device, when the seating detection result is not seating of the monitored person, and processes the detection result with a second processing method, which does not display the detection result on the terminal device, when the seating detection result is seating of the monitored person,
    The monitored person monitoring support system according to claim 1.
  3.  前記検知結果処理部は、前記センサ装置に備えられ、
     前記第2処理方法は、前記検知結果を前記管理サーバ装置に送信しない処理方法である、
     請求項2に記載の被監視者監視支援システム。
    The detection result processing unit is provided in the sensor device.
    The second processing method is a processing method which does not transmit the detection result to the management server device.
    The monitored person monitoring support system according to claim 2.
  4.  前記検知結果処理部は、前記管理サーバ装置に備えられ、
     前記第2処理方法は、前記検知結果を前記端末装置に送信しない処理方法である、
     請求項2に記載の被監視者監視支援システム。
    The detection result processing unit is included in the management server device.
    The second processing method is a processing method which does not transmit the detection result to the terminal device.
    The monitored person monitoring support system according to claim 2.
  5.  前記検知結果処理部は、前記端末装置に備えられ、
     前記第2処理方法は、前記検知結果を表示しない処理方法である、
     請求項2に記載の被監視者監視支援システム。
    The detection result processing unit is included in the terminal device.
    The second processing method is a processing method which does not display the detection result.
    The monitored person monitoring support system according to claim 2.
  6.  前記検知結果処理部は、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を所定の第1表示態様で前記端末装置に表示させる第3処理方法で処理し、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記第1表示態様と異なる所定の第2表示態様で前記端末装置に表示させる第4処理方法で処理する、
     請求項1に記載の被監視者監視支援システム。
    The detection result processing unit processes the detection result with a third processing method, which displays the detection result on the terminal device in a predetermined first display mode, when the seating detection result is not seating of the monitored person, and processes the detection result with a fourth processing method, which displays the detection result on the terminal device in a predetermined second display mode different from the first display mode, when the seating detection result is seating of the monitored person,
    The monitored person monitoring support system according to claim 1.
  7.  前記検知結果処理部は、前記着座検知結果が前記被監視者の着座ではない場合に前記検知結果を前記管理サーバ装置に記憶して管理させる第5処理方法で処理し、前記着座検知結果が前記被監視者の着座である場合に前記検知結果を前記管理サーバ装置に記憶させずに管理させない第6処理方法で処理する、
     請求項1に記載の被監視者監視支援システム。
    The detection result processing unit processes the detection result with a fifth processing method, which causes the management server device to store and manage the detection result, when the seating detection result is not seating of the monitored person, and processes the detection result with a sixth processing method, which does not cause the management server device to store or manage the detection result, when the seating detection result is seating of the monitored person,
    The monitored person monitoring support system according to claim 1.
  8.  前記所定の行動は、予め設定され、前記被監視者が寝具から離れた離床、前記被監視者が寝具から落ちた転落、および、前記被監視者が転倒した転倒のうちの少なくとも1つを含む、
     請求項1ないし請求項7のいずれか1項に記載の被監視者監視支援システム。
    The predetermined action is set in advance and includes at least one of leaving bed, in which the monitored person leaves the bedding, falling from bed, in which the monitored person falls from the bedding, and falling down, in which the monitored person falls over,
    The monitored person monitoring support system according to any one of claims 1 to 7.
  9.  前記着座検知部は、接触式の着座センサを備える、
     請求項1ないし請求項8のいずれか1項に記載の被監視者監視支援システム。
    The seating detection unit includes a contact type seating sensor.
    The monitored person monitoring support system according to any one of claims 1 to 8.
  10.  前記着座検知部は、非接触式の着座センサを備える、
     請求項1ないし請求項8のいずれか1項に記載の被監視者監視支援システム。
    The seating detection unit includes a noncontact seating sensor.
    The monitored person monitoring support system according to any one of claims 1 to 8.
  11.  前記センサ装置は、画像を生成する撮像部をさらに備え、
     前記端末装置は、前記センサ装置から前記管理サーバ装置を介して受信した前記画像を表示する、
     請求項1ないし請求項10のいずれか1項に記載の被監視者監視支援システム。
    The sensor device further includes an imaging unit that generates an image;
    The terminal device displays the image received from the sensor device via the management server device.
    The monitored person monitoring support system according to any one of claims 1 to 10.
  12.  前記端末装置は、携帯端末装置である、
     請求項1ないし請求項11のいずれか1項に記載の被監視者監視支援システム。
    The terminal device is a portable terminal device.
    The monitored person monitoring support system according to any one of claims 1 to 11.
  13.  監視対象である被監視者に対応して設けられ、前記被監視者に関わる所定の行動を検知するセンサ装置、前記センサ装置と通信可能に接続され前記センサ装置から受信した検知結果を管理する中央処理装置、および、前記中央処理装置と通信可能に接続され前記中央処理装置を介して前記検知結果を受信して表示する端末装置を備え、前記被監視者の監視を支援するための被監視者監視システムの被監視者監視支援方法であって、
     前記被監視者の着座を検知する着座検知工程と、
     前記検知結果を、前記着座検知工程の着座検知結果に基づいて異なる処理方法で処理する検知結果処理工程とを備える、
     被監視者監視支援方法。
     
    A monitored person monitoring support method for a monitored person monitoring system for supporting monitoring of a monitored person to be monitored, the system comprising a sensor device that is provided for the monitored person and detects a predetermined action related to the monitored person, a central processing device that is communicably connected to the sensor device and manages a detection result received from the sensor device, and a terminal device that is communicably connected to the central processing device and receives and displays the detection result via the central processing device, the method comprising:
    A seating detection process for sensing the seating of the person to be monitored;
    And a detection result processing step of processing the detection result by a different processing method based on the seating detection result of the seating detection step.
    Monitored person monitoring support method.
PCT/JP2018/033594 2017-10-11 2018-09-11 Monitored person monitoring assistance system and monitored person monitoring assistance method WO2019073735A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019547948A JP7137155B2 (en) 2017-10-11 2018-09-11 Monitored Person Monitoring Support System, Monitored Person Monitoring Support Method and Program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017197703 2017-10-11
JP2017-197703 2017-10-11

Publications (1)

Publication Number Publication Date
WO2019073735A1 true WO2019073735A1 (en) 2019-04-18

Family

ID=66101544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/033594 WO2019073735A1 (en) 2017-10-11 2018-09-11 Monitored person monitoring assistance system and monitored person monitoring assistance method

Country Status (2)

Country Link
JP (1) JP7137155B2 (en)
WO (1) WO2019073735A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023105835A1 (en) * 2021-12-07 2023-06-15 パラマウントベッド株式会社 Information processing device and information processing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000279450A (en) * 1999-03-31 2000-10-10 Matsushita Electric Works Ltd Wheelchair provided with sitting sensor
JP2001057996A (en) * 1999-05-06 2001-03-06 Kawasaki Heavy Ind Ltd Nursing aid device
JP2011086286A (en) * 2009-09-17 2011-04-28 Shimizu Corp Watching system on bed and inside room


Also Published As

Publication number Publication date
JP7137155B2 (en) 2022-09-14
JPWO2019073735A1 (en) 2020-12-17

Similar Documents

Publication Publication Date Title
WO2017082037A1 (en) Central processing device and method for person monitoring system, and person monitoring system
WO2017209094A1 (en) Monitoring system
WO2017104521A1 (en) Monitored person monitoring device, method thereof, and system thereof
JP7137155B2 (en) Monitored Person Monitoring Support System, Monitored Person Monitoring Support Method and Program
JP7137154B2 (en) Behavior detection device and method, and monitored person monitoring support system
JP6504255B2 (en) Care support system and care support method
WO2017179605A1 (en) Watching system and management server
JP6740633B2 (en) Central processing unit and central processing method of monitored person monitoring system, and monitored person monitoring system
JP6895090B2 (en) Detection system and display processing method of detection system
JP6292363B2 (en) Terminal device, terminal device display method, and monitored person monitoring system
JP7180601B2 (en) SLEEP STATE DETECTION DEVICE AND METHOD, AND MONITORED PERSON MONITORING SUPPORT SYSTEM
JP2017151676A (en) Monitored person monitor device, method of monitoring monitored person and program thereof
JP7234931B2 (en) Sensor Device of Monitored Person Monitoring Support System, Processing Method of Sensor Device, and Monitored Person Monitoring Support System
JP7264066B2 (en) Monitored Person Monitoring Support System
JP7425413B2 (en) Monitored person monitoring support device, monitored person monitoring support method, monitored person monitoring support system, and monitored person monitoring support server device
JP7247898B2 (en) Monitored person monitoring system
JP7003921B2 (en) The setting change judgment device of the monitored person monitoring system and the setting change judgment method of the monitored person monitoring system.
JP6245415B1 (en) Terminal device, operation control method of terminal device, and monitored person monitoring system
JP6172416B1 (en) Nurse call system
JP2020188487A (en) Central processing device, monitored person monitoring method, and monitored person monitoring system
JP2019212172A (en) Monitored person monitoring support system and method, and central management device
JP2023107006A (en) nurse call system
WO2017130684A1 (en) Monitored-person monitoring device, method thereof, and system thereof
JPWO2018230103A1 (en) Monitored person monitoring apparatus and method, and monitored person monitoring support system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18867038

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019547948

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18867038

Country of ref document: EP

Kind code of ref document: A1