WO2017141629A1 - Terminal device, display method for terminal device, sensor device, and person-to-be-monitored monitoring system - Google Patents


Info

Publication number
WO2017141629A1
WO2017141629A1 (PCT/JP2017/002079)
Authority
WO
WIPO (PCT)
Prior art keywords
person
image
display
unit
data
Prior art date
Application number
PCT/JP2017/002079
Other languages
French (fr)
Japanese (ja)
Inventor
山下 雅宣
篤広 野田
琢哉 村田
雅史 西角
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority date
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2018500001A (granted as JP6804510B2)
Publication of WO2017141629A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G7/00 Beds specially adapted for nursing; Devices for lifting patients or disabled persons
    • A61G7/05 Parts, details or accessories of beds
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B25/04 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium, using a single signalling line, e.g. in a closed loop

Definitions

  • The present invention relates to a terminal device suitable for use in a monitored person monitoring system that monitors a person to be monitored, a display control method for the terminal device, a sensor device suitable for use in the monitored person monitoring system, and the monitored person monitoring system itself.
  • Japan is an aging society; more specifically, owing to the rise in living standards that accompanied post-war high economic growth, together with improvements in sanitation and medical care, it has become a super-aging society whose aging rate, the ratio of the population aged 65 and over to the total population, exceeds 21%.
  • For example, the population aged 65 and over was about 25.56 million against a total population of about 127.65 million, whereas in 2020 the total population is predicted to be about 124.11 million and the elderly population about 34.56 million.
  • Persons who need nursing or care because of illness, injury, or old age are proportionally more numerous in an aging society than in a society that is not aging. Accordingly, monitored person monitoring techniques for monitoring a person to be monitored, such as a care recipient, have been researched and developed.
  • The nurse call system disclosed in Patent Document 1 includes: a nurse call slave unit installed at a bed, with which a patient calls a nurse; a nurse call master unit installed at a nurse station, which answers calls from the nurse call slave unit; a camera that images the patient on the bed from above; and state judging means that judges, from the image captured by the camera, the occurrence of at least one of a state in which the patient wakes up and a state in which the patient leaves the bed, and outputs a caution state occurrence signal. The nurse call master unit has notifying means that performs a notification operation upon receiving the caution state occurrence signal. The nurse call system further includes a portable terminal carried by a nurse for responding to calls from the nurse call slave unit, and communication control means that transmits the captured image of the camera to the portable terminal upon receiving the caution state occurrence signal.
  • When the captured image is displayed on a terminal device in this way, a monitor such as a nurse can visually grasp the status of a monitored person such as a care recipient, which is convenient. However, if the monitored person appears small in the image, or if the display has a low resolution, it is difficult for the monitor to find the monitored person in the image. In particular, when the terminal device is a mobile terminal device such as a tablet or a smartphone, its screen is relatively small, which makes it even harder for the monitor to find the monitored person in the image.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide a terminal device, a display method for the terminal device, a sensor device, and a monitored person monitoring system with which a monitored person can easily be found in an image.
  • The terminal device and its display method receive a monitoring information communication signal containing image data of an image of the monitored person and person position data relating to the position, on the image, of a person appearing in the image, and display on the display unit the image of the image data together with a person position indicator representing the position of the person based on the person position data.
  • The sensor device generates person position data relating to the position, on the image, of a person captured in the image taken by its imaging unit, and transmits a monitoring information communication signal containing the person position data.
  • The monitored person monitoring system includes such a sensor device and such a terminal device.
  • FIG. 1 is a diagram showing an example of a monitoring information screen in which a region of the image is enlarged.
  • The monitored person monitoring system in the embodiment is a system that monitors a monitored person (watched person) Ob, that is, a monitoring target (watching target) to be monitored (watched over).
  • The monitored person monitoring system includes a sensor device and a terminal device communicably connected to the sensor device; it detects a predetermined action of the monitored person based on an image captured by an imaging unit, and notifies the terminal device of the detection result.
  • The sensor device in such a monitored person monitoring system includes: a first communication unit that performs communication; an imaging unit that performs imaging; and a person position processing unit that generates person position data relating to the position, in the image, of a person captured in the image taken by the imaging unit. The first communication unit transmits a monitoring information communication signal containing the person position data generated by the person position processing unit.
  • The terminal device in the monitored person monitoring system includes: a second communication unit that receives the monitoring information communication signal; a display unit that performs display; and a second monitoring processing unit having a display processing unit that displays on the display unit the detection result and the image of the image data contained in the monitoring information communication signal received by the second communication unit, together with a person position indicator representing the position of the person based on the person position data stored in the monitoring information communication signal.
  • The monitoring information communication signal may be transmitted directly from the sensor device to the terminal device, or indirectly from the sensor device to the terminal device via another device (for example, the management server device described below).
  • Although the terminal device may be a single type of apparatus, in the following description the terminal device comprises two types of apparatus: a fixed terminal device and a portable terminal device. The main difference between them is that the fixed terminal device is operated at a fixed location, whereas the portable terminal device is carried and operated by a monitor (user) such as a nurse or a caregiver. Since the fixed terminal device and the portable terminal device are substantially the same, the portable terminal device will mainly be described below.
  • FIG. 1 is a diagram illustrating a configuration of a monitored person monitoring system according to the embodiment.
  • FIG. 2 is a diagram illustrating a configuration of a sensor device in the monitored person monitoring system according to the embodiment.
  • FIG. 3 is a diagram for explaining the person position data and person position indicator in the first mode.
  • FIG. 4 is a diagram for explaining the person position data (person area data) and person position indicator in the second mode.
  • FIG. 5 is a diagram for explaining the person position data (person area data) and person position indicator in the third mode.
  • FIG. 6 is a diagram for explaining the person position data (person area data) and person position indicator in the fourth mode.
  • FIG. 7 is a diagram for explaining the person position data (person area data) and person position indicator in the fifth mode.
  • FIG. 8 is a diagram illustrating a configuration of the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • As shown in FIG. 1, the monitored person monitoring system MS includes, for example, one or more sensor devices SU (SU-1 to SU-4), a management server device SV, a fixed terminal device SP, one or more portable terminal devices TA (TA-1, TA-2), and a private branch exchange (PBX) CX, which are communicably connected via a wired or wireless network (communication line) NW such as a LAN (Local Area Network).
  • The network NW may include relay devices, such as repeaters, bridges, and routers, that relay communication signals.
  • In the present embodiment, the plural sensor devices SU-1 to SU-4, the management server device SV, the fixed terminal device SP, the plural portable terminal devices TA-1 and TA-2, and the private branch exchange CX are connected by a wired/wireless LAN (for example, a LAN conforming to the IEEE 802.11 standard) NW that includes a line concentrator (hub) LS comprising an L2 switch and an access point AP. The sensor devices SU-1 to SU-4, the management server device SV, the fixed terminal device SP, and the private branch exchange CX are connected to the line concentrator LS, while the portable terminal devices TA-1 and TA-2 are connected to the line concentrator LS via the access point AP.
  • The network NW constitutes a so-called intranet using the Internet protocol suite, such as TCP (Transmission Control Protocol) and IP (Internet Protocol).
  • the monitored person monitoring system MS is arranged at an appropriate place according to the monitored person Ob.
  • The monitored person (person to be watched over) Ob is, for example, a person who needs nursing because of illness or injury, a person who needs care because of reduced physical ability, or a single person living alone. In particular, from the viewpoint of enabling early detection and early handling, the monitored person Ob is preferably a person for whom detection is required when a predetermined inconvenient event, such as an abnormal state, occurs.
  • the monitored person monitoring system MS is suitably arranged in a building such as a hospital, a welfare facility for the elderly, and a dwelling unit according to the type of the monitored person Ob.
  • the monitored person monitoring system MS is disposed in a building of a care facility that includes a plurality of rooms RM in which a plurality of monitored persons Ob live and a plurality of rooms such as a nurse station.
  • The private branch exchange (line switching unit) CX is connected to the network NW and is a device that controls extension calls, such as outgoing calls, incoming calls, and conversations between the portable terminal devices TA, thereby providing extension telephony between the portable terminal devices TA. It is also connected, via a public telephone network PN such as a fixed telephone network or a mobile telephone network, to an external telephone TL such as a fixed telephone or a mobile telephone, and controls outside line calls, such as outgoing calls, incoming calls, and conversations between the external telephone TL and the portable terminal devices TA, thereby providing outside line telephony between them. The private branch exchange CX is, for example, a digital exchange or an IP-PBX (Internet Protocol Private Branch eXchange).
  • The sensor device SU has a communication function for communicating with the other devices SV, SP, and TA via the network NW; it detects a predetermined action of the monitored person Ob and notifies the management server device SV of the detection result. When transmitting an image to the terminal devices SP and TA, the sensor device SU generates person position data relating to the position, on the image, of the person in the image, and transmits it to the management server device SV together with the image.
  • Such a sensor device SU includes an imaging unit 11, a sensor-side sound input/output unit (SU sound input/output unit) 12, a nurse call reception operation unit 13, a sensor-side control processing unit (SU control processing unit) 14, a sensor-side communication interface unit (SU communication IF unit) 15, and a sensor-side storage unit (SU storage unit) 16.
  • the imaging unit 11 is an apparatus that is connected to the SU control processing unit 14 and performs imaging in accordance with the control of the SU control processing unit 14 to generate an image (image data).
  • the image includes a still image (still image data) and a moving image (moving image data).
  • The imaging unit 11 is arranged so that it can monitor the space where the monitored person Ob, the monitoring target, is located (the location space; in the example shown in FIG. 1, the room RM of the location). It images that space as the imaging target from above, generates an image (image data) overlooking the imaging target, and outputs the imaging target image (target image) to the SU control processing unit 14.
  • Preferably, the imaging unit 11 is arranged so that it can image the imaging target from directly above a preset planned head position (usually the position where a pillow is placed) at which the head of the monitored person Ob is expected to lie in the bedding (for example, a bed) on which the monitored person Ob lies.
  • The sensor device SU thus uses the imaging unit 11 to acquire an image of the monitored person Ob captured from above, preferably an image captured from directly above the planned head position.
  • Such an imaging unit 11 may be a device that generates an image of visible light, but in the present embodiment, it is a device that generates an infrared image so that the monitored person Ob can be monitored even in a relatively dark place.
  • The imaging unit 11 is, for example, a digital infrared camera including: an imaging optical system that forms an infrared optical image of the imaging target on a predetermined imaging surface; an image sensor whose light receiving surface is aligned with the imaging surface and which converts the infrared optical image of the imaging target into an electrical signal; and an image processing unit that generates, by image processing on the output of the image sensor, image data representing an infrared image of the imaging target.
  • The imaging optical system of the imaging unit 11 is preferably a wide-angle optical system (a so-called wide-angle lens, including a fisheye lens) having an angle of view capable of imaging the entire living room RM in which the imaging unit 11 is arranged.
  • The SU sound input/output unit 12 is a circuit that inputs and outputs sound. That is, it is a circuit that is connected to the SU control processing unit 14 and that, under the control of the SU control processing unit 14, generates and outputs sound corresponding to an electrical signal representing sound, and also acquires external sound and inputs it into the sensor device SU.
  • the SU sound input / output unit 12 includes, for example, a speaker that converts a sound electrical signal (sound data) into a sound mechanical vibration signal (acoustic signal), and a microphone that converts a sound mechanical vibration signal in the audible region into an electrical signal. And so on.
  • The SU sound input/output unit 12 outputs an electrical signal representing external sound to the SU control processing unit 14, and converts an electrical signal input from the SU control processing unit 14 into a mechanical sound vibration signal, which it outputs as sound.
  • the nurse call reception operation unit 13 is connected to the SU control processing unit 14 and is a switch circuit such as a push button switch for inputting the nurse call to the sensor device SU.
  • the nurse call reception operation unit 13 may be connected to the SU control processing unit 14 by wire, or may be connected to the SU control processing unit 14 by short-range wireless communication such as Bluetooth (registered trademark) standard.
  • the SU communication IF unit 15 is a communication circuit that is connected to the SU control processing unit 14 and performs communication according to the control of the SU control processing unit 14.
  • The SU communication IF unit 15 generates, in accordance with the communication protocol used in the network NW of the monitored person monitoring system MS, a communication signal containing the data to be transferred that is input from the SU control processing unit 14, and transmits the generated communication signal to the other devices SV, SP, and TA via the network NW. It also receives communication signals from the other devices SV, SP, and TA via the network NW, extracts the data from the received communication signals, converts the extracted data into a format that the SU control processing unit 14 can process, and outputs it to the SU control processing unit 14.
  • the SU communication IF unit 15 includes, for example, a communication interface circuit that complies with the IEEE 802.11 standard or the like.
  • the SU storage unit 16 is a circuit that is connected to the SU control processing unit 14 and stores various predetermined programs and various predetermined data under the control of the SU control processing unit 14.
  • The various predetermined programs include, for example: an SU control program that controls each unit of the sensor device SU according to its function; a person position processing program that generates person position data relating to the position, in the image, of a person captured in the image taken by the imaging unit 11; an SU monitoring processing program that detects a predetermined action of the monitored person Ob and notifies the management server device SV of the detection result together with the image (image data) captured by the imaging unit 11 and the person position data generated by the person position processing program; a nurse call processing program for conducting a voice call with the terminal devices SP and TA using the SU sound input/output unit 12 and the like; and an SU streaming processing program that streams the video generated by the imaging unit 11 to the terminal device SP or TA that requested the video.
  • the SU storage unit 16 includes, for example, a ROM (Read Only Memory) that is a nonvolatile storage element, an EEPROM (Electrically Erasable Programmable Read Only Memory) that is a rewritable nonvolatile storage element, and the like.
  • The SU storage unit 16 further includes a RAM (Random Access Memory) that serves as the so-called working memory of the SU control processing unit 14 and stores data generated during execution of the predetermined programs.
  • The SU control processing unit 14 is a circuit for controlling each part of the sensor device SU according to its function; detecting a predetermined action of the monitored person Ob and notifying the management server device SV of the detection result together with the image (image data) and the person position data; accepting a nurse call and notifying the management server device SV thereof; conducting a voice call with the terminal devices SP and TA; and generating images including moving images and distributing the moving images to the terminal devices SP and TA.
  • the SU control processing unit 14 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits.
  • By executing the control processing programs, the SU control processing unit 14 functionally includes a sensor-side control unit (SU control unit) 141, a sensor-side monitoring processing unit (SU monitoring processing unit) 142, a person position processing unit 143, a nurse call processing unit 144, and a sensor-side streaming processing unit (SU streaming processing unit) 145.
  • the SU control unit 141 controls each part of the sensor device SU according to the function of each part, and governs overall control of the sensor device SU.
  • The SU monitoring processing unit 142 detects a preset predetermined action (state, situation) of the monitored person Ob based on the image, and notifies (reports, transmits to) the management server device SV of the detection result together with the person position data generated by the person position processing unit 143 as described later.
  • The predetermined actions include, for example, four actions: rising, in which the monitored person Ob gets up; bed-leaving, in which the monitored person Ob moves away from the bedding; falling, in which the monitored person Ob falls from the bedding; and tumbling, in which the monitored person Ob falls over.
  • In the present embodiment, the SU monitoring processing unit 142 detects the head of the monitored person Ob based on the target image captured by the imaging unit 11, and detects the monitored person's rising, bed-leaving, falling, and tumbling based on the change over time in the size of the detected head. More specifically, the bedding location area and first to third threshold values Th1 to Th3 are stored in advance in the SU storage unit 16 as part of the various predetermined data.
  • the first threshold Th1 is a value for discriminating between the size of the head in the lying posture and the size of the head in the sitting posture in the bedding location region.
  • the second threshold Th2 is a value for identifying whether or not the head is in the standing posture in the living room RM excluding the bedding location area.
  • the third threshold Th3 is a value for identifying whether the head is in the lying position in the living room RM excluding the bedding location area.
  • The SU monitoring processing unit 142 extracts the head region of the monitored person Ob from the extracted moving object region, for example by a circular or elliptical Hough transform, by pattern matching using a head model prepared in advance, or by a neural network trained for head detection.
  • The SU monitoring processing unit 142 then detects rising, bed-leaving, falling, and tumbling from the position and size of the extracted head. For example, when the position of the extracted head is within the bedding location area and the size of the extracted head changes over time from the size of the lying posture to the size of the sitting posture as judged using the first threshold Th1, the SU monitoring processing unit 142 determines that the monitored person is getting up and detects rising.
  • When the position of the extracted head changes over time from inside the bedding location area to outside it, and the size of the extracted head changes over time to the size of the standing posture as judged using the second threshold Th2, the SU monitoring processing unit 142 determines that the monitored person has left the bed and detects bed-leaving.
  • When the position of the extracted head changes over time from inside the bedding location area to outside it, and the size of the extracted head changes over time to the size of the recumbent posture as judged using the third threshold Th3, the SU monitoring processing unit 142 determines that the monitored person has fallen from the bedding and detects the fall.
  • When the position of the extracted head is in the living room RM excluding the bedding location area and the size of the extracted head changes over time to the size of the recumbent posture as judged using the third threshold Th3, the SU monitoring processing unit 142 determines that the monitored person has fallen over and detects the tumble.
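  • The head-size rules above can be sketched as a small rule-based classifier. This is an illustrative sketch rather than the patent's implementation: the threshold values, the bedding-area coordinates, and the idea of classifying a pair of (head position, head size) samples are assumptions made for the example; the description fixes only the comparisons against Th1 to Th3.

```python
# Illustrative rule-based sketch of the head-size detection logic above.
# Threshold values, the bedding area, and the sample format are hypothetical;
# only the comparisons against Th1, Th2, and Th3 come from the description.

TH1 = 40  # lying-vs-sitting head size inside the bedding location area
TH2 = 60  # standing-posture head size outside the bedding area (head nearer
          # the overhead camera, so it appears larger)
TH3 = 30  # recumbent-posture head size outside the bedding area (head on the
          # floor, far from the camera, so it appears smaller)
BED_AREA = (100, 50, 220, 200)  # bedding location area: x0, y0, x1, y1

def in_bed_area(pos):
    x, y = pos
    x0, y0, x1, y1 = BED_AREA
    return x0 <= x <= x1 and y0 <= y <= y1

def classify(prev, curr):
    """Classify the transition between two (head position, head size) samples."""
    (p_pos, p_size), (c_pos, c_size) = prev, curr
    # Rising: head stays in the bedding area and grows past Th1
    # (lying posture -> sitting posture, seen from directly above).
    if in_bed_area(p_pos) and in_bed_area(c_pos) and p_size <= TH1 < c_size:
        return "rising"
    # Bed-leaving: head moves out of the bedding area at standing-posture size.
    if in_bed_area(p_pos) and not in_bed_area(c_pos) and c_size >= TH2:
        return "leaving bed"
    # Fall from bedding: head moves out of the bedding area at recumbent size.
    if in_bed_area(p_pos) and not in_bed_area(c_pos) and c_size <= TH3:
        return "fall from bedding"
    # Tumble: head already outside the bedding area shrinks to recumbent size.
    if not in_bed_area(p_pos) and not in_bed_area(c_pos) and p_size > TH3 >= c_size:
        return "tumble"
    return "no event"
```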
  • Since the imaging unit 11 images the imaging target from above, a moving object region to be extracted as a person region falls within a predetermined size range (width, area). For this reason, a size range for regions to be extracted as moving object regions is preset, and when moving object regions are extracted from the target image by, for example, the background difference method or the frame difference method, regions outside the preset range may be excluded from the moving object regions extracted as person regions.
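  • As a sketch of this size filtering, candidate regions (represented here simply as sets of pixel coordinates) outside a preset range can be dropped before person detection. The range bounds are hypothetical values chosen for illustration:

```python
# Sketch of the size filtering described above: candidate moving-object
# regions whose pixel count lies outside a preset range are excluded from
# the person regions. MIN_AREA and MAX_AREA are illustrative values.

MIN_AREA, MAX_AREA = 50, 5000

def filter_person_regions(regions):
    """Keep only moving-object regions whose pixel count is in the preset range."""
    return [r for r in regions if MIN_AREA <= len(r) <= MAX_AREA]
```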
  • When the predetermined action is detected, the SU monitoring processing unit 142 notifies the management server device SV, via the SU communication IF unit 15, of the detection result, the target image used to obtain the detection result, and the person position data for that target image. More specifically, the SU monitoring processing unit 142 transmits to the management server device SV, via the SU communication IF unit 15, a communication signal (first monitoring information communication signal) containing the sensor ID of its own device, the detection result (in this embodiment, one or more of rising, bed-leaving, falling, and tumbling), the target image used for detecting the predetermined action, and the person position data for that target image.
  • The image may be at least one of a still image and a moving image. For example, a still image is notified first, and the moving image is distributed in response to a user request. Alternatively, a moving image may be distributed from the start, or both a still image and a moving image may be transmitted and displayed on the terminal devices SP and TA in a split-screen layout.
  • The person position data is generated by the person position processing unit 143 as described later, and is stored in the first monitoring information communication signal as metadata of the image data of the target image.
  • the image data and the person position data are accommodated in an Exif (Exchangeable image file format) file.
  • the Exif file is a file conforming to a JPEG (Joint Photographic Experts Group) file format, and so-called Exif information such as shooting conditions is stored in an application marker segment.
  • the person position data is stored in the application marker segment.
  • As another example, the image data and the person position data are carried as form data in an HTTP (Hypertext Transfer Protocol) multipart message. More specifically, the image data is accommodated as one form datum in a multipart-format file, and the person position data is accommodated as another form datum in the same multipart-format file.
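  • A minimal sketch of this multipart packing, using only the Python standard library, is shown below. The form-field names ("image", "person_position") and the JSON layout of the position data are assumptions made for the example; the description states only that each datum occupies its own form part.

```python
# Sketch: pack image data and person position data as two form parts of one
# HTTP multipart/form-data body. Field names and the JSON layout are
# hypothetical; only the "one form datum each" structure comes from the text.
import json
import uuid

def build_multipart(image_bytes, person_position):
    """Return (content_type, body) carrying the target image as one part and
    the person position data, serialized as JSON, as another part."""
    boundary = uuid.uuid4().hex
    crlf = b"\r\n"
    body = b""
    # Part 1: the captured target image (one form datum).
    body += b"--" + boundary.encode() + crlf
    body += b'Content-Disposition: form-data; name="image"; filename="target.jpg"' + crlf
    body += b"Content-Type: image/jpeg" + crlf + crlf
    body += image_bytes + crlf
    # Part 2: the person position data (another form datum).
    body += b"--" + boundary.encode() + crlf
    body += b'Content-Disposition: form-data; name="person_position"' + crlf
    body += b"Content-Type: application/json" + crlf + crlf
    body += json.dumps(person_position).encode() + crlf
    # Closing boundary.
    body += b"--" + boundary.encode() + b"--" + crlf
    return "multipart/form-data; boundary=" + boundary, body
```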
  • the person position processing unit 143 generates person position data regarding the position of the person in the image captured by the image capturing unit 11.
  • the image is the target image used for detecting a predetermined action in the monitored person.
  • the person position data for example, data of the following first to fifth modes can be used.
  • The person position data of the first mode is coordinate data, on the image, of the position of the person itself.
  • To generate it, the person position processing unit 143 first extracts the moving object region MB as a person region from the target image captured by the imaging unit 11, for example by the background difference method or the frame difference method, in the same way as the SU monitoring processing unit 142, and obtains the coordinate data (x, y) of its center of gravity position P1. The coordinate data (x, y) is given, for example, as a pixel position with the upper left vertex of the target image as the coordinate origin, in the same way as the person position data of the second to fifth modes below. In this case, the person position indicator displayed on the mobile terminal device TA is an image HP1 of the point represented by the coordinate data (x, y) of the center of gravity position P1.
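  • A minimal sketch of computing first-mode person position data, assuming the extracted person region is available as a list of (x, y) pixel coordinates; rounding the centroid to a whole pixel position is an assumption made for the example:

```python
# Sketch of first-mode person position data: the center of gravity of the
# extracted person region, in image coordinates whose origin is the
# upper-left pixel of the target image.

def center_of_gravity(person_region):
    """Return the (x, y) centroid of a list of (x, y) region pixels,
    rounded to a pixel position."""
    xs = [x for x, _ in person_region]
    ys = [y for _, y in person_region]
    n = len(person_region)
    return (round(sum(xs) / n), round(sum(ys) / n))
```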
  • The person position data of the second mode is person area data relating to a person area that includes a position of the person in the image (for example, the center of gravity position P1); here, the person area data is coordinate data representing a rectangle that includes all or part of the person area.
  • To generate it, the person position processing unit 143 extracts the moving object region MB as the person region as described above and, as shown in FIG. 4, obtains a rectangle including all or part of the moving object region MB (for example, a circumscribed rectangle of the moving object region MB); the coordinate data representing the obtained rectangle becomes the person position data (person area data) of the second mode. The coordinate data representing the rectangle consists, for example, of the coordinate data (x0, y0) and (x1, y1) of the two end points P20 and P21 of one diagonal of the rectangle; alternatively, it may consist of the coordinate data of the four vertices of the rectangle (not shown).
  • In this case, the person position indicator displayed on the mobile terminal device TA is the image represented by the person area data (x0, y0) and (x1, y1), that is, a rectangular (rectangular contour) image HP2, as described later.
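  • The circumscribed rectangle of the second mode can be sketched as follows, again assuming the person region is given as a collection of (x, y) pixels; the returned pair of diagonal end points matches the (x0, y0), (x1, y1) encoding above:

```python
# Sketch of second-mode person area data: the circumscribed rectangle of the
# person region, encoded as the two end points of one diagonal.

def circumscribed_rectangle(person_region):
    """Return ((x0, y0), (x1, y1)), the diagonal end points of the
    axis-aligned rectangle circumscribing the region pixels."""
    xs = [x for x, _ in person_region]
    ys = [y for _, y in person_region]
    return (min(xs), min(ys)), (max(xs), max(ys))
```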
  • The person position data of the third mode is person area data relating to a person area that includes a position of the person in the image (for example, the center of gravity position P1); here, the person area data is coordinate data representing the contour of the person area.
  • To generate it, the person position processing unit 143 extracts the moving object region MB as the person region as described above and, as shown in FIG. 5, obtains the contour line of the moving object region MB; the coordinate data representing the obtained contour line becomes the person position data (person area data) of the third mode.
• the coordinate data representing the contour line of the moving object region MB consists of the coordinate data (x0, y0), (x1, y1), (x2, y2), ..., (xpc, ypc) of the vertices P30, P31, P32, ..., P3C on the contour line. However, when the moving object region MB has a relatively complicated shape, the number of vertices on the contour line of the moving object region MB becomes large and may exceed the data capacity allocated to the person position data in the first monitoring information communication signal (for example, a data capacity of several to several tens of coordinate points).
• in this case, the person position processing unit 143 combines the pixels constituting the moving object region MB as the person region into blocks of a plurality of adjacent pixels, thereby representing the moving object region MB with a plurality of mosaics (that is, the moving object region MB is mosaicked).
• the person position processing unit 143 sequentially increases the mosaic size, adjusting it until the resulting coordinate data is equal to or less than the data capacity allocated to the person position data in the first monitoring information communication signal.
• the person position indicator displayed on the mobile terminal device TA is, as described later, an image represented by the person area data (x0, y0), (x1, y1), (x2, y2), ..., (xpc, ypc), that is, an image HP3 of the contour line of the person region.
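The mosaicking step can be sketched as follows, under the assumption that the number of occupied mosaic blocks stands in for the amount of contour coordinate data being checked against the capacity. Function names and the capacity test are illustrative, not from the patent.

```python
def mosaic(mask, block):
    """Coarsen a binary mask: a mosaic block is set if any pixel in it is set."""
    h, w = len(mask), len(mask[0])
    bh, bw = -(-h // block), -(-w // block)  # ceiling division
    out = [[0] * bw for _ in range(bh)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                out[y // block][x // block] = 1
    return out

def fit_to_capacity(mask, capacity):
    """Increase the mosaic block size until the occupied-block count (a
    stand-in for the contour coordinate count) fits within `capacity`."""
    block, coarse = 1, mask
    while sum(map(sum, coarse)) > capacity:
        block += 1
        coarse = mosaic(mask, block)
    return block, coarse
```

Larger blocks yield a coarser outline but guarantee the data fits in the fixed field of the first monitoring information communication signal.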
• the person position data of the fourth aspect is person area data relating to the person area including the position of the person shown in the image (for example, the center of gravity position P1), and the person area data is coordinate data representing a circle including all or part of the person area.
• the person position processing unit 143 extracts the moving object area MB as the person area as described above and, as shown in the figure, obtains a circle including all or part of the moving object region MB (for example, a circumscribed circle of the moving object region MB); coordinate data representing the obtained circle is used as the person position data (person area data) of the fourth aspect.
• the coordinate data representing the circle consists of, for example, the coordinate data (x, y) of the center point P4 of the circle and its radius r. Alternatively, for example, the coordinate data representing the circle consists of the coordinate data (x, y) of the center of gravity position P4 of the moving object region MB and a preset radius r.
• the person position indicator displayed on the mobile terminal device TA is, as described later, an image represented by the person area data (x, y) and r, that is, a circular image (circular contour image) HP4.
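The center-of-gravity variant above can be sketched as follows. As an assumption for illustration, the radius is taken as the distance to the farthest region pixel so that the circle covers the whole region; the patent also allows a preset radius instead.

```python
import math

# Sketch of the fourth-aspect person area data: center the circle at the
# center of gravity position P4 of the moving object region MB and choose
# the radius r so that every region pixel lies inside the circle.
def covering_circle(mask):
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    cx = sum(p[0] for p in pts) / len(pts)  # center of gravity P4
    cy = sum(p[1] for p in pts) / len(pts)
    r = max(math.hypot(x - cx, y - cy) for x, y in pts)
    return (cx, cy), r
```

Like the rectangle, this reduces the person area to a fixed, small amount of coordinate data: a center point and one radius.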
• the person position data of the fifth aspect is person area data relating to the person area including the position of the person shown in the image (for example, the center of gravity position P1), and the person area data is coordinate data representing an ellipse including all or part of the person area.
• the person position processing unit 143 extracts the moving object area MB as the person area as described above and, as shown in the figure, obtains an ellipse including all or part of the moving object region MB; coordinate data representing the obtained ellipse is used as the person position data (person area data) of the fifth aspect.
• more specifically, the person position processing unit 143 sets an appropriate first set point P51 on the contour line of the moving object region MB as the person region, obtains the point on the contour line farthest from the first set point P51 as a second set point P53, and takes the line segment P51P53 connecting the first set point P51 and the second set point P53 as the major axis of the ellipse. Next, the person position processing unit 143 obtains the two intersections of the contour line with the straight line that passes through the center point P50 of the line segment P51P53 and is orthogonal to the line segment P51P53, obtains the length from each of these two intersections to the center point P50, and selects from the two intersections the one giving the longer length as the intersection P52 between the minor axis and the ellipse. The major axis, the minor axis, and the intersection of the major axis and the minor axis are thus obtained, and the ellipse is determined.
• the coordinate data representing the ellipse consists of, for example, the coordinate data (x0, y0), (x1, y1), (x2, y2) of the intersection P50 between the major axis and the minor axis, one intersection (the first set point) P51 of the ellipse with the major axis, and one intersection P52 of the ellipse with the minor axis.
• alternatively, the coordinate data representing the ellipse consists of the coordinate data (x0, y0) of the intersection P50 between the major axis and the minor axis, the angle formed by the major axis (or minor axis) with the x axis (or y axis), the length of the major axis, and the length of the minor axis.
• the person position indicator displayed on the mobile terminal device TA is, as described later, an image represented by the person area data (x0, y0), (x1, y1), (x2, y2), that is, an elliptical image (elliptical contour image).
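The construction steps above can be sketched for a polygonal contour as follows. This is an illustrative reading of the procedure, not the patent's code: the first set point is chosen arbitrarily as the first contour vertex, and the function name is hypothetical.

```python
import math

def ellipse_from_contour(contour):
    """Return (P50, P51, P52): the major/minor-axis intersection, one
    major-axis end point, and one minor-axis end point, computed from the
    ordered (x, y) vertices of the person-region contour polygon."""
    p51 = contour[0]  # first set point, chosen arbitrarily on the contour
    p53 = max(contour, key=lambda p: math.dist(p, p51))  # farthest point
    p50 = ((p51[0] + p53[0]) / 2, (p51[1] + p53[1]) / 2)  # midpoint of P51P53
    ax, ay = p53[0] - p51[0], p53[1] - p51[1]  # major-axis direction
    # Intersect the perpendicular line through P50 with each polygon edge:
    # f(p) = (p - P50) . axis changes sign where an edge crosses that line.
    hits = []
    n = len(contour)
    for i in range(n):
        (x1, y1), (x2, y2) = contour[i], contour[(i + 1) % n]
        f1 = (x1 - p50[0]) * ax + (y1 - p50[1]) * ay
        f2 = (x2 - p50[0]) * ax + (y2 - p50[1]) * ay
        if f1 == f2:
            continue
        t = f1 / (f1 - f2)
        if 0.0 <= t <= 1.0:
            hits.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    # The minor-axis end point P52 is the intersection farther from P50.
    p52 = max(hits, key=lambda p: math.dist(p, p50))
    return p50, p51, p52
```

The three returned points correspond to the first form of the ellipse coordinate data, (x0, y0), (x1, y1), (x2, y2).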
  • the person position processing unit 143 notifies the SU monitoring processing unit 142 of the person position data generated in this way.
  • the person position data of the second mode is used, and the person position processing unit 143 notifies the SU monitoring processing unit 142 of the person position data of the second mode.
• when the nurse call reception operation unit 13 is operated, the nurse call processing unit 144 reports that fact to the management server device SV and performs a voice call with the terminal devices SP and TA by using the SU sound input / output unit 12 and the like. More specifically, when the nurse call reception operation unit 13 is operated, the nurse call processing unit 144 transmits a first nurse call notification communication signal containing the sensor ID of its own device and nurse call reception information indicating that the nurse call has been received to the management server device SV via the SU communication IF unit 15. The nurse call processing unit 144 then performs a voice call with the terminal devices SP and TA, for example by VoIP (Voice over Internet Protocol), using the SU sound input / output unit 12 and the like.
• upon receiving a request from the fixed terminal device SP or the portable terminal device TA, the SU streaming processing unit 145 distributes the moving image generated by the imaging unit 11 (for example, a live moving image) to the requesting terminal via the SU communication IF unit 15 for streaming reproduction.
• FIG. 1 shows four sensor devices, first to fourth sensor devices SU-1 to SU-4, as an example; the first sensor device SU-1 is arranged in the room RM-1 (not shown) of Mr. A Ob-1, one of the monitored persons Ob.
  • the second sensor device SU-2 is arranged in a room RM-2 (not shown) of Mr. B Ob-2 who is one of the monitored persons Ob.
  • the third sensor device SU-3 is disposed in the room RM-3 (not shown) of Mr. C Ob-3, one of the monitored subjects Ob, and the fourth sensor device SU-4 It is arranged in the room RM-4 (not shown) of Mr. D Ob-4, one of the monitored persons Ob.
• the management server device SV has a communication function for communicating with the other devices SU, TA, SP via the network NW, receives from the sensor device SU the detection result, the target image, and the person position data regarding the monitored person Ob, and manages this information (monitoring information) concerning the monitoring of the monitored person Ob.
• the management server device SV stores the correspondence relationship (notification destination correspondence relationship) between the notification source sensor device SU (sensor ID) and the notification destination (re-notification destination) terminal devices SP and TA (terminal ID), as well as the correspondence relationship (communication address correspondence relationship) between each device SU, SP, TA with its ID and its communication address.
• the terminal ID is an identifier for specifying and identifying the terminal devices SP and TA.
• the management server device SV identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TA corresponding to the notification source sensor device SU of the received first monitoring information communication signal, and transmits the second monitoring information communication signal to these notification destination terminal devices SP and TA.
• the second monitoring information communication signal contains the sensor ID, the detection result, the target image, and the person position data contained in the received first monitoring information communication signal, together with the communication address of the sensor device SU as the download destination of the moving image; this communication address is obtained from the communication address correspondence relationship.
• when the management server device SV receives the first nurse call notification communication signal, it associates the notification source sensor device of the received first nurse call notification communication signal with the data contained in that signal and stores (records) them as monitoring information of the monitored person Ob. Then, the management server device SV identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TA corresponding to the notification source sensor device of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to these terminal devices SP and TA.
• the second nurse call notification communication signal contains the sensor ID and nurse call reception information contained in the received first nurse call notification communication signal.
  • the second nurse call notification communication signal may include a communication address corresponding to the sensor device SU having the sensor ID stored in the received first nurse call notification communication signal as a download destination of the moving image.
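The server's forwarding step can be illustrated with a small sketch. The correspondence tables and every ID and address below are made-up examples, not values from the patent.

```python
# Hypothetical notification destination correspondence (sensor ID -> terminal
# IDs) and communication address correspondence (device ID -> address).
NOTIFY_DEST = {"SU-1": ["TA-1", "SP-1"], "SU-2": ["TA-2"]}
COMM_ADDR = {"SU-1": "192.0.2.11", "SU-2": "192.0.2.12",
             "TA-1": "192.0.2.21", "TA-2": "192.0.2.22",
             "SP-1": "192.0.2.31"}

def build_second_signal(first_signal):
    """Turn a received first monitoring information communication signal into
    (notification destination terminal IDs, second signal), adding the sensor
    device's address as the moving-image download destination."""
    sensor_id = first_signal["sensor_id"]
    second = dict(first_signal)  # carry over sensor ID, result, image, etc.
    second["download_addr"] = COMM_ADDR[sensor_id]
    return NOTIFY_DEST[sensor_id], second
```

The same two lookups drive the nurse call path, with the nurse call reception information in place of the detection result.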
  • the management server device SV provides the client with data corresponding to the request of the client (terminal device SP, TA, etc. in this embodiment).
  • Such a management server device SV can be configured by, for example, a computer with a communication function.
• the fixed terminal device SP includes a communication function for communicating with the other devices SU, SV, TA via the network NW, a display function for displaying predetermined information, and an input function for inputting predetermined instructions and data, and functions as a user interface (UI) of the monitored person monitoring system MS by inputting predetermined instructions and data to be given to the management server device SV and the portable terminal device TA and by displaying the monitoring information obtained by the sensor device SU.
  • Such a fixed terminal device SP can be configured by, for example, a computer with a communication function.
• the fixed terminal device SP, as an example of the terminal device, operates in the same manner as the mobile terminal device TA; in this specification, however, the mobile terminal device TA, another example of the terminal device, will be described.
• the mobile terminal device TA has a communication function for communicating with the other devices SV, SP, SU via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, and a voice call function, and is a device for inputting predetermined instructions and data to be given to the management server device SV and the sensor device SU, for displaying the monitoring information (including moving images) obtained by the sensor device SU or reported from the management server device SV, and for answering or making a nurse call by voice call with the sensor device SU.
  • such a portable terminal device TA includes a terminal-side communication interface unit (TA communication IF unit) 31, a terminal-side control processing unit (TA control processing unit) 32, and A terminal side storage unit (TA storage unit) 33, a terminal side sound input / output unit (TA sound input / output unit) 34, a terminal side input unit (TA input unit) 35, and a terminal side display unit (TA display unit). 36 and a terminal-side interface unit (TAIF unit) 37.
  • the TA communication IF unit 31 is a communication circuit that is connected to the TA control processing unit 32 and performs communication according to the control of the TA control processing unit 32, similarly to the SU communication IF unit 15.
  • the TA communication IF unit 31 includes, for example, a communication interface circuit that conforms to the IEEE 802.11 standard or the like.
• the TA sound input / output unit 34 is a circuit that is connected to the TA control processing unit 32, generates and outputs a sound corresponding to an electrical signal representing sound according to the control of the TA control processing unit 32, and acquires external sound and inputs it to the mobile terminal device TA.
• the TA input unit 35 is connected to the TA control processing unit 32 and is a circuit, for example a plurality of input switches to which predetermined functions are assigned, that accepts predetermined operations and inputs them to the mobile terminal device TA.
• examples of the predetermined operations include an ID input operation for logging in, a voice call request operation and its end operation, a live video request operation and its end operation, an input operation indicating the intention to perform a response (handling) such as lifesaving, nursing, care, or assistance for the notified monitored person Ob ("correspond"), an input operation of the enlarged display instruction, which is an instruction command to enlarge and display the person, and various other operations necessary for monitoring.
• the TA display unit 36 is connected to the TA control processing unit 32 and, under the control of the TA control processing unit 32, displays the predetermined operation content input from the TA input unit 35 and the monitoring information related to the monitoring of the monitored person Ob by the monitored person monitoring system MS (for example, the type of the predetermined behavior detected by the sensor device SU, the image (still image and moving image) of the monitored person Ob, and the person position indicator representing the position of the person based on the person position data); it is, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL display.
  • the TA input unit 35 and the TA display unit 36 constitute a touch panel.
• the TA input unit 35 is a position input device, such as a resistive film type or a capacitive type, that detects an operation position and inputs it. The position input device is provided on the display surface of the TA display unit 36, and one or more candidates of input content that can be input are displayed on the TA display unit 36; when a user such as a nurse or a caregiver (the monitor) touches the display position of the input content he or she wishes to enter, that position is detected by the position input device, and the display content displayed at the detected position is input to the mobile terminal device TA as the operation input content of the user.
  • the TAIF unit 37 is a circuit that is connected to the TA control processing unit 32 and inputs / outputs data to / from an external device according to the control of the TA control processing unit 32.
• examples of the TAIF unit 37 include an interface circuit using the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA standard, and an interface circuit using the USB standard.
  • the TA storage unit 33 is a circuit that is connected to the TA control processing unit 32 and stores various predetermined programs and various predetermined data under the control of the TA control processing unit 32.
• the various predetermined programs include control processing programs such as: a TA control program for controlling each unit of the mobile terminal device TA according to the function of each unit; a TA monitoring processing program for storing (recording) the monitoring information related to the monitoring of the monitored person Ob, such as the detection result and the nurse call received from the sensor device SU via the management server device SV, and for displaying the detection result and the nurse call; a TA call processing program for making a voice call with the sensor device SU using the TA sound input / output unit 34 and the like; and a streaming processing program for receiving the distribution of the moving image from the sensor device SU and displaying the distributed moving image on the TA display unit 36 by streaming playback.
• the TA monitoring processing program includes a display processing program for displaying the detection result and the image data contained in the second monitoring information communication signal on the TA display unit 36 and for displaying the person position indicator based on the person position data contained in the second monitoring information communication signal in the image displayed on the TA display unit 36.
  • the TA storage unit 33 includes, for example, a ROM and an EEPROM.
• the TA storage unit 33 also includes a RAM serving as a working memory for the TA control processing unit 32, storing data generated during execution of the predetermined programs.
  • the TA storage unit 33 functionally includes a terminal-side monitoring information storage unit (TA monitoring information storage unit) 331 for storing the monitoring information.
• the TA monitoring information storage unit 331 stores the monitoring information of the monitored person Ob transmitted to and received from each of the devices SV, SP, and SU. More specifically, in the present embodiment, the TA monitoring information storage unit 331 stores, as the monitoring information, the sensor ID, the detection result, the target image, the person position data, and the communication address of the sensor device SU as the download destination of the moving image, which are contained in the second monitoring information communication signal received from the management server device SV, in association with the reception time of the second monitoring information communication signal and the like; it also stores the sensor ID and nurse call reception information contained in the second nurse call notification communication signal received from the management server device SV in association with the reception time of the second nurse call notification communication signal and the like.
• the TA control processing unit 32 is a circuit for controlling each unit of the mobile terminal device TA according to the function of each unit, for receiving and displaying the monitoring information on the monitored person Ob, and for answering or making a nurse call.
  • the TA control processing unit 32 includes, for example, a CPU and its peripheral circuits.
• by executing the control processing program, the TA control processing unit 32 functionally includes a terminal-side control unit (TA control unit) 321, a terminal-side monitoring processing unit (TA monitoring processing unit) 322, a call processing unit 323, and a terminal-side streaming processing unit (TA streaming processing unit) 324.
  • the TA monitoring processing unit 322 functionally includes a display processing unit 3221.
  • the TA control unit 321 controls each part of the mobile terminal apparatus TA according to the function of each part, and controls the entire mobile terminal apparatus TA.
• the TA monitoring processing unit 322 stores (records) the monitoring information related to the monitoring of the monitored person Ob, such as the detection result or the nurse call received from the sensor device SU via the management server device SV, and displays the detection result and the nurse call. More specifically, when the TA monitoring processing unit 322 receives the second monitoring information communication signal from the management server device SV, it stores (records) the monitoring information of the monitored person Ob contained in the received second monitoring information communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 then causes the display processing unit 3221 to display a screen corresponding to each piece of information contained in the received second monitoring information communication signal on the TA display unit 36.
• the display processing unit 3221 displays the detection result and the target image contained in the received second monitoring information communication signal on the TA display unit 36, and displays the person position indicator based on the person position data contained in the received second monitoring information communication signal in the target image displayed on the TA display unit 36.
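As a rough illustration of overlaying the rectangular person position indicator on the target image, the following pixel-grid sketch marks the rectangle's border; this is a hypothetical rendering, not the device's actual display code, and the marker value 1 is an assumption.

```python
def draw_rectangle_indicator(image, p20, p21):
    """Overlay a rectangular contour (the person position indicator) onto
    `image`, a list of rows of pixel values, by setting the border pixels
    of the rectangle whose diagonal runs from p20=(x0, y0) to p21=(x1, y1)."""
    (x0, y0), (x1, y1) = p20, p21
    for x in range(x0, x1 + 1):
        image[y0][x] = image[y1][x] = 1  # top and bottom edges
    for y in range(y0, y1 + 1):
        image[y][x0] = image[y][x1] = 1  # left and right edges
    return image
```

Only the border is drawn, so the person inside the rectangle remains visible in the target image.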
• when the display processing unit 3221 receives the enlarged display instruction from the TA input unit 35, it enlarges the region of the person being displayed on the TA display unit 36 and displays the enlarged region.
• when the TA monitoring processing unit 322 receives the second nurse call notification communication signal from the management server device SV, it stores the monitoring information of the monitored person Ob contained in the received second nurse call notification communication signal in the TA monitoring information storage unit 331.
• the TA monitoring processing unit 322 displays a nurse call reception screen stored in advance in the TA storage unit 33 on the TA display unit 36 in accordance with the nurse call reception information contained in the received second nurse call notification communication signal. Then, when receiving a predetermined input operation from the TA input unit 35, the TA monitoring processing unit 322 executes a predetermined process corresponding to the input operation.
  • the call processing unit 323 performs a voice call with the sensor device SU by using the TA sound input / output unit 34 or the like. More specifically, the call processing unit 323 uses the TA sound input / output unit 34 and the like, and the notification source sensor device SU that has transmitted the first monitoring information communication signal and the first nurse call notification communication signal to the management server device SV. In addition, a voice call is performed, for example, by VoIP with the sensor device SU or the like selected and designated by the user (monitor) of the mobile terminal device TA.
  • the TA streaming processing unit 324 receives the distribution of the moving image from the sensor device SU, and displays the distributed moving image on the TA display unit 36 by streaming reproduction.
  • Such a portable terminal device TA can be configured by a portable communication terminal device such as a so-called tablet computer, a smartphone, or a mobile phone.
• the TA input unit 35 and the TA display unit 36 constituting the touch panel correspond to an example of an instruction input unit (first instruction input unit) that receives the enlarged display instruction, which is an instruction to enlarge and display the person shown in the image, and to an example of a second instruction input unit that receives an operation instruction, which is an instruction to perform a predetermined operation.
  • FIG. 9 is a flowchart illustrating an operation related to monitoring information of the sensor device in the monitored person monitoring system according to the embodiment.
  • FIG. 10 is a flowchart illustrating an operation related to monitoring information of the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 11 is a diagram illustrating an example of a standby screen displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 12 is a diagram illustrating an example of a monitoring information screen displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 13 is a diagram illustrating an example of a monitoring information screen in which a person's area is enlarged, which is displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • each device SU, SV, SP, TA performs initialization of each necessary part and starts its operation when the power is turned on.
• in the SU control processing unit 14, the SU control unit 141, the SU monitoring processing unit 142, the person position processing unit 143, the nurse call processing unit 144, and the SU streaming processing unit 145 are thereby functionally configured.
  • the TA control processing unit 32 is functionally configured with a TA control unit 321, a TA monitoring processing unit 322, a call processing unit 323, and a TA streaming processing unit 324.
  • a display processing unit 3221 is functionally configured in the TA monitoring processing unit 322.
  • the operation of the sensor device SU will be described.
  • the sensor device SU operates as follows for each frame or every several frames, thereby detecting a predetermined operation in the monitored person Ob and determining whether or not a nurse call is accepted.
  • the sensor device SU acquires an image (image data) for one frame from the imaging unit 11 as the target image by the SU control unit 141 of the SU control processing unit 14 (S11).
  • the sensor device SU performs a behavior detection process for detecting a predetermined behavior in the monitored person Ob by the SU monitoring processing unit 142 of the SU control processing unit 14 based on the acquired target image (S12).
  • the sensor device SU determines whether or not a predetermined behavior in the monitored person Ob is detected by the SU monitoring processing unit 142 in the behavior detection processing S12. As a result of this determination, when the action is not detected (No), the sensor device SU next executes the process S16, while when the action is detected (Yes), the sensor device SU The device SU executes step S16 after sequentially executing the next step S14 and step S15.
• the sensor device SU uses the person position processing unit 143 of the SU control processing unit 14 to generate person position data relating to the position, in the image, of the person shown in the acquired target image.
  • the person position data may be any of the person position data in the first to fifth aspects described above. In the present embodiment, for example, the person position data in the second aspect is employed.
  • the person position processing unit 143 first extracts the moving object region MB as a person region from the acquired target image by, for example, the background difference method or the frame difference method.
  • the person position processing unit 143 acquires the moving object area MB extracted by the SU monitoring processing unit 142 in the process S12 from the SU monitoring processing unit 142.
  • the person position processing unit 143 obtains a rectangle that includes all or part of the moving object area MB, and obtains coordinate data representing the obtained rectangle as person position data (person area data).
• the coordinate data representing the rectangle consists of, for example, the coordinate data of the two end points of one diagonal of the rectangle (P20 (x0, y0) and P21 (x1, y1) in the example shown in FIG. 4).
  • the person position processing unit 143 notifies the SU monitoring processing unit 142 of the obtained person position data (person area data).
  • the person position data may be a plurality corresponding to each of the plurality of moving object regions MB.
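The frame difference extraction mentioned in the steps above can be sketched minimally as follows; grayscale frames are represented as lists of rows, and the threshold value is an assumption for illustration.

```python
def frame_difference(prev_frame, cur_frame, threshold):
    """Binary moving-object mask: a pixel belongs to the moving object
    region MB if its value changed by more than `threshold` between two
    consecutive frames."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_frame, cur_frame)]
```

The background difference method works analogously, with `prev_frame` replaced by a stored background image rather than the previous frame.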
• next, in order to notify the predetermined terminal devices SP and TA, via the management server device SV, of the predetermined behavior detected in the processes S12 and S13, the sensor device SU causes the SU monitoring processing unit 142 to transmit the first monitoring information communication signal to the management server device SV.
• more specifically, the SU monitoring processing unit 142 transmits a first monitoring information communication signal containing its own sensor ID, the detection result, the target image, and the person position data (in this example, the rectangular person area data) to the management server device SV via the SU communication IF unit 15. Accordingly, the first monitoring information communication signal for notifying the detection result includes, together with the detection result, the target image as the image of the monitored person Ob, and the person position data relating to the position of the person shown in the target image is included together with the target image.
• the sensor device SU determines, by the nurse call processing unit 144 of the SU control processing unit 14, whether or not a nurse call has been accepted. That is, the processes S11 to S17 shown in FIG. 9 are repeatedly executed for each frame or every several frames, and it is determined whether or not the nurse call reception operation unit 13 has been operated during the period from the previous execution of the process S16 to the current execution of the process S16. As a result of this determination, when the nurse call reception operation unit 13 has not been operated and the nurse call has not been accepted (No), the sensor device SU ends the current process, whereas when the nurse call reception operation unit 13 has been operated and the nurse call has been accepted (Yes), the sensor device SU ends the current process after executing the next process S17.
• in the process S17, the sensor device SU transmits the first nurse call notification communication signal to the management server device SV by the nurse call processing unit 144. More specifically, the nurse call processing unit 144 transmits a first nurse call notification communication signal containing the sensor ID of its own device and nurse call reception information to the management server device SV via the SU communication IF unit 15.
  • the sensor device SU operates as described above with respect to detection of a predetermined action and reception of a nurse call in the monitored person Ob.
• when the management server device SV receives the first monitoring information communication signal from the sensor device SU via the network NW, it stores (records) the sensor ID, the determination result, the target image, and the person position data (in the present embodiment, the rectangular person area data) contained in the first monitoring information communication signal as monitoring information of the monitored person Ob monitored by the sensor device SU having this sensor ID. Then, the management server device SV identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TA corresponding to the notification source sensor device SU of the received first monitoring information communication signal, and transmits the second monitoring information communication signal to these notification destination terminal devices SP and TA.
  • when the management server device SV receives the first nurse call notification communication signal from the sensor device SU via the network NW, it stores (records) the sensor ID, the nurse call reception information, and the like contained in the first nurse call notification communication signal as monitoring information of the monitored person Ob monitored by the sensor device SU having that sensor ID. Then, the management server device SV identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TA corresponding to the notification source sensor device SU of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to these notification destination terminal devices SP and TA.
  • when the fixed terminal device SP and the mobile terminal device TA receive the second monitoring information communication signal from the management server device SV via the network NW, they display the monitoring information related to the monitoring of the monitored person Ob contained in the second monitoring information communication signal. The operation of displaying the monitoring information by the mobile terminal device TA is described in detail below.
  • when the second nurse call notification communication signal is received, it is displayed that a nurse call has been received from the monitored person Ob monitored by the sensor device SU having the sensor ID contained in the second nurse call notification communication signal.
  • in this way, the monitored person monitoring system MS roughly detects a predetermined action in each monitored person Ob by each sensor device SU, the management server device SV, the fixed terminal device SP, and the portable terminal device TA, and thereby monitors each monitored person Ob.
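As a concrete illustration of the data carried end to end by these signals, the following minimal Python sketch packs the fields the text lists (sensor ID, determination result, target image, rectangular person area data given by two diagonal corner points) into a transmittable record. The class and field names are hypothetical, not taken from the source.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MonitoringInfo:
    """Hypothetical payload of the first monitoring information communication signal."""
    sensor_id: str
    detection_result: str  # e.g. "getting up", "getting out of bed"
    image_data: bytes      # the target image, as raw bytes here
    person_area: Tuple[Tuple[int, int], Tuple[int, int]]  # ((x0, y0), (x1, y1))

def make_signal(info: MonitoringInfo) -> dict:
    """Pack the monitoring information into a flat, transmittable record."""
    (x0, y0), (x1, y1) = info.person_area
    return {
        "sensor_id": info.sensor_id,
        "result": info.detection_result,
        "image": info.image_data,
        "area": [x0, y0, x1, y1],
    }
```

The management server and the terminal devices would then store and forward such records keyed by the sensor ID.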
  • the portable terminal device TA accepts a login operation by a monitor (user) such as a nurse or a caregiver and, by the display processing of the TA monitoring processing unit 322, displays on the TA display unit 36 a standby screen waiting for a communication signal addressed to the own device.
  • the standby screen 51 includes a menu bar area 511 for displaying a menu bar, and a standby main area for displaying a message indicating that the user is waiting (for example, “no notification”) and an icon.
  • the menu bar area 511 is provided with an off-hook button 5111 for inputting an instruction for an extension call with another mobile terminal device TA or an outgoing call with an external telephone TL.
  • the portable terminal device TA determines, by the TA control unit 321 of the TA control processing unit 32, whether or not the TA communication IF unit 31 has received a communication signal (S21). If the result of this determination is that a communication signal has not been received (No), the portable terminal device TA returns the process to S21; if a communication signal has been received (Yes), it executes the following process S22. That is, the mobile terminal device TA waits for reception of a communication signal.
  • the portable terminal device TA determines the type of the received communication signal by the TA control unit 321. As a result of this determination, when the received communication signal is the second monitoring information communication signal (second monitoring information), the portable terminal device TA executes the following processes S23 and S24 in sequence and then executes process S27. When the received communication signal is the second nurse call notification communication signal (second NC notification), it executes processes S25 and S26 in sequence and then executes process S27. When the received communication signal is neither the second monitoring information communication signal nor the second nurse call notification communication signal (others), it executes process S29, which performs an appropriate process according to the communication signal received in process S21, and then ends the process.
  • the portable terminal device TA stores (records), by the TA monitoring processing unit 322 of the TA control processing unit 32, the monitoring information related to the monitoring of the monitored person Ob contained in the second monitoring information communication signal received from the management server device SV in process S21 in the TA monitoring information storage unit 331.
  • next, the TA monitoring processing unit 322 displays, for example, the monitoring information screen 52 shown in FIG. on the TA display unit 36 (S24).
  • the monitoring information screen 52 is a screen for displaying the monitoring information related to the monitoring of the monitored person Ob.
  • the monitoring information screen 52 includes a menu bar area 511; a monitored person name area 521 for displaying the arrangement location of the sensor device SU having the sensor ID accommodated in the second monitoring information communication signal received in process S21 and the name of the monitored person Ob monitored by the sensor device SU having that sensor ID; a detection information display area 522 for displaying the reception time of the second monitoring information communication signal received in process S21 (or the detection time of the predetermined action) and the detection result accommodated in that signal; and an image area 523 for displaying the target image accommodated in the second monitoring information communication signal received in process S21.
  • in order to display the installation location of the sensor device SU and the name of the monitored person Ob in the monitored person name area 521, the TA storage unit 33 stores in advance, in association with each other, the sensor ID, the installation location of the sensor device SU having that sensor ID, and the name of the monitored person Ob monitored by that sensor device SU.
  • the detection results (in this embodiment, the names of getting up, getting out of bed, falling over, and falling off) contained in the second monitoring information communication signal received in step S21 may be displayed as they are.
  • the detection result is displayed as an icon that symbolically represents the detection result.
  • the TA storage unit 33 stores each action and an icon representative of the action in association with each other in advance.
  • the detection information display area 522 displays a wake-up icon that symbolizes wake-up.
  • a person position indicator HP2 based on the person position data is displayed in the target image being displayed.
  • the bedding BD and the monitored person Ob are shown in the target image.
  • the person position data accommodated in the second monitoring information communication signal received in step S21 is rectangular person area data, and the display processing unit 3221 superimposes, as the person position indicator HP2, a rectangular image represented by the person area data (here, the coordinate data (x0, y0), (x1, y1) of the two end points of a diagonal) on the target image being displayed, with the upper-left vertex in the front view of the target image as the coordinate origin.
  • the region surrounded by the person position indicator HP2 also serves as a button for inputting, to the mobile terminal device TA, the enlarged display instruction, which is an instruction to enlarge and display the person shown in the target image being displayed.
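The superimposition of the rectangular person position indicator described above can be sketched as follows, assuming a grayscale image stored as a list of pixel rows with the upper-left corner as the coordinate origin; the function name is illustrative.

```python
def draw_person_marker(image, corners, value=255):
    """Superimpose a rectangular person position marker on a grayscale image
    (list of rows), given the diagonal corner points (x0, y0), (x1, y1)
    in image coordinates with the upper-left corner as the origin."""
    (x0, y0), (x1, y1) = corners
    for x in range(x0, x1 + 1):   # top and bottom edges
        image[y0][x] = value
        image[y1][x] = value
    for y in range(y0, y1 + 1):   # left and right edges
        image[y][x0] = value
        image[y][x1] = value
    return image
```

Only the outline is drawn, so the pixels of the person inside the marker remain visible.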
  • the display processing unit 3221 may display an image in the person's area by a second display method different from the display method of the target image (first display method).
  • when the first display method is normal display and the second display method is mosaic display, the display processing unit 3221 treats the image within the rectangular person position marker HP as the image in the person's area: the image outside the rectangular person position marker HP is displayed as usual, and the image inside the rectangular person position marker HP is displayed as a mosaic.
  • when the first display method is color display and the second display method is monochrome display (single-color display), the display processing unit 3221 treats the image within the rectangular person position marker HP as the image in the person's area: the image outside the rectangular person position marker HP is displayed in color, and the image inside the rectangular person position marker HP is displayed in monochrome.
  • when the first display method is monochrome display and the second display method is color display, the display processing unit 3221 treats the image within the rectangular person position marker HP as the image in the person's area: the image outside the rectangular person position marker HP is displayed in monochrome, and the image inside the rectangular person position marker HP is displayed in color.
  • when the first display method is positive display and the second display method is negative display, the display processing unit 3221 treats the image within the rectangular person position marker HP as the image in the person's area: the image outside the rectangular person position marker HP is displayed as a positive, and the image inside the rectangular person position marker HP is displayed as a negative.
  • when the first display method is negative display and the second display method is positive display, the display processing unit 3221 treats the image within the rectangular person position marker HP as the image in the person's area: the image outside the rectangular person position marker HP is displayed as a negative, and the image inside the rectangular person position marker HP is displayed as a positive.
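As a rough illustration of the first of these variants (mosaic display inside the marker, normal display outside), the following sketch replaces each small tile inside the rectangle with the mean of its pixel values, leaving the rest of the image untouched. The tile size and function name are assumptions for illustration.

```python
def mosaic_region(image, corners, block=2):
    """Pixelate the inside of the rectangle given by the diagonal corners
    (x0, y0), (x1, y1): each block x block tile is replaced by its mean,
    while pixels outside the rectangle keep their normal display."""
    (x0, y0), (x1, y1) = corners
    for ty in range(y0, y1 + 1, block):
        for tx in range(x0, x1 + 1, block):
            ys = range(ty, min(ty + block, y1 + 1))
            xs = range(tx, min(tx + block, x1 + 1))
            pixels = [image[y][x] for y in ys for x in xs]
            mean = sum(pixels) // len(pixels)
            for y in ys:
                for x in xs:
                    image[y][x] = mean
    return image
```

The color/monochrome and positive/negative variants would apply an analogous per-pixel transform inside (or outside) the same rectangle.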
  • the “corresponding” button 524 is a button for inputting, to the mobile terminal device TA, execution intention information indicating that the user of the mobile terminal device TA intends to perform a predetermined response (handling, dealing) such as lifesaving, nursing, care, or assistance with respect to the detection result displayed on the monitoring information screen 52.
  • the “speak” button 525 is a button for requesting a voice call, and is used to input an instruction to connect the sensor device SU of the sensor ID and the mobile terminal device TA via the network NW. It is a button.
  • the “LIVE” button 526 is a button for requesting a live video, and is a button for inputting an instruction to display a video captured by the sensor device SU of the sensor ID.
  • the portable terminal device TA stores (records), by the TA monitoring processing unit 322 of the TA control processing unit 32, the monitoring information related to the monitoring of the monitored person Ob contained in the second nurse call notification communication signal received from the management server device SV in process S21 in the TA monitoring information storage unit 331.
  • next, the TA monitoring processing unit 322 displays, on the TA display unit 36, a nurse call acceptance screen (not illustrated) indicating acceptance of a nurse call, stored in advance in the TA storage unit 33, in accordance with the nurse call reception information accommodated in the second nurse call notification communication signal received in process S21 (S26).
  • the portable terminal device TA determines, by the TA control processing unit 32, whether or not an input operation has been accepted on the touch panel constituted by the TA input unit 35 and the TA display unit 36. If the input operation has not been accepted as a result of the determination (No), the portable terminal device TA returns the process to step S27; if the input operation has been accepted, the portable terminal device TA executes the next process S28.
  • the portable terminal device TA performs an appropriate process according to the content of the input operation by the TA control processing unit 32, and ends this process.
  • when the portable terminal device TA receives, by the TA control processing unit 32, an input operation (for example, a tap operation) of the area surrounded by the person position marker HP2, which also functions as a button for inputting the enlarged display instruction, the display processing unit 3221 displays the person area enlarged beyond the size of the person area currently displayed on the TA display unit 36. For example, the area surrounded by the person position marker HP2 shown in FIG. 12 is enlarged about four times (doubled in each of the vertical and horizontal directions) and displayed as the area surrounded by the person position marker HPE shown in FIG.
  • the pixel value of a pixel generated by the enlargement is generated, for example, by interpolating from the pixel values of the pixels adjacent to it. Since the person area is enlarged and displayed in this manner, the user (monitor) of the portable terminal device TA can easily find the monitored person Ob in the image and can easily recognize the state of the monitored person Ob in the image.
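A minimal sketch of the enlargement step, assuming the person area is cropped from the image and doubled in each direction (about four times in area). Nearest-neighbour replication is used here for brevity; the interpolation from adjacent pixels mentioned above would smooth the inserted values. Names are illustrative.

```python
def enlarge_region(image, corners, factor=2):
    """Crop the person area given by the diagonal corners (x0, y0), (x1, y1)
    and enlarge it by `factor` in each of the vertical and horizontal
    directions (factor**2 in area), replicating each pixel."""
    (x0, y0), (x1, y1) = corners
    crop = [row[x0:x1 + 1] for row in image[y0:y1 + 1]]
    out = []
    for row in crop:
        wide = [v for v in row for _ in range(factor)]  # widen the row
        out.extend([wide[:] for _ in range(factor)])    # repeat it vertically
    return out
```

With factor=2, a 2x2 person area becomes the 4x4 region displayed inside the marker HPE.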
  • when the TA control processing unit 32 receives an input operation of the “corresponding” button 524 (that is, when the intention to respond is received), the portable terminal device TA stores, in the TA monitoring information storage unit 331, the fact that “corresponding” has been received for the monitoring information of the monitored person Ob currently displayed on the TA display unit 36, and transmits to the management server device SV a communication signal (correspondence acceptance notification communication signal) containing the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 and information indicating that “corresponding” has been received (correspondence acceptance information).
  • the management server device SV that has received the correspondence acceptance notification communication signal broadcasts a communication signal (correspondence acceptance announcement communication signal) containing the sensor ID and the correspondence acceptance information contained in the received signal to the terminal devices SP and TA. As a result, the fact that “corresponding” has been received for the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 is synchronized among the terminal devices SP and TA.
  • when an input operation of the “speak” button 525 is received, the call processing unit 323 transmits a communication signal (call request communication signal) containing information such as a request for a voice call to the sensor device SU that monitors the monitored person Ob displayed on the TA display unit 36, and connects to the sensor device SU via the network NW so that a voice call is possible. As a result, a voice call can be made between the mobile terminal device TA and the sensor device SU.
  • when the TA control processing unit 32 receives an input operation of an “end” button (not shown), which is a button for inputting an instruction to end the voice call, the portable terminal device TA transmits, by the call processing unit 323, a communication signal (call end communication signal) containing information such as a request to end the voice call to the sensor device SU that monitors the monitored person Ob displayed on the TA display unit 36. Thereby, the voice call between the mobile terminal device TA and the sensor device SU is terminated.
  • when an input operation of the “LIVE” button 526 is received, the TA streaming processing unit 324 transmits a communication signal (video distribution request communication signal) containing information such as a request for live video distribution to the sensor device SU that monitors the monitored person Ob currently displayed on the TA display unit 36, connects to the sensor device SU via the network NW so that the video can be downloaded, receives live video distribution from the sensor device SU, and displays the distributed video on the TA display unit 36 by streaming playback. In this display, the video is displayed in the image area 523 instead of the still image, and a “live end” button (not shown) is displayed instead of the “view live” button 526. In this way, live video is displayed on the mobile terminal device TA.
  • the “live end” button (not shown) is a button for requesting the end of the moving image, that is, a button for inputting an instruction to end (stop) the distribution of the moving image captured by the sensor device SU having the sensor ID and to end (stop) its display.
  • when an input operation of the “live end” button is received, the portable terminal device TA transmits, by the TA streaming processing unit 324, a communication signal (moving image distribution end communication signal) containing information such as a request to end moving image distribution to the sensor device SU that monitors the monitored person Ob currently displayed on the TA display unit 36, and displays a still image on the TA display unit 36. The mobile terminal device TA thereby ends the live video display.
  • the mobile terminal device TA operates as described above regarding each notification received from the sensor device SU via the management server device SV (notification of each detection result and of each nurse call acceptance).
  • as described above, the monitored person monitoring system MS, the terminal devices SP and TA, and the display method implemented therein according to the present embodiment include a display processing unit 3221 that displays, on the TA display unit 36, the image of the image data accommodated in the second monitoring information communication signal received by the TA communication IF unit 31, and displays the person position indicator HP2 based on the person position data accommodated in the second monitoring information communication signal in the image displayed on the TA display unit 36. Therefore, in the monitored person monitoring system MS, the terminal devices SP and TA, and the display method, since the person position indicator HP2 is displayed in the image being displayed, the user (monitor) can find the monitored person Ob in the image using the person position indicator HP2 as a clue, and can easily find the monitored person Ob in the image. Since the sensor device SU in the present embodiment accommodates not only the image but also the person position data (person area data in the present embodiment) in the first monitoring information communication signal, the terminal devices SP and TA that have received the second monitoring information communication signal can display the person position indicator HP2 on the displayed image. Therefore, as described above, the users of the terminal devices SP and TA can find the monitored person Ob in the image using the person position indicator HP2 as a clue, and can easily find the monitored person Ob in the image.
  • buttons for inputting, to the mobile terminal device TA, an operation instruction, which is an instruction for performing a predetermined operation, such as the “corresponding” button 524, the “speak” button 525, and the “watch live” button 526, are displayed in the image area.
  • FIG. 14 is a diagram illustrating an example of a monitoring information screen in which a person area and a “corresponding” button overlap each other displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 15 is a diagram illustrating an example of a monitoring information screen that displays a person area at the center position of an image display area, which is displayed on the mobile terminal device in the monitored person monitoring system of the embodiment.
  • FIG. 16 is a diagram illustrating an example of a monitoring information screen that is displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment and displays a person region and a “corresponding” button so as not to overlap each other.
  • the display processing unit 3221 displays the target image and the person position indicator HP on the TA display unit 36 so that the center position SC of the image area 523 for displaying the target image on the TA display unit 36 coincides with the position HPC, on the target image, of the person appearing in the target image.
  • the position HPC is, for example, the barycentric position P1 in the moving object region MB.
  • the position HPC is a position of an intersection of diagonal lines in the rectangle.
  • the position HPC is a center point in the circle.
  • the position HPC is the position of the intersection of the major axis and the minor axis of the ellipse. More specifically, the pixel position (coordinate data) of the center position SC is stored in advance in the TA storage unit 33, and the display processing unit 3221, for example as shown in FIG., adjusts the display positions of the target image and the person position indicator HP so that the position HPC (in this example, the position of the intersection of the diagonals of a rectangle) is positioned at the pixel position of the center position SC stored in the TA storage unit 33, and displays them on the TA display unit 36. According to this, since the user (monitor) of the portable terminal device TA has only to watch the center position of the image area 523, it becomes easy to find the monitored person Ob in the image.
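The centering adjustment can be sketched as a simple offset computation, assuming the person area is given as rectangular person area data and the position HPC is the intersection of the rectangle's diagonals; names are illustrative.

```python
def centering_offset(area_center, person_rect):
    """Compute the (dx, dy) shift to apply to the target image so that
    the position HPC (the intersection of the rectangle's diagonals)
    coincides with the center position SC of the image area."""
    (x0, y0), (x1, y1) = person_rect
    hpc = ((x0 + x1) / 2, (y0 + y1) / 2)  # diagonal intersection
    sc_x, sc_y = area_center
    return (sc_x - hpc[0], sc_y - hpc[1])
```

Applying the returned offset to both the image and the marker keeps the marker aligned with the person while placing the person at the center of the image area.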
  • the display processing unit 3221 displays a button for inputting an operation instruction, which is an instruction for performing a predetermined operation, in the remaining area excluding the person area in the display area of the TA display unit 36. Preferably, the display processing unit 3221 displays the button in the remaining area excluding the person area in the image area 523 for displaying the target image on the TA display unit 36.
  • when the button is displayed in the image area 523 and, as shown in FIG. 14 for example, a part of the person's area (in the example shown in FIG. 14, the area within the person position indicator HP2) overlaps the “corresponding” button 524, the display processing unit 3221 displays the “corresponding” button 524 in the remaining area, as shown in FIG. Since the display positions of buttons such as the “corresponding” button 524, the “speak” button 525, and the “view live” button 526 are preset, the area in which the person area could overlap a button is known in advance. For this reason, the remaining area excluding the person area can be computed and stored in the TA storage unit 33 in advance.
  • the display processing unit 3221 may display the button in the remaining area stored in advance in the TA storage unit 33 when all or part of the person's area overlaps the button. According to this, the person's area is not hidden by the button, and the user (monitoring person) of the mobile terminal device TA can easily find the monitored person Ob in the image.
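A minimal sketch of this overlap avoidance, assuming axis-aligned rectangles given as (x0, y0, x1, y1) and a precomputed fallback position standing in for the remaining area stored in the TA storage unit; names are illustrative.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test between two rectangles (x0, y0, x1, y1)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def place_button(button, person_rect, fallback):
    """Keep the button at its preset position unless it overlaps the person
    area, in which case move it to the precomputed fallback position in
    the remaining area, so the person is never hidden by the button."""
    return fallback if rects_overlap(button, person_rect) else button
```

Because the preset button positions are known, the fallback positions can be computed once per layout rather than per frame.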
  • a terminal device according to one aspect includes a communication unit that receives a monitoring information communication signal containing image data of an image of a monitored person to be monitored and person position data relating to the position, on the image, of the person shown in the image, and a display processing unit that displays the image of the image data accommodated in the monitoring information communication signal received by the communication unit on a display unit and displays, in the image displayed on the display unit, a person position indicator representing the position of the person based on the person position data accommodated in the monitoring information communication signal.
  • the person position data is accommodated in the monitoring information communication signal as metadata of the image data.
  • the image data and the person position data are accommodated in an Exif (Exchangeable image file format) file.
  • the image data and the person position data are accommodated as form data in a multipart format file in HTTP (Hypertext Transfer Protocol).
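One way of realising the HTTP multipart transport mentioned above is sketched below using only the Python standard library: the image data and the person position data are packed as two form-data parts in one multipart body. The boundary string and form-field names are assumptions, not taken from the source.

```python
import json

def build_multipart(image_data: bytes, person_area, boundary="frame-boundary"):
    """Pack the image data and the person position data as form data in a
    multipart body, one part per field (illustrative field names)."""
    meta = json.dumps({"person_area": person_area}).encode()
    parts = []
    for name, ctype, payload in (
        ("image", "image/jpeg", image_data),
        ("person_position", "application/json", meta),
    ):
        parts.append(
            b"--" + boundary.encode() + b"\r\n"
            b'Content-Disposition: form-data; name="' + name.encode() + b'"\r\n'
            b"Content-Type: " + ctype.encode() + b"\r\n\r\n" + payload + b"\r\n"
        )
    return b"".join(parts) + b"--" + boundary.encode() + b"--\r\n"
```

The receiving terminal device would split the body on the boundary to recover the image and the person position data for display.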
  • the person position data is person area data of the person area including a position on the image of the person shown in the image.
  • preferably, the person position data is person area data relating to the person area that includes the position, on the image, of the person shown in the image; the person area data is coordinate data representing any one of a rectangle, a circle, and an ellipse that includes all or a part of the person; and the person position indicator is an image represented by the person area data.
  • preferably, the person position data is person area data relating to the person area that includes the position, on the image, of the person shown in the image, and the person position indicator is an image represented by the person area data.
  • Such a terminal device includes a display processing unit that displays the image of the image data contained in the monitoring information communication signal received by the communication unit on the display unit and displays, in the image displayed on the display unit, the person position indicator representing the position of the person based on the person position data contained in the monitoring information communication signal. Therefore, since the person position indicator is displayed in the displayed image, the user of the terminal device can find the person to be monitored in the image using the person position indicator as a clue, which makes it easier to find the monitored person in the image.
  • in another aspect, in the above terminal device, preferably, the person position data is person area data relating to the person area that includes the position, on the image, of the person shown in the image, the person position indicator is an image represented by the person area data, and the display processing unit displays the image in the person's area by a second display method different from the display method of the image.
  • the second display method is a method of displaying an image in the area of the person in a mosaic.
  • preferably, the display method of the image is a method of displaying the image in color or monochrome, and the second display method is a method of displaying the image in the person's area in monochrome or color, respectively.
  • preferably, the display method of the image is a method of displaying the image as a positive or a negative, and the second display method is a method of displaying the image in the person's area as a negative or a positive, respectively.
  • in such a terminal device, the display processing unit displays the image in the person's area by a second display method different from the display method of the image (the first display method). For this reason, the user of the terminal device can easily find the person's area in the image, in other words, the monitored person in the image.
  • in another aspect, in the above terminal device, preferably, the display processing unit displays the image and the person position indicator on the display unit so that the center position of the image display area for displaying the image on the display unit coincides with the position, on the image, of the person appearing in the image. Since such a terminal device displays the person position indicator at the center position of the image display area, the user of the terminal device has only to watch the center position of the image display area, and it becomes easy to find the monitored person in the image.
  • in another aspect, the above-described terminal device further includes an instruction input unit that receives an enlarged display instruction, which is an instruction to enlarge and display the person shown in the image; the person position data is person area data relating to the person area that includes the position, on the image, of the person shown in the image; the person position indicator is an image represented by the person area data; and when the instruction input unit accepts the enlarged display instruction, the display processing unit displays the person area enlarged beyond the size of the person area currently displayed on the display unit. Therefore, since the area of the person is enlarged and displayed, the user of the terminal device can easily find the person to be monitored in the image.
  • in another aspect, the above-described terminal device further includes a second instruction input unit that receives an operation instruction, which is an instruction to perform a predetermined operation; the second instruction input unit forms a touch panel together with the display unit; the person position data is person area data relating to the person area that includes the position, on the image, of the person shown in the image; the person position indicator is an image represented by the person area data; and the display processing unit displays a button for receiving the operation instruction in the remaining area excluding the person area in the display area of the display unit. Preferably, the display processing unit displays the button in the remaining area excluding the person area in the image display area for displaying the image on the display unit. In such a terminal device, since the button for receiving the operation instruction is displayed in the remaining area excluding the person area, the region of the person is not hidden by the button, and the user of the terminal device can easily find the monitored person in the image.
  • a display method for a terminal device according to another aspect includes a receiving step of receiving a monitoring information communication signal containing image data of an image of a monitored person to be monitored and person position data relating to the position, on the image, of the person appearing in the image, and a display processing step of displaying the image of the image data accommodated in the monitoring information communication signal received in the receiving step on a display unit and displaying, in the image displayed on the display unit, a person position indicator representing the position of the person based on the person position data accommodated in the monitoring information communication signal. Since such a display method displays the person position indicator in the displayed image, the user can find the monitored person in the image using the person position indicator as a clue, and can easily find the monitored person in the image.
  • a sensor device according to another aspect includes a communication unit that performs communication, an imaging unit that performs imaging, a person position processing unit that generates person position data relating to the position, in the image, of the person shown in the image captured by the imaging unit, and a monitoring processing unit that transmits, by the communication unit, a monitoring information communication signal containing the image data of the image captured by the imaging unit and the person position data generated by the person position processing unit.
  • Such a sensor device accommodates not only the image but also the person position data in the monitoring information communication signal, so that the terminal device that has received the monitoring information communication signal can display the person position indicator on the displayed image. Therefore, the user of the terminal device can find the monitored person in the image using the person position sign as a clue, and can easily find the monitored person in the image.
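As an illustration of the sensor-side behavior described above, placing both the image data and the person position data in one monitoring information communication signal, the following sketch packs and unpacks such a message. The JSON-with-base64 encoding and all field names are assumptions for illustration; the patent does not prescribe a wire format.

```python
import base64
import json

def build_monitoring_signal(sensor_id, detection_result, image_bytes, person_position):
    """Pack the detection result, image data, and person position data into
    one monitoring-information message (sensor side, SU)."""
    payload = {
        "sensor_id": sensor_id,              # identifies the sensor device (SU)
        "event": detection_result,           # e.g. "wake-up", "fall", "nurse-call"
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "person_position": person_position,  # e.g. {"cx": 0.4, "cy": 0.6} normalized
    }
    return json.dumps(payload).encode("utf-8")

def parse_monitoring_signal(raw):
    """Inverse operation, as would be used by the terminal device (SP/TA)."""
    payload = json.loads(raw.decode("utf-8"))
    payload["image"] = base64.b64decode(payload["image"])
    return payload
```

Because image and position travel in the same signal, the terminal never has to correlate two separate messages before drawing the indicator on the frame it belongs to.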
  • a monitored person monitoring system includes the above-described sensor device and any one of the above-described terminal devices.
  • Since such a monitored person monitoring system includes the above-described sensor device and any one of the above-described terminal devices, it is easy to find the monitored person in the image.
  • a terminal device, a display method for the terminal device, a sensor device, and a monitored person monitoring system can thus be provided.

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nursing (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)
  • Accommodation For Nursing Or Treatment Tables (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Telephone Function (AREA)
  • Telephonic Communication Services (AREA)

Abstract

A terminal device and a display method therefor receive a monitoring information communication signal containing image data of an image of a person to be monitored and person position data relating to the position, on the image, of a person appearing in the image, and display, on a display unit, the image of the image data together with a person position marker that indicates, based on the person position data, the position of the person in the image. A sensor device generates person position data relating to the position, in an image captured by an image pickup unit, of a person appearing in the image, and transmits a monitoring information communication signal containing the generated data. A person-to-be-monitored monitoring system is provided with the sensor device and the terminal device described above.

Description

Terminal device, display method for terminal device, sensor device, and monitored person monitoring system
 The present invention relates to a terminal device suitably used in a monitored person monitoring system for monitoring a monitored person who is a monitoring target to be monitored, a display control method for the terminal device, a sensor device suitably used in the monitored person monitoring system, and the monitored person monitoring system.
 Japan has become an aging society, more precisely a super-aging society in which the aging rate, the ratio of the population aged 65 or over to the total population, exceeds 21%, owing to the rise in living standards, the improvement of sanitation, and the advances in medical care that accompanied the post-war period of high economic growth. In 2005, the population aged 65 or over was about 25.56 million against a total population of about 127.65 million, and it is predicted that in 2020 the elderly population will be about 34.56 million against a total population of about 124.11 million. In such an aging society, the number of people who need nursing or care because of illness, injury, old age, and the like is expected to grow beyond what arises in an ordinary, non-aging society. Japan is also a society with a declining birthrate, with a total fertility rate of 1.43 in 2013, for example. As a result, so-called old-old care, in which an elderly family member (a spouse, child, or sibling) cares for an elderly person requiring nursing or care, has also begun to occur.
 People requiring nursing or care enter hospitals or welfare facilities for the elderly (under Japanese law, short-term admission facilities for the elderly, nursing homes for the elderly, special nursing homes for the elderly, and the like) and receive nursing or care there. In such facilities, situations can arise in which a person requiring care is injured, for example by falling from a bed or falling while walking, or leaves the bed and wanders. Such situations must be dealt with as quickly as possible; left unattended, they may develop into more serious situations. For this reason, nurses, caregivers, and the like in these facilities confirm the residents' safety and condition by making regular rounds.
 However, the increase in the number of nurses and caregivers has not kept up with the increase in the number of people requiring nursing or care, and the nursing and caregiving industries are chronically short-staffed. Moreover, because fewer nurses and caregivers are on duty during the evening and night shifts than during the day shift, the workload per person increases, and a reduction of that workload is demanded. The situation of old-old care is no exception in these facilities, and elderly nurses or caregivers are often seen caring for elderly people requiring care. In general, physical strength declines with age, so even a healthy elderly caregiver bears a heavier nursing burden than a young one, and his or her movements and judgments also become slower.
 To alleviate this labor shortage and the burden on nurses and caregivers, technologies that supplement nursing and caregiving work are in demand. For this reason, monitored person monitoring techniques that monitor a monitored person who is a monitoring target, such as a person requiring care, have been researched and developed in recent years.
 One such technique is, for example, the nurse call system disclosed in Patent Document 1. The nurse call system disclosed in Patent Document 1 has a nurse call slave unit installed at a bed with which a patient calls a nurse, and a nurse call master unit installed at a nurse station for answering calls from the nurse call slave unit. The system includes a camera that images the patient on the bed from above the bed, and state determination means that determines, from the video captured by the camera, the occurrence of at least one of a state in which the patient has raised the upper body and a state in which the patient has left the bed, and outputs a caution state occurrence signal; the nurse call master unit has notification means that performs a notification operation upon receiving the caution state occurrence signal. The nurse call system further includes a portable terminal carried by a nurse for answering calls from the nurse call slave unit, and communication control means that, upon receiving the caution state occurrence signal, transmits the video captured by the camera to the portable terminal.
 Meanwhile, in terms of safety confirmation, a person living alone is in the same position as the person requiring care described above, and is likewise a person to be monitored.
 When the camera video is displayed on a terminal device, as in the nurse call system disclosed in Patent Document 1, a monitoring person such as a nurse can visually grasp the state of a monitored person such as a person requiring care, which is convenient. However, when the monitored person appears small in the image or the display has a low resolution, it is difficult for the monitoring person to find the monitored person in the image. In particular, when the terminal device is a mobile terminal device such as a tablet or a smartphone, the display screen is relatively small, making it all the harder to find the monitored person in the image.
JP 2014-90913 A
 The present invention has been made in view of the above circumstances, and an object thereof is to provide a terminal device, a display method for the terminal device, a sensor device, and a monitored person monitoring system that make it easy to find the monitored person in an image.
 A terminal device according to the present invention and a display method therefor receive a monitoring information communication signal containing image data of an image of a monitored person and person position data relating to the position, on the image, of a person appearing in the image, and display on a display unit the image of the image data and, in that image, a person position indicator representing the position of the person based on the person position data. A sensor device according to the present invention generates person position data relating to the position, in the image, of a person appearing in the image captured by an imaging unit, and transmits a monitoring information communication signal containing that data. A monitored person monitoring system according to the present invention includes such a sensor device and such a terminal device.
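The terminal-side display described above has to map the received person position data to a position for the person position indicator in the displayed image. A minimal sketch, assuming the person position data is a normalized image-plane center (cx, cy in [0, 1]) and a fixed-size rectangular indicator; both assumptions are illustrative, not taken from the patent:

```python
def marker_rect(img_w, img_h, person_pos, marker_w=40, marker_h=40):
    """Map normalized person position data to a marker rectangle
    (x, y, w, h) in display pixel coordinates, clamped so that the
    marker stays entirely inside the image."""
    cx = person_pos["cx"] * img_w
    cy = person_pos["cy"] * img_h
    x = min(max(int(cx - marker_w / 2), 0), img_w - marker_w)
    y = min(max(int(cy - marker_h / 2), 0), img_h - marker_h)
    return (x, y, marker_w, marker_h)
```

The rectangle returned here is what a drawing layer would outline over the camera frame, giving the user the "clue" the description refers to even when the person appears small or the display resolution is low.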
 The above and other objects, features, and advantages of the present invention will become apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a diagram showing the configuration of the monitored person monitoring system in the embodiment.
FIG. 2 is a diagram showing the configuration of a sensor device in the monitored person monitoring system.
FIG. 3 is a diagram for explaining person position data and a person position indicator of a first aspect.
FIG. 4 is a diagram for explaining person position data (person area data) and a person position indicator of a second aspect.
FIG. 5 is a diagram for explaining person position data (person area data) and a person position indicator of a third aspect.
FIG. 6 is a diagram for explaining person position data (person area data) and a person position indicator of a fourth aspect.
FIG. 7 is a diagram for explaining person position data (person area data) and a person position indicator of a fifth aspect.
FIG. 8 is a diagram showing the configuration of a mobile terminal device in the monitored person monitoring system.
FIG. 9 is a flowchart showing operations of the sensor device in the monitored person monitoring system relating to monitoring information.
FIG. 10 is a flowchart showing operations of the mobile terminal device in the monitored person monitoring system relating to monitoring information.
FIG. 11 is a diagram showing an example of a standby screen displayed on the mobile terminal device in the monitored person monitoring system.
FIG. 12 is a diagram showing an example of a monitoring information screen displayed on the mobile terminal device in the monitored person monitoring system.
FIG. 13 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device, in which the person area is enlarged.
FIG. 14 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device, in which the person area and a "corresponding" button overlap.
FIG. 15 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device, that displays the person area at the center of the image display area.
FIG. 16 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device, that displays the person area and the "corresponding" button so that they do not overlap each other.
 An embodiment according to the present invention will be described below with reference to the drawings. In the drawings, components given the same reference signs are the same components, and duplicate description of them is omitted where appropriate. In this specification, a reference sign without a suffix is used for generic reference, and a reference sign with a suffix is used to refer to an individual component.
 The monitored person monitoring system in the embodiment is a system that monitors a monitored person (watched person) Ob who is a monitoring target to be monitored (watched over). The monitored person monitoring system includes a sensor device and a terminal device communicably connected to the sensor device, detects a predetermined behavior of the monitored person based on an image captured by an imaging unit, and notifies the terminal device of the detection result.
 The sensor device in such a monitored person monitoring system includes: a first communication unit that performs communication; an imaging unit that performs imaging; a person position processing unit that generates person position data relating to the position, in the image, of a person appearing in the image captured by the imaging unit; and a first monitoring processing unit that detects a predetermined behavior of the monitored person based on the image captured by the imaging unit and causes the first communication unit to transmit a monitoring information communication signal containing the detection result, the image data of the image captured by the imaging unit, and the person position data generated by the person position processing unit. The terminal device in the monitored person monitoring system includes: a second communication unit that receives the monitoring information communication signal; a display unit that performs display; and a second monitoring processing unit having a display processing unit that displays on the display unit the detection result and the image of the image data contained in the monitoring information communication signal received by the second communication unit, and that displays, in the displayed image, a person position indicator representing the position of the person based on the person position data contained in the monitoring information communication signal. The monitoring information communication signal may be transmitted directly from the sensor device to the terminal device, or may be transmitted indirectly from the sensor device to the terminal device via another device (for example, the management server device described below). The terminal device may be a single type of device, but in the following description the terminal device is of two types: a fixed terminal device and a mobile terminal device.
 The main difference between the fixed terminal device and the mobile terminal device is that the fixed terminal device is operated in a fixed manner, while the mobile terminal device is operated while being carried by a monitoring person (user) such as a nurse or a caregiver. Since the fixed terminal device and the mobile terminal device are substantially the same, the mobile terminal device will mainly be described below.
 FIG. 1 is a diagram showing the configuration of the monitored person monitoring system in the embodiment. FIG. 2 is a diagram showing the configuration of a sensor device in the monitored person monitoring system of the embodiment. FIG. 3 is a diagram for explaining person position data and a person position indicator of a first aspect. FIG. 4 is a diagram for explaining person position data (person area data) and a person position indicator of a second aspect. FIG. 5 is a diagram for explaining person position data (person area data) and a person position indicator of a third aspect. FIG. 6 is a diagram for explaining person position data (person area data) and a person position indicator of a fourth aspect. FIG. 7 is a diagram for explaining person position data (person area data) and a person position indicator of a fifth aspect. FIG. 8 is a diagram showing the configuration of a mobile terminal device in the monitored person monitoring system of the embodiment.
 More specifically, as shown in FIG. 1, the monitored person monitoring system MS includes, for example, one or more sensor devices SU (SU-1 to SU-4), a management server device SV, a fixed terminal device SP, one or more mobile terminal devices TA (TA-1, TA-2), and a private branch exchange (PBX) CX, which are communicably connected, by wire or wirelessly, via a network (communication line) NW such as a LAN (Local Area Network). The network NW may include relay devices that relay communication signals, such as repeaters, bridges, and routers. In the example shown in FIG. 1, the sensor devices SU-1 to SU-4, the management server device SV, the fixed terminal device SP, the mobile terminal devices TA-1 and TA-2, and the private branch exchange CX are communicably connected to one another by a mixed wired and wireless LAN (for example, a LAN in accordance with the IEEE 802.11 standard) NW including an L2-switch line concentrator (hub, HUB) LS and an access point AP. More specifically, the sensor devices SU-1 to SU-4, the management server device SV, the fixed terminal device SP, and the private branch exchange CX are connected to the line concentrator LS, and the mobile terminal devices TA-1 and TA-2 are connected to the line concentrator LS via the access point AP.
 The network NW constitutes a so-called intranet through the use of Internet protocols such as TCP (Transmission Control Protocol) and IP (Internet Protocol).
 The monitored person monitoring system MS is installed at a location appropriate to the monitored person Ob. The monitored person (watched person) Ob is, for example, a person who needs nursing because of illness or injury, a person who needs care because of a decline in physical ability, or a person living alone. In particular, from the viewpoint of enabling early detection and early response, the monitored person Ob is preferably a person for whom detection is needed when a predetermined inconvenient event, such as an abnormal state, occurs. For this reason, the monitored person monitoring system MS is suitably installed in a building such as a hospital, a welfare facility for the elderly, or a dwelling, according to the type of the monitored person Ob. In the example shown in FIG. 1, the monitored person monitoring system MS is installed in the building of a care facility that includes a plurality of rooms RM in which a plurality of monitored persons Ob live and a plurality of other rooms such as a nurse station.
 The private branch exchange (line switching device) CX is connected to the network NW and controls extension telephone functions such as origination, reception, and calls between the mobile terminal devices TA, thereby implementing extension calls among the mobile terminal devices TA. It is also connected via a public telephone network PN, such as a fixed telephone network or a mobile telephone network, to outside telephones TL such as fixed telephones and mobile telephones, and controls outside-line telephone functions such as origination, reception, and calls between an outside telephone TL and a mobile terminal device TA, thereby implementing outside-line calls between the outside telephone TL and the mobile terminal device TA. The private branch exchange CX is, for example, a digital exchange or an IP-PBX (Internet Protocol Private Branch eXchange).
 The sensor device SU has a communication function for communicating with the other devices SV, SP, and TA via the network NW; it detects a predetermined behavior of the monitored person Ob and notifies the management server device SV of the detection result, accepts a nurse call and notifies the management server device SV to that effect, conducts voice calls with the terminal devices SP and TA, and generates images including video and distributes the video to the terminal devices SP and TA. In this embodiment, when transmitting an image to the terminal devices SP and TA, the sensor device SU generates person position data relating to the position, on the image, of a person appearing in the image, and transmits it together with the image to the management server device SV. As shown in FIG. 2, for example, such a sensor device SU includes an imaging unit 11, a sensor-side sound input/output unit (SU sound input/output unit) 12, a nurse call reception operation unit 13, a sensor-side control processing unit (SU control processing unit) 14, a sensor-side communication interface unit (SU communication IF unit) 15, and a sensor-side storage unit (SU storage unit) 16.
 The imaging unit 11 is a device that is connected to the SU control processing unit 14 and, under the control of the SU control processing unit 14, performs imaging and generates images (image data). The images include still images (still image data) and moving images (moving image data). The imaging unit 11 is arranged so as to be able to monitor the space where the monitored person Ob, the monitoring target to be monitored, is expected to be located (the location space; in the example shown in FIG. 1, the room RM where the unit is installed); it images the location space from above as an imaging target, generates an image (image data) overlooking the imaging target, and outputs the image of the imaging target (target image) to the SU control processing unit 14. Preferably, because this gives a high probability of capturing the whole of the monitored person Ob, the imaging unit 11 is arranged so that it can image the imaging target from directly above a preset planned head position (usually the position where the pillow is placed), where the head of the monitored person Ob is expected to lie in the bedding (for example, a bed) on which the monitored person Ob lies. With this imaging unit 11, the sensor device SU acquires an image of the monitored person Ob captured from above the monitored person Ob, preferably an image captured from directly above the planned head position.
 Such an imaging unit 11 may be a device that generates visible-light images, but in this embodiment it is a device that generates infrared images so that the monitored person Ob can be monitored even in relative darkness. In this embodiment, such an imaging unit 11 is, for example, a digital infrared camera including an imaging optical system that forms an infrared optical image of the imaging target on a predetermined imaging surface, an image sensor arranged with its light receiving surface coinciding with the imaging surface, which converts the infrared optical image of the imaging target into an electrical signal, and an image processing unit that generates image data representing an infrared image of the imaging target by processing the output of the image sensor. In this embodiment, the imaging optical system of the imaging unit 11 is preferably a wide-angle optical system (a so-called wide-angle lens, including a fisheye lens) having an angle of view capable of imaging the entire room RM in which it is installed.
 The SU sound input/output unit 12 is a circuit that inputs and outputs sound. That is, the SU sound input/output unit 12 is connected to the SU control processing unit 14 and is a circuit for generating and outputting sound corresponding to an electrical signal representing sound under the control of the SU control processing unit 14, and for acquiring external sound and inputting it to the sensor device SU. The SU sound input/output unit 12 includes, for example, a speaker that converts an electrical sound signal (sound data) into a mechanical vibration signal of sound (an acoustic signal), and a microphone that converts a mechanical vibration signal of sound in the audible range into an electrical signal. The SU sound input/output unit 12 outputs an electrical signal representing external sound to the SU control processing unit 14, and converts an electrical signal input from the SU control processing unit 14 into a mechanical vibration signal of sound and outputs it.
 The nurse call reception operation unit 13 is connected to the SU control processing unit 14 and is a switch circuit, such as a push-button switch, for inputting a nurse call to the sensor device SU. The nurse call reception operation unit 13 may be connected to the SU control processing unit 14 by wire, or by short-range wireless communication such as the Bluetooth (registered trademark) standard.
The SU communication IF unit 15 is a communication circuit that is connected to the SU control processing unit 14 and performs communication under the control of the SU control processing unit 14. The SU communication IF unit 15 generates a communication signal containing the data to be transferred that is input from the SU control processing unit 14, in accordance with the communication protocol used in the network NW of the monitored-person monitoring system MS, and transmits the generated communication signal to the other devices SV, SP, and TA via the network NW. The SU communication IF unit 15 also receives communication signals from the other devices SV, SP, and TA via the network NW, extracts the data from each received communication signal, converts the extracted data into a format that the SU control processing unit 14 can process, and outputs it to the SU control processing unit 14. The SU communication IF unit 15 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
The SU storage unit 16 is a circuit that is connected to the SU control processing unit 14 and stores various predetermined programs and various predetermined data under the control of the SU control processing unit 14. The various predetermined programs include control processing programs such as: an SU control program that controls each unit of the sensor device SU according to the function of that unit; a person position processing program that generates person position data relating to the position, in the image captured by the imaging unit 11, of the person appearing in that image; an SU monitoring processing program that detects a predetermined action of the monitored person Ob and notifies the predetermined terminal devices SP and TA, via the management server device SV, of the detection result together with the image (image data) captured by the imaging unit 11 and the person position data generated by the person position processing program; a nurse call processing program that, when a nurse call is accepted at the nurse call reception operation unit 13, notifies the management server device SV to that effect and conducts a voice call with the terminal devices SP and TA using the SU sound input/output unit 12 and the like; and an SU streaming processing program that distributes, by streaming, the moving image generated by the imaging unit 11 to a terminal device SP or TA that has requested that moving image. The various predetermined data include the data needed to execute each program, such as a sensor device identifier (sensor ID), which is an identifier for specifying and identifying the sensor device SU of the own device, and the communication address of the management server device SV. The SU storage unit 16 includes, for example, a ROM (Read Only Memory), which is a nonvolatile storage element, and an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable nonvolatile storage element. The SU storage unit 16 also includes a RAM (Random Access Memory) or the like that serves as a working memory for the SU control processing unit 14, storing data generated during execution of the predetermined programs.
The SU control processing unit 14 is a circuit for controlling each unit of the sensor device SU according to the function of that unit; detecting a predetermined action of the monitored person Ob and reporting the detection result, together with the image (image data) and the person position data, to the management server device SV; accepting a nurse call and notifying the management server device SV to that effect; conducting a voice call with the terminal devices SP and TA; and generating images including moving images and distributing the moving images to the terminal devices SP and TA. The SU control processing unit 14 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits. By executing the control processing programs, the SU control processing unit 14 functionally includes a sensor-side control unit (SU control unit) 141, a sensor-side monitoring processing unit (SU monitoring processing unit) 142, a person position processing unit 143, a nurse call processing unit 144, and a sensor-side streaming processing unit (SU streaming processing unit) 145.
The SU control unit 141 controls each unit of the sensor device SU according to the function of that unit, and governs the overall control of the sensor device SU.
The SU monitoring processing unit 142 detects, based on an image, a predetermined action (state, situation) of the monitored person Ob that has been set in advance, and reports (notifies, transmits) the detection result to the management server device SV together with that image and, as described later, the person position data generated by the person position processing unit 143.
More specifically, in the present embodiment, the predetermined actions are, for example, the following four: rising, in which the monitored person Ob gets up; leaving, in which the monitored person Ob leaves the bedding; falling from bed, in which the monitored person Ob falls from the bedding; and falling over, in which the monitored person Ob falls down. The SU monitoring processing unit 142, for example, detects the head of the monitored person Ob based on the target image captured by the imaging unit 11, and detects the rising, leaving, fall from bed, and fall of the monitored person Ob based on the temporal change in the size of the detected head. In more detail, first, the location area of the bedding and first to third thresholds Th1 to Th3 are stored in advance in the SU storage unit 16 as part of the various predetermined data. The first threshold Th1 is a value for discriminating between the head size of a lying posture and the head size of a sitting posture within the bedding location area. The second threshold Th2 is a value for identifying whether a head size corresponds to a standing posture within the living room RM outside the bedding location area. The third threshold Th3 is a value for identifying whether a head size corresponds to a lying posture within the living room RM outside the bedding location area. The SU monitoring processing unit 142 then extracts a moving-object region from the target image as the region of the person of the monitored person Ob, for example by the background subtraction method or the frame difference method. Next, the SU monitoring processing unit 142 extracts the head region of the monitored person Ob from this extracted moving-object region, for example by a circular or elliptical Hough transform, by pattern matching using a head model prepared in advance, or by a neural network trained for head detection. Next, the SU monitoring processing unit 142 detects rising, leaving, fall from bed, and fall from the position and size of the extracted head. For example, when the position of the extracted head is within the bedding location area and the size of the extracted head changes over time, as judged with the first threshold Th1, from the lying-posture size to the sitting-posture size, the SU monitoring processing unit 142 determines that the person has risen and detects the rising. For example, when the position of the extracted head changes over time from inside the bedding location area to outside it, and the size of the extracted head changes over time, as judged with the second threshold Th2, from some size to the standing-posture size, the SU monitoring processing unit 142 determines that the person has left the bed and detects the leaving. For example, when the position of the extracted head changes over time from inside the bedding location area to outside it, and the size of the extracted head changes over time, as judged with the third threshold Th3, from some size to the lying-posture size, the SU monitoring processing unit 142 determines that the person has fallen from the bed and detects the fall from bed. For example, when the position of the extracted head is within the living room RM outside the bedding location area and the size of the extracted head changes over time, as judged with the third threshold Th3, from some size to the lying-posture size, the SU monitoring processing unit 142 determines that the person has fallen over and detects the fall.
Since the region of a person in the image falls within a predetermined size (area) range, the moving-object region extracted as the person's region also falls within that predetermined size range. For this reason, a range for extraction as a moving-object region may be set in advance, and when a moving-object region extracted from the target image, for example by the background subtraction method or the frame difference method, falls outside this range, that region may be excluded from the moving-object regions extracted as person regions.
When the predetermined action is detected in this way, the SU monitoring processing unit 142 reports the detection result, the target image used in obtaining the detection result, and the person position data for that target image to the management server device SV via the SU communication IF unit 15. More specifically, the SU monitoring processing unit 142 transmits, via the SU communication IF unit 15 to the management server device SV, a communication signal (first monitoring information communication signal) containing the sensor ID of the own device, the detection result (in the present embodiment, one or more of rising, leaving, fall from bed, and fall), the target image used for detecting the predetermined action, and the person position data for that target image. The image may be at least one of a still image and a moving image; in the present embodiment, as described later, a still image is reported first, and a moving image is distributed in response to a user request. Alternatively, a moving image may be distributed first, or a still image and a moving image may both be transmitted and displayed on a split screen of the terminal devices SP and TA.
The person position data is generated by the person position processing unit 143 as described later, and is stored in the first monitoring information communication signal as metadata of the image data of the target image. For example, the image data and the person position data are accommodated in an Exif (Exchangeable image file format) file. More specifically, an Exif file is a file conforming to the JPEG (Joint Photographic Experts Group) file format, and so-called Exif information such as shooting conditions is stored in an application marker segment; the person position data is stored in this application marker segment. As another example, the image data and the person position data are accommodated as form data in a multipart-format file under HTTP (Hypertext Transfer Protocol). More specifically, the image data is accommodated as one piece of form data in the multipart-format file, and the person position data is accommodated as another piece of form data in that file.
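The second (multipart) option can be sketched with standard-library code that packs the JPEG image data and the person position data as two form parts of one multipart/form-data body. The part names (`"image"`, `"person_position"`) and the JSON encoding of the coordinates are illustrative assumptions; the embodiment does not fix them.

```python
import json
import uuid

def build_multipart(jpeg_bytes, person_position):
    """Pack image data and person position data as two form parts
    of one HTTP multipart/form-data body."""
    boundary = uuid.uuid4().hex
    pos_json = json.dumps(person_position).encode("utf-8")
    parts = [
        # Form part 1: the target image (image data).
        (b'Content-Disposition: form-data; name="image"; filename="target.jpg"\r\n'
         b'Content-Type: image/jpeg\r\n\r\n' + jpeg_bytes),
        # Form part 2: the person position data as metadata.
        (b'Content-Disposition: form-data; name="person_position"\r\n'
         b'Content-Type: application/json\r\n\r\n' + pos_json),
    ]
    body = b"".join(
        b"--" + boundary.encode() + b"\r\n" + p + b"\r\n" for p in parts
    ) + b"--" + boundary.encode() + b"--\r\n"
    content_type = "multipart/form-data; boundary=" + boundary
    return content_type, body
```

A receiver such as the management server device SV would then recover the two form parts independently, which is the point of the multipart arrangement: the position metadata travels with, but separate from, the image bytes.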
The person position processing unit 143 generates person position data relating to the position, in the image captured by the imaging unit 11, of the person appearing in that image. In the present embodiment, the image is the target image used for detecting the predetermined action of the monitored person. As the person position data, for example, data of the following first to fifth aspects can be used.
The person position data of the first aspect is coordinate data of the position itself, on the image, of the person appearing in the image. When the person position data of the first aspect is used, more specifically, the person position processing unit 143 first extracts, like the SU monitoring processing unit 142, a moving-object region MB as the person's region from the target image captured by the imaging unit 11, for example by the background subtraction method or the frame difference method. Then, as shown in FIG. 3, the person position processing unit 143 obtains the centroid position P1 of the extracted moving-object region MB, and obtains the coordinate data (x, y) of this centroid position P1 as the person position data of the first aspect (the position of the person on the image = centroid position P1). The coordinate data (x, y), as with the person position data of the second to fifth aspects below, is given, for example, as a pixel position with the upper-left vertex of the target image as the coordinate origin. When the person position data of the first aspect is used, the person position indicator displayed on the mobile terminal device TA, as described later, is an image HP1 of the point represented by the coordinate data (x, y) of the centroid position P1.
The person position data of the second aspect is person region data relating to the region of the person that includes the person's position on the image (for example, the centroid position P1), where the person region data is coordinate data representing a rectangle containing all or part of the person's region. When the person position data of the second aspect is used, more specifically, the person position processing unit 143 extracts the moving-object region MB as the person's region as described above, obtains, as shown in FIG. 4, a rectangle containing all or part of the moving-object region MB (for example, the circumscribed rectangle of the moving-object region MB), and obtains the coordinate data representing this rectangle as the person position data (person region data) of the second aspect. The coordinate data representing the rectangle consists, for example, of the coordinate data (x0, y0) and (x1, y1) of the two endpoints P20 and P21 of one diagonal of the rectangle. As another example, the coordinate data representing the rectangle consists of the coordinate data of the four vertices of the rectangle (not shown). When the person position data of the second aspect is used, the person position indicator displayed on the mobile terminal device TA, as described later, is the image represented by the person region data (x0, y0), (x1, y1), that is, a rectangular image (an image of a rectangular outline) HP2.
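The first and second aspects can be illustrated together: given a binary mask of the extracted moving-object region MB, compute the centroid P1 and the circumscribed rectangle (x0, y0)-(x1, y1). This is a minimal sketch; the mask representation (a list of rows of 0/1 values, origin at the upper-left as in the embodiment's coordinate convention) is an assumption.

```python
def centroid_and_bbox(mask):
    """From a binary mask of the moving-object region MB, return
    (centroid P1, circumscribed rectangle (x0, y0, x1, y1)), or None
    if the mask is empty."""
    xs, ys = [], []
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    centroid = (sum(xs) / len(xs), sum(ys) / len(ys))  # first aspect: P1
    bbox = (min(xs), min(ys), max(xs), max(ys))        # second aspect: P20, P21
    return centroid, bbox
```

Both outputs are small fixed-size coordinate payloads, which is why either fits easily into the metadata slot of the first monitoring information communication signal.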
The person position data of the third aspect is person region data relating to the region of the person that includes the person's position on the image (for example, the centroid position P1), where the person region data is coordinate data representing the outline of the person's region. When the person position data of the third aspect is used, more specifically, the person position processing unit 143 extracts the moving-object region MB as the person's region as described above, obtains, as shown in FIG. 5, the outline of the moving-object region MB, and obtains the coordinate data representing this outline as the person position data (person region data) of the third aspect. The coordinate data representing the outline of the moving-object region MB consists of the coordinate data (x0, y0), (x1, y1), (x2, y2), ..., (xpc, ypc) of the vertices P30, P31, P32, ..., P3C of the outline of the moving-object region MB. In more detail, when the moving-object region MB has a relatively complicated shape, the outline has many vertices, and the data may exceed the capacity allotted to the person position data in the first monitoring information communication signal (for example, the data capacity of several to a dozen or so coordinate pairs). For this reason, the person position processing unit 143 groups the pixels constituting the moving-object region MB, as the person's region, into mosaics each formed of a plurality of adjacent pixels, so that the moving-object region MB is composed of a plurality of mosaics (the moving-object region MB is mosaicked). Enlarging the mosaics reduces the number of mosaics constituting the moving-object region MB, and as a result the number of vertices of the outline of the moving-object region MB composed of those mosaics is also reduced. Accordingly, the person position processing unit 143 successively enlarges the mosaic size, adjusting it until the data fits within the capacity allotted to the person position data in the first monitoring information communication signal. When the person position data of the third aspect is used, the person position indicator displayed on the mobile terminal device TA, as described later, is the image represented by the person region data (x0, y0), (x1, y1), (x2, y2), ..., (xpc, ypc), that is, an image HP3 of the outline of the person's region.
The person position data of the fourth aspect is person region data relating to the region of the person that includes the person's position on the image (for example, the centroid position P1), where the person region data is coordinate data representing a circle containing all or part of the person's region. When the person position data of the fourth aspect is used, more specifically, the person position processing unit 143 extracts the moving-object region MB as the person's region as described above, obtains, as shown in FIG. 6, a circle containing all or part of the moving-object region MB (for example, the circumscribed circle of the moving-object region MB), and obtains the coordinate data representing this circle as the person position data (person region data) of the fourth aspect. The coordinate data representing the circle consists, for example, of the coordinate data (x, y) of the center point P4 of the circle and the radius r. As another example, the coordinate data representing the circle consists of the coordinate data (x, y) of the centroid position P4 of the moving-object region MB and a preset fixed radius r. When the person position data of the fourth aspect is used, the person position indicator displayed on the mobile terminal device TA, as described later, is the image represented by the person region data (x, y), r, that is, a circular image (an image of a circular outline) HP4.
The person position data of the fifth aspect is person region data relating to the region of the person that includes the person's position on the image (for example, the centroid position P1), where the person region data is coordinate data representing an ellipse containing all or part of the person's region. When the person position data of the fifth aspect is used, more specifically, the person position processing unit 143 extracts the moving-object region MB as the person's region as described above, obtains, as shown in FIG. 7, an ellipse containing all or part of the moving-object region MB, and obtains the coordinate data representing this ellipse as the person position data (person region data) of the fifth aspect. In more detail, the person position processing unit 143 sets one suitable first set point P51 on the outline of the moving-object region MB as the person's region, obtains the point on the outline farthest from this first set point P51 as a second set point P53, and takes the line segment P51P53 connecting the first set point P51 and the second set point P53 as the major axis of the ellipse. Next, the person position processing unit 143 obtains the two intersections of the outline with the straight line that passes through the midpoint P50 of the line segment P51P53, taken as the major axis, and is orthogonal to the line segment P51P53; obtains the distance from each of these two intersections to the midpoint P50; and takes the intersection giving the longer distance as the intersection P52 of the minor axis with the ellipse. The major axis, the minor axis, and the intersection of the major and minor axes are thereby determined, and the ellipse is obtained. The coordinate data representing the ellipse consists, for example, of the coordinate data (x0, y0), (x1, y1), (x2, y2) of the intersection P50 of the major and minor axes, one intersection (first set point) P51 of the ellipse with the major axis, and one intersection P52 of the ellipse with the minor axis. As another example, the coordinate data representing the ellipse consists of the coordinate data (x0, y0) of the intersection P50 of the major and minor axes, the angle between the major axis (or minor axis) and the x axis (or y axis), the length of the major axis, and the length of the minor axis. When the person position data of the fifth aspect is used, the person position indicator displayed on the mobile terminal device TA is the image represented by the person region data (x0, y0), (x1, y1), (x2, y2), that is, an elliptical image (an image of an elliptical outline) HP5.
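The axis construction above can be sketched from a polygonal outline. One assumption departs from the text: instead of computing the exact intersection of the perpendicular through P50 with the outline, this sketch approximates P52 by the contour vertex lying nearest that perpendicular (within a tolerance) and farthest from P50, which is adequate for a densely sampled outline.

```python
import math

def fit_ellipse_axes(contour):
    """From a list of outline points, return (P50, P51, P52, P53):
    the axis intersection, the two major-axis ends, and the minor-axis end."""
    p51 = contour[0]                                     # first set point on the outline
    p53 = max(contour, key=lambda p: math.dist(p, p51))  # farthest outline point: other major-axis end
    p50 = ((p51[0] + p53[0]) / 2, (p51[1] + p53[1]) / 2) # midpoint of P51P53
    ax, ay = p53[0] - p51[0], p53[1] - p51[1]            # major-axis direction
    norm = math.hypot(ax, ay)

    def along_axis(p):
        # |projection of (p - P50) onto the major axis|: zero on the perpendicular through P50
        return abs((p[0] - p50[0]) * ax + (p[1] - p50[1]) * ay) / norm

    near_perp = [p for p in contour if along_axis(p) < 0.3 * norm]
    p52 = max(near_perp, key=lambda p: math.dist(p, p50))  # farther perpendicular intersection
    return p50, p51, p52, p53
```

The returned triple (P50, P51, P52) corresponds exactly to the first coordinate-data encoding of the ellipse described above.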
The person position processing unit 143 then notifies the SU monitoring processing unit 142 of the person position data generated in this way. For example, in the present embodiment, the person position data of the second aspect is used, and the person position processing unit 143 notifies the SU monitoring processing unit 142 of this second-aspect person position data. The following description covers the case where the person position data of the second aspect is used, but the cases where the person position data of the other aspects is used can be described similarly.
When a nurse call is accepted at the nurse call reception operation unit 13, the nurse call processing unit 144 reports this to the management server device SV and conducts a voice call with the terminal devices SP and TA using the SU sound input/output unit 12 and the like. More specifically, when the nurse call reception operation unit 13 is operated, the nurse call processing unit 144 transmits, via the SU communication IF unit 15 to the management server device SV, a first nurse call notification communication signal containing the sensor ID of the own device and nurse call reception information indicating that a nurse call has been accepted. The nurse call processing unit 144 then uses the SU sound input/output unit 12 and the like to conduct a voice call with the terminal devices SP and TA, for example by VoIP (Voice over Internet Protocol).
When a request for moving-image distribution arrives from a fixed terminal device SP or a mobile terminal device TA via the SU communication IF unit 15, the SU streaming processing unit 145 distributes the moving image generated by the imaging unit 11 (for example, a live moving image) by streaming, via the SU communication IF unit 15, to the requesting fixed terminal device SP or mobile terminal device TA.
FIG. 1 shows, as an example, four sensor devices, first to fourth sensor devices SU-1 to SU-4. The first sensor device SU-1 is installed in the living room RM-1 (not shown) of Ms. A Ob-1, one of the monitored persons Ob; the second sensor device SU-2 is installed in the living room RM-2 (not shown) of Mr. B Ob-2, one of the monitored persons Ob; the third sensor device SU-3 is installed in the living room RM-3 (not shown) of Mr. C Ob-3, one of the monitored persons Ob; and the fourth sensor device SU-4 is installed in the living room RM-4 (not shown) of Mr. D Ob-4, one of the monitored persons Ob.
 The management server device SV has a communication function for communicating with the other devices SU, TA and SP via the network NW. It receives the detection result, the target image, and the person position data concerning the monitored person Ob from the sensor device SU, manages information concerning the monitoring of the monitored person Ob (monitoring information), and notifies (re-notifies, transmits) the received detection result, target image, and person position data concerning the monitored person Ob to the predetermined terminal devices SP and TA. More specifically, the management server device SV stores in advance a correspondence relationship (notification destination correspondence relationship) between the notification source sensor device SU (sensor ID) and the notification destination (re-notification destination) terminal devices SP, TA (terminal IDs), and a correspondence relationship (communication address correspondence relationship) between each device SU, SP, TA (each ID) and its communication address. The terminal ID is an identifier for specifying and identifying the terminal device SP, TA. First, upon receiving a first monitoring information communication signal, the management server device SV stores (records), as monitoring information of the monitored person Ob, the notification source (transmission source) sensor device of the received first monitoring information communication signal and the data contained in the received signal in association with each other. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP, TA corresponding to the notification source sensor device SU of the received first monitoring information communication signal, and transmits a second monitoring information communication signal to these notification destination terminal devices SP, TA. This second monitoring information communication signal contains the sensor ID, the detection result, the target image, and the person position data contained in the received first monitoring information communication signal, as well as, as a video download destination, the communication address corresponding to the sensor device SU having that sensor ID. The communication address is obtained from the communication address correspondence relationship. Also, upon receiving a first nurse call notification communication signal, the management server device SV stores (records), as monitoring information of the monitored person Ob, the notification source sensor device of the received first nurse call notification communication signal and the data contained in the received signal in association with each other. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP, TA corresponding to the notification source sensor device of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to these notification destination terminal devices SP, TA. This second nurse call notification communication signal contains the sensor ID and the nurse call acceptance information contained in the received first nurse call notification communication signal. Note that the second nurse call notification communication signal may also contain, as a video download destination, the communication address corresponding to the sensor device SU having the sensor ID contained in the received first nurse call notification communication signal. The management server device SV provides a client (in this embodiment, the terminal devices SP, TA, etc.) with data in response to a request from the client. Such a management server device SV can be configured by, for example, a computer with a communication function.
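 The forwarding described above amounts to two lookups keyed by sensor ID. The following is a minimal illustrative sketch, not part of the patent; the table names, field names, and function are assumptions made for illustration only:

```python
# Hypothetical sketch of the management server device SV's notification routing.
# NOTIFY_MAP models the notification destination correspondence relationship
# (sensor ID -> terminal IDs); ADDR_MAP models the communication address
# correspondence relationship (device ID -> communication address).
NOTIFY_MAP = {"SU-1": ["TA-1", "TA-2"], "SU-2": ["TA-1"]}
ADDR_MAP = {"SU-1": "192.0.2.11", "SU-2": "192.0.2.12",
            "TA-1": "192.0.2.21", "TA-2": "192.0.2.22"}

def build_second_signal(first_signal):
    """Turn a received first monitoring information communication signal
    into the second signal forwarded to each notification destination."""
    sensor_id = first_signal["sensor_id"]
    # Carry over the sensor ID, detection result, image and position data.
    second = dict(first_signal)
    # The sensor's own address is included as the video download destination.
    second["download_address"] = ADDR_MAP[sensor_id]
    destinations = [ADDR_MAP[t] for t in NOTIFY_MAP[sensor_id]]
    return second, destinations
```

With `NOTIFY_MAP` as above, a signal from SU-1 would be forwarded to both TA-1 and TA-2, each receiving SU-1's address as the download destination.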
 The fixed terminal device SP includes a communication function for communicating with the other devices SU, SV and TA via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, and the like, and functions as a user interface (UI) of the monitored person monitoring system MS by, for example, inputting predetermined instructions and data to be given to the management server device SV and the mobile terminal device TA and displaying the monitoring information obtained by the sensor device SU. Such a fixed terminal device SP can be configured by, for example, a computer with a communication function. The fixed terminal device SP, an example of the terminal device, operates in the same manner as the mobile terminal device TA; in this specification, however, the mobile terminal device TA, another example of the terminal device, is described.
 The mobile terminal device TA includes a communication function for communicating with the other devices SV, SP and SU via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, a call function for conducting voice calls, and the like. It is a device for inputting predetermined instructions and data to be given to the management server device SV and the sensor device SU, displaying the monitoring information (including video) obtained by the sensor device SU in response to a report from the management server device SV, and answering nurse calls or speaking to the monitored person by voice call with the sensor device SU.
 In the present embodiment, as shown in FIG. 8 for example, such a mobile terminal device TA includes a terminal-side communication interface unit (TA communication IF unit) 31, a terminal-side control processing unit (TA control processing unit) 32, a terminal-side storage unit (TA storage unit) 33, a terminal-side sound input/output unit (TA sound input/output unit) 34, a terminal-side input unit (TA input unit) 35, a terminal-side display unit (TA display unit) 36, and a terminal-side interface unit (TA IF unit) 37.
 Like the SU communication IF unit 15, the TA communication IF unit 31 is a communication circuit that is connected to the TA control processing unit 32 and performs communication under the control of the TA control processing unit 32. The TA communication IF unit 31 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
 Like the SU sound input/output unit 12, the TA sound input/output unit 34 is a circuit that is connected to the TA control processing unit 32, generates and outputs a sound corresponding to an electric signal representing sound under the control of the TA control processing unit 32, and acquires external sound and inputs it to the mobile terminal device TA.
 The TA input unit 35 is a circuit that is connected to the TA control processing unit 32 and accepts predetermined operations and inputs them to the mobile terminal device TA; it is, for example, a plurality of input switches to which predetermined functions are assigned. The predetermined operations include various operations necessary for monitoring, for example an operation of inputting an ID for logging in, an operation of requesting a voice call and an operation of ending it, an operation of requesting live video and an operation of ending it, an operation of inputting an indication of the intention ("respond") to carry out a response (handling, attendance) such as lifesaving, nursing, care or assistance for the notified monitored person Ob, and an operation of inputting an enlarged display instruction, which is an instruction (order, command) to enlarge and display a person appearing in an image. The TA display unit 36 is a circuit that is connected to the TA control processing unit 32 and, under the control of the TA control processing unit 32, displays the predetermined operation content input from the TA input unit 35 and the monitoring information concerning the monitoring of the monitored person Ob monitored by the monitored person monitoring system MS (for example, the type of predetermined behavior detected by the sensor device SU, images (still images and video) of the monitored person Ob, a person position indicator representing the position of a person based on the person position data, the acceptance of a nurse call, and the like); it is, for example, a display device such as an LCD (liquid crystal display) or an organic EL display. In the present embodiment, the TA input unit 35 and the TA display unit 36 constitute a touch panel. In this case, the TA input unit 35 is a position input device, for example of the resistive film type or the capacitive type, that detects and inputs an operation position. In this touch panel, the position input device is provided on the display surface of the TA display unit 36, and one or more input content candidates that can be input are displayed on the TA display unit 36; when a user (monitoring person) such as a nurse or a caregiver touches the display position where the desired input content is displayed, that position is detected by the position input device, and the display content displayed at the detected position is input to the mobile terminal device TA as the user's operation input content.
 The TA IF unit 37 is a circuit that is connected to the TA control processing unit 32 and inputs and outputs data to and from external devices under the control of the TA control processing unit 32; it is, for example, an interface circuit using the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA standard, or an interface circuit using the USB standard.
 The TA storage unit 33 is a circuit that is connected to the TA control processing unit 32 and stores various predetermined programs and various predetermined data under the control of the TA control processing unit 32. The various predetermined programs include control processing programs such as a TA control program for controlling each unit of the mobile terminal device TA according to its function; a TA monitoring processing program for storing (recording) monitoring information concerning the monitoring of the monitored person Ob, such as the detection result and the nurse call received from the sensor device SU via the management server device SV, and displaying the detection result and the nurse call; a TA call processing program for conducting a voice call with the sensor device SU using the TA sound input/output unit 34 and the like; and a streaming processing program for receiving video distributed from the sensor device SU and displaying the distributed video on the TA display unit 36 by streaming playback. The TA monitoring processing program includes a display processing program for displaying on the TA display unit 36 the detection result and the image of the image data contained in the second monitoring information communication signal, and for displaying, in the image displayed on the TA display unit 36, a person position indicator based on the person position data contained in the second monitoring information communication signal. The various predetermined data include data necessary for executing each of the programs, such as the terminal ID of the device itself, screen information displayed on the TA display unit 36, and the monitoring information concerning the monitoring of the monitored person Ob. The TA storage unit 33 includes, for example, a ROM and an EEPROM. The TA storage unit 33 also includes a RAM or the like serving as a so-called working memory of the TA control processing unit 32 that stores data generated during execution of the predetermined programs. The TA storage unit 33 functionally includes a terminal-side monitoring information storage unit (TA monitoring information storage unit) 331 for storing the monitoring information.
 The TA monitoring information storage unit 331 stores the monitoring information of the monitored person Ob transmitted to and received from each of the devices SV, SP and SU. More specifically, in the present embodiment, the TA monitoring information storage unit 331 stores, as the monitoring information, the sensor ID, the detection result, the target image, the person position data, and the communication address of the sensor device SU serving as the video download destination, all contained in the second monitoring information communication signal received from the management server device SV, together with the reception time of that second monitoring information communication signal and the like, in association with one another; it also stores the sensor ID and the nurse call acceptance information contained in the second nurse call notification communication signal received from the management server device SV, together with the reception time of that second nurse call notification communication signal and the like, in association with one another.
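 The stored association described above can be pictured as one record per received signal. The following is a minimal sketch for illustration only; the class and field names are assumptions, since the patent does not prescribe any storage format:

```python
from dataclasses import dataclass, field
import time

@dataclass
class MonitoringRecord:
    """Illustrative model of one entry in the TA monitoring information
    storage unit 331: the contents of a received second monitoring
    information communication signal stored in association with its
    reception time."""
    sensor_id: str
    detection_result: str
    target_image: bytes
    person_position: tuple   # e.g. rectangle diagonal (x0, y0, x1, y1)
    download_address: str    # sensor device SU address for video download
    received_at: float = field(default_factory=time.time)
```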
 The TA control processing unit 32 is a circuit for controlling each unit of the mobile terminal device TA according to its function, accepting and displaying the monitoring information concerning the monitored person Ob, and answering nurse calls or speaking to the monitored person. The TA control processing unit 32 includes, for example, a CPU and its peripheral circuits. By executing the control processing programs, the TA control processing unit 32 functionally includes a terminal-side control unit (TA control unit) 321, a terminal-side monitoring processing unit (TA monitoring processing unit) 322, a call processing unit 323, and a terminal-side streaming processing unit (TA streaming processing unit) 324. The TA monitoring processing unit 322 functionally includes a display processing unit 3221.
 The TA control unit 321 controls each unit of the mobile terminal device TA according to its function and governs the overall control of the mobile terminal device TA.
 The TA monitoring processing unit 322 stores (records) monitoring information concerning the monitoring of the monitored person Ob, such as the detection result and the nurse call received from the sensor device SU via the management server device SV, and displays the detection result and the nurse call. More specifically, upon receiving a second monitoring information communication signal from the management server device SV, the TA monitoring processing unit 322 stores (records) the monitoring information of the monitored person Ob contained in the received second monitoring information communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 causes the display processing unit 3221 to display on the TA display unit 36 a screen corresponding to each piece of information contained in the received second monitoring information communication signal. In more detail, the display processing unit 3221 displays on the TA display unit 36 the detection result and the target image contained in the received second monitoring information communication signal, and displays, in the target image displayed on the TA display unit 36, a person position indicator based on the person position data contained in the received second monitoring information communication signal. In the present embodiment, when the TA input unit 35 accepts the enlarged display instruction, the display processing unit 3221 displays the region of the person enlarged beyond the size of the person's region currently displayed on the TA display unit 36. Upon receiving a second nurse call notification communication signal from the management server device SV, the TA monitoring processing unit 322 stores (records) the monitoring information of the monitored person Ob contained in the received second nurse call notification communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 displays on the TA display unit 36 a nurse call acceptance screen stored in advance in the TA storage unit 33, in accordance with the nurse call acceptance information contained in the received second nurse call notification communication signal. Then, upon accepting a predetermined input operation from the TA input unit 35, the TA monitoring processing unit 322 executes predetermined processing corresponding to that input operation.
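 The enlarged display of the person's region can be sketched as cropping the rectangle given by the person position data and scaling it up. The following is a minimal illustration using NumPy with nearest-neighbour scaling; the function name, the integer scale factor, and the coordinate convention are assumptions, not details specified by the patent:

```python
import numpy as np

def enlarge_person_region(image, rect, scale=2):
    """Crop the person region rectangle (x0, y0, x1, y1) from the target
    image and enlarge it by an integer factor, roughly as the display
    processing unit 3221 might when the enlarged display instruction is
    accepted (hypothetical sketch)."""
    x0, y0, x1, y1 = rect
    crop = image[y0:y1, x0:x1]
    # Nearest-neighbour upscaling by repeating rows and then columns.
    return np.repeat(np.repeat(crop, scale, axis=0), scale, axis=1)
```

For example, a 2x2 person region cropped from a 4x4 image and scaled by 2 yields a 4x4 enlarged view.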
 The call processing unit 323 conducts a voice call with the sensor device SU using the TA sound input/output unit 34 and the like. More specifically, the call processing unit 323 uses the TA sound input/output unit 34 and the like to conduct a voice call, for example by VoIP, with the notification source sensor device SU that transmitted the first monitoring information communication signal or the first nurse call notification communication signal to the management server device SV, or with a sensor device SU selected and designated by the user (monitoring person) of the mobile terminal device TA.
 The TA streaming processing unit 324 receives video distributed from the sensor device SU and displays the distributed video on the TA display unit 36 by streaming playback.
 Such a mobile terminal device TA can be configured by a portable communication terminal device such as a so-called tablet computer, smartphone, or mobile phone.
 In the present embodiment, the TA input unit 35 and the TA display unit 36 constituting the touch panel correspond to an example of an instruction input unit (first instruction input unit) that accepts an enlarged display instruction, which is an instruction to enlarge and display a person appearing in the image, and also correspond to an example of a second instruction input unit that accepts an operation instruction, which is an instruction to perform a predetermined operation.
 Next, the operation of the present embodiment will be described. FIG. 9 is a flowchart showing the operation concerning the monitoring information of the sensor device in the monitored person monitoring system of the embodiment. FIG. 10 is a flowchart showing the operation concerning the monitoring information of the mobile terminal device in the monitored person monitoring system of the embodiment. FIG. 11 is a diagram showing an example of a standby screen displayed on the mobile terminal device in the monitored person monitoring system of the embodiment. FIG. 12 is a diagram showing an example of a monitoring information screen displayed on the mobile terminal device in the monitored person monitoring system of the embodiment. FIG. 13 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device in the monitored person monitoring system of the embodiment, in which the region of a person is enlarged.
 In such a monitored person monitoring system MS, when the power is turned on, each of the devices SU, SV, SP and TA initializes its necessary units and begins operation. In the sensor device SU, by execution of its control processing program, the SU control unit 141, the SU monitoring processing unit 142, the person position processing unit 143, the nurse call processing unit 144, and the SU streaming processing unit 145 are functionally configured in the SU control processing unit 14. In the mobile terminal device TA, by execution of its control processing program, the TA control unit 321, the TA monitoring processing unit 322, the call processing unit 323, and the TA streaming processing unit 324 are functionally configured in the TA control processing unit 32, and the display processing unit 3221 is functionally configured in the TA monitoring processing unit 322.
 The operation of the sensor device SU will now be described. The sensor device SU operates as follows for each frame, or every several frames, thereby detecting a predetermined behavior of the monitored person Ob and determining whether a nurse call has been accepted.
 In FIG. 9, first, the sensor device SU acquires, by the SU control unit 141 of the SU control processing unit 14, one frame of image (image data) from the imaging unit 11 as the target image (S11).
 Next, the sensor device SU executes, by the SU monitoring processing unit 142 of the SU control processing unit 14, behavior detection processing for detecting a predetermined behavior of the monitored person Ob based on the acquired target image (S12).
 Next, the sensor device SU determines, by the SU monitoring processing unit 142, whether a predetermined behavior of the monitored person Ob was detected in the behavior detection processing S12 (S13). If, as a result of this determination, the behavior has not been detected (No), the sensor device SU next executes processing S16; if the behavior has been detected (Yes), the sensor device SU executes processing S16 after sequentially executing the following processing S14 and processing S15.
 In processing S14, the sensor device SU generates, by the person position processing unit 143 of the SU control processing unit 14, person position data concerning the position, in the image, of the person appearing in the acquired target image. The person position data may be any of the person position data of the first to fifth aspects described above; in the present embodiment, for example, the person position data of the second aspect is employed. In this case, the person position processing unit 143 first extracts a moving body region MB as the region of a person from the acquired target image by, for example, the background subtraction method or the frame difference method. Alternatively, the person position processing unit 143 acquires from the SU monitoring processing unit 142 the moving body region MB extracted by the SU monitoring processing unit 142 in processing S12. Next, the person position processing unit 143 obtains a rectangle containing all or part of the moving body region MB and obtains coordinate data representing the obtained rectangle as the person position data (person region data). The coordinate data representing the rectangle consists of, for example, the coordinate data of the two end points of one diagonal of the rectangle (P20 (x0, y0) and P21 (x1, y1) in the example shown in FIG. 4). The person position processing unit 143 then notifies the SU monitoring processing unit 142 of the obtained person position data (person region data). When there are a plurality of moving body regions MB, there may be a plurality of items of person position data corresponding respectively to the plurality of moving body regions MB.
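 The second-aspect person position data, a rectangle around the moving body region MB, can be sketched with background subtraction in NumPy. The following is a minimal illustration only, assuming grayscale frames and a hypothetical intensity threshold; it is not the patent's implementation:

```python
import numpy as np

def person_rectangle(frame, background, threshold=30):
    """Extract the moving body region MB by the background subtraction
    method and return the bounding rectangle as the coordinate data of
    the two end points of one diagonal, ((x0, y0), (x1, y1))."""
    # Pixels differing from the background by more than the threshold
    # are treated as belonging to the moving body region.
    mask = np.abs(frame.astype(int) - background.astype(int)) > threshold
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no moving body region detected in this frame
    return (int(xs.min()), int(ys.min())), (int(xs.max()), int(ys.max()))
```

A real system would typically also denoise the mask and handle multiple moving body regions, each yielding its own rectangle, as the text notes.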
 In processing S15, which follows processing S14, in order to notify the predetermined terminal devices SP and TA, via the management server device SV, of the predetermined behavior detected in processing S12 and processing S13, the sensor device SU transmits, by the SU monitoring processing unit 142, a first monitoring information communication signal to the management server device SV. More specifically, the SU monitoring processing unit 142 transmits, to the management server device SV via the SU communication IF unit 15, a first monitoring information communication signal containing its own sensor ID, the detection result, the target image, and the person position data (in this example, the rectangular person region data). Accordingly, the first monitoring information communication signal for reporting the detection result contains the target image as the image of the monitored person Ob together with the detection result, and contains the person position data concerning the position of the person appearing in the target image together with the target image.
 In processing S16, the sensor device SU determines, by the nurse call processing unit 144 of the SU control processing unit 14, whether a nurse call has been accepted. That is, processing S11 to processing S17 shown in FIG. 9 are repeatedly executed for each frame, or every several frames, and it is determined whether the nurse call acceptance operation unit 13 was operated between the previous execution of processing S16 and the current execution of processing S16. If, as a result of this determination, the nurse call acceptance operation unit 13 has not been operated and no nurse call has been accepted (No), the sensor device SU ends the current round of this processing; if the nurse call acceptance operation unit 13 has been operated and a nurse call has been accepted (Yes), the sensor device SU ends the current round of this processing after executing the following processing S17.
 In process S17, in order to report the nurse call found to have been accepted in process S16 to the predetermined terminal devices SP and TA via the management server device SV, the sensor device SU transmits, by the nurse call processing unit 144, a first nurse call notification communication signal to the management server device SV. More specifically, the nurse call processing unit 144 transmits, via the SU communication IF unit 15, a first nurse call notification communication signal containing its own sensor ID and nurse call acceptance information to the management server device SV.
 The sensor device SU operates as described above with respect to the detection of a predetermined action of the monitored person Ob and the acceptance of a nurse call.
 Upon receiving the first monitoring information communication signal from the sensor device SU via the network NW, the management server device SV stores (records) the sensor ID, the determination result, the target image, the person position data (rectangular person area data in this embodiment), and the like contained in this first monitoring information communication signal as monitoring information of the monitored person Ob monitored by the sensor device SU having this sensor ID. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TA corresponding to the notification source sensor device SU of the received first monitoring information communication signal, and transmits a second monitoring information communication signal to these notification destination terminal devices SP and TA. Similarly, upon receiving the first nurse call notification communication signal from the sensor device SU via the network NW, the management server device SV stores (records) the sensor ID, the nurse call acceptance information, and the like contained in this first nurse call notification communication signal as monitoring information of the monitored person Ob monitored by the sensor device SU having this sensor ID. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TA corresponding to the notification source sensor device SU of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to these notification destination terminal devices SP and TA.
 Upon receiving the second monitoring information communication signal from the management server device SV via the network NW, the fixed terminal device SP and the mobile terminal device TA display the monitoring information concerning the monitoring of the monitored person Ob contained in this second monitoring information communication signal. The operation by which the mobile terminal device TA displays this monitoring information will be described in detail below. Also, upon receiving the second nurse call notification communication signal from the management server device SV via the network NW, the fixed terminal device SP and the mobile terminal device TA display an indication that a nurse call has been accepted from the monitored person Ob monitored by the sensor device SU having the sensor ID contained in this second nurse call notification communication signal. Through such operations, the monitored person monitoring system MS, by means of each sensor device SU, the management server device SV, the fixed terminal device SP, and the mobile terminal device TA, broadly monitors each monitored person Ob by detecting predetermined actions of each monitored person Ob.
 Next, the operation of the terminal devices SP and TA for displaying the monitoring information concerning the monitoring of the monitored person Ob, and operations related thereto, will be described. Here, the operation of the mobile terminal device TA will be described as a representative example.
 As described above, when the power is turned on and operation starts, the mobile terminal device TA accepts a login operation by a monitoring person (user) such as a nurse or caregiver, and the display processing unit 3221 of the TA monitoring processing unit 322 displays on the TA display unit 36 a standby screen that awaits communication signals addressed to the device itself. As shown in FIG. 11, for example, this standby screen 51 includes: a menu bar area 511 that displays a menu bar; a standby main area 512 that displays a message indicating the standby state (for example, "No notifications") and an icon; a time area 513 that displays the current time; a date area 514 that displays today's date and day of the week; and a user name area 515 that displays the name of the user currently logged in to the mobile terminal device TA. The menu bar area 511 is provided with an off-hook button 5111 for inputting an instruction to place an internal call to another mobile terminal device TA or an external call to an external telephone TL.
 Then, in FIG. 10, the mobile terminal device TA determines, by the TA control unit 321 of the TA control processing unit 32, whether or not a communication signal has been received by the TA communication IF unit 31 (S21). As a result of this determination, if no communication signal has been received (No), the mobile terminal device TA returns the processing to S21; if a communication signal has been received (Yes), the mobile terminal device TA executes the next process S22. That is, the mobile terminal device TA waits for the reception of a communication signal.
 In process S22, the mobile terminal device TA determines, by the TA control unit 321, the type of the received communication signal. As a result of this determination, if the received communication signal is the second monitoring information communication signal (second monitoring information), the mobile terminal device TA executes the following processes S23 and S24 in sequence and then executes process S27; if the received communication signal is the second nurse call notification communication signal (second NC notification), it executes the following processes S25 and S26 in sequence and then executes process S27; and if the received communication signal is neither the second monitoring information communication signal nor the second nurse call notification communication signal (other), it executes process S29, which performs appropriate processing according to the communication signal received in process S21, and then ends this processing.
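The branching of process S22 described above can be sketched as follows (an illustration only; the signal-type labels are hypothetical stand-ins for the second monitoring information communication signal, the second nurse call notification communication signal, and other signals):

```python
def route_signal(signal_type):
    """Return the sequence of processes executed for a received signal type
    in process S22 (labels are hypothetical stand-ins)."""
    if signal_type == "second_monitoring_info":
        # store monitoring info (S23), show monitoring screen (S24), await input (S27)
        return ["S23", "S24", "S27"]
    if signal_type == "second_nurse_call_notification":
        # store monitoring info (S25), show nurse call screen (S26), await input (S27)
        return ["S25", "S26", "S27"]
    # any other signal: appropriate processing (S29), then finish
    return ["S29"]
```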
 In process S23, the mobile terminal device TA stores (records), by the TA monitoring processing unit 322 of the TA control processing unit 32, the monitoring information concerning the monitoring of the monitored person Ob contained in the second monitoring information communication signal received from the management server device SV in process S21 into the TA monitoring information storage unit 331.
 Following process S23, the TA monitoring processing unit 322 displays, by the display processing unit 3221, a screen corresponding to the information contained in the second monitoring information communication signal received in process S21, for example the monitoring information screen 52 shown in FIG. 12, on the TA display unit 36 (S24).
 The monitoring information screen 52 is a screen for displaying the monitoring information concerning the monitoring of the monitored person Ob. As shown in FIG. 12, for example, the monitoring information screen 52 includes: a menu bar area 511; a monitored person name area 521 that displays the installation location of the sensor device SU having the sensor ID contained in the second monitoring information communication signal received in process S21, and the name of the monitored person Ob monitored by the sensor device SU having that sensor ID; a detection information display area 522 that displays the time elapsed since the reception of the second monitoring information communication signal received in process S21 (or since the detection of the predetermined action), and the detection result contained in that signal; an image area 523 that displays the image contained in the second monitoring information communication signal received in process S21 (that is, the target image captured by the sensor device SU having the sensor ID; here, a still image), together with the person position indicator HP2 based on the person position data (rectangular person area data in this example) contained in that signal; a "Corresponding" button 524; a "Speak" button 525; and a "View LIVE" button 526.
 In order to display the installation location of the sensor device SU and the name of the monitored person Ob in the monitored person name area 521, the TA storage unit 33 stores in advance, in association with one another, the sensor ID, the installation location of the sensor device SU having that sensor ID, and the name of the monitored person Ob monitored by the sensor device SU having that sensor ID.
 In the detection information display area 522, the detection result contained in the second monitoring information communication signal received in process S21 (in this embodiment, the names of rising, leaving bed, falling from bed, and falling down) may be displayed as is; in this embodiment, however, the detection result is displayed as an icon that symbolically represents it. For this icon display, the TA storage unit 33 stores in advance each action and the icon symbolically representing that action in association with each other. In the example shown in FIG. 12, a rising icon symbolically representing rising is displayed in the detection information display area 522.
 In the image area 523, the person position indicator HP2 based on the person position data is displayed in the target image being displayed. In the example shown in FIG. 12, the bedding BD and the monitored person Ob appear in the target image. In this embodiment, the person position data contained in the second monitoring information communication signal received in process S21 is rectangular person area data, and the display processing unit 3221 displays, superimposed on the target image being displayed, the rectangular image HP2 represented by the person area data (here, the coordinate data (x0, y0) and (x1, y1) of the two endpoints of a diagonal) as the person position indicator HP2, with the top-left vertex of the target image in front view taken as the coordinate origin. Furthermore, in this embodiment, the region enclosed by the person position indicator HP2 serves as a button for inputting to the mobile terminal device TA the enlarged display instruction, which is an instruction to display the person appearing in the target image being displayed at an enlarged size.
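A minimal sketch of how the rectangular indicator HP2 can be derived from the diagonal-endpoint person area data, and of the hit test that makes the enclosed region act as a button, is given below (function names are hypothetical; the coordinate origin is the top-left vertex of the target image, as in the embodiment):

```python
def rect_from_person_area(x0, y0, x1, y1):
    """Normalize diagonal-endpoint person area data (x0, y0), (x1, y1) into
    (left, top, width, height), origin at the image's top-left vertex."""
    left, top = min(x0, x1), min(y0, y1)
    return left, top, abs(x1 - x0), abs(y1 - y0)

def tap_hits_indicator(px, py, rect):
    """True when a tap at (px, py) falls inside the region enclosed by HP2,
    which doubles as the button for the enlarged display instruction."""
    left, top, w, h = rect
    return left <= px < left + w and top <= py < top + h
```

The normalized rectangle can be drawn over the target image by any raster or widget toolkit; only the bookkeeping is shown here.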
 Note that the display processing unit 3221 may display the image within the person's region by a second display method different from the display method of the target image (first display method). By displaying the inside and outside of the person's region by mutually different display methods, the user (monitoring person) of the mobile terminal device TA can more easily find the person's region in the image, in other words, the monitored person Ob in the image. For example, the first display method may be normal display and the second display method mosaic display, in which case the display processing unit 3221 treats the image inside the rectangular person position indicator HP as the image within the person's region, displays the image outside the rectangular person position indicator HP normally, and displays the image inside it as a mosaic. As another example, the first display method may be color display and the second display method monochrome (single-color) display, in which case the image outside the rectangular person position indicator HP is displayed in color and the image inside it in monochrome; or, conversely, the first display method may be monochrome display and the second display method color display, in which case the image outside the rectangular person position indicator HP is displayed in monochrome and the image inside it in color. As yet another example, the first display method may be positive display and the second display method negative display, in which case the image outside the rectangular person position indicator HP is displayed as a positive and the image inside it as a negative; or, conversely, the first display method may be negative display and the second display method positive display, in which case the image outside the rectangular person position indicator HP is displayed as a negative and the image inside it as a positive.
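The combination of a first and a second display method can be sketched as follows (an illustration only; the image is modeled as rows of (r, g, b) tuples, and only the monochrome and negative variants are shown):

```python
def render_with_second_method(image, rect, method):
    """Display pixels inside the person rectangle with a second display
    method, leaving the rest of the image in the first (normal) method.
    `image` is a list of rows of (r, g, b) tuples; `rect` is
    (left, top, width, height)."""
    left, top, w, h = rect
    out = [list(row) for row in image]
    for y in range(top, top + h):
        for x in range(left, left + w):
            r, g, b = out[y][x]
            if method == "monochrome":
                grey = (r + g + b) // 3            # single-color display of the person area
                out[y][x] = (grey, grey, grey)
            elif method == "negative":
                out[y][x] = (255 - r, 255 - g, 255 - b)
    return out
```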
 On the monitoring information screen 52, the "Corresponding" button 524 is a button for inputting to the mobile terminal device TA response intention information indicating that the user of the mobile terminal device TA intends to carry out a predetermined response (attendance, handling) such as lifesaving, nursing, care, or assistance with respect to the detection result displayed on the monitoring information screen 52. The "Speak" button 525 is a button for requesting a voice call, that is, a button for inputting an instruction to connect the sensor device SU having the sensor ID and the mobile terminal device TA via the network NW so that a call can be made. The "View LIVE" button 526 is a button for requesting live video, that is, a button for inputting an instruction to display the video captured by the sensor device SU having the sensor ID.
 Returning to FIG. 10, in process S25, on the other hand, the mobile terminal device TA stores (records), by the TA monitoring processing unit 322 of the TA control processing unit 32, the monitoring information concerning the monitoring of the monitored person Ob contained in the second nurse call notification communication signal received from the management server device SV in process S21 into the TA monitoring information storage unit 331.
 Following process S25, the TA monitoring processing unit 322 displays on the TA display unit 36, in accordance with the nurse call acceptance information contained in the second nurse call notification communication signal received in process S21, a nurse call acceptance screen (not shown) stored in advance in the TA storage unit 33 and indicating that a nurse call has been accepted (S26).
 In process S27, which is executed after each of processes S24 and S26, the mobile terminal device TA determines, by the TA control processing unit 32, whether or not an input operation has been accepted on the touch panel formed by the TA input unit 35 and the TA display unit 36. As a result of this determination, if no input operation has been accepted (No), the mobile terminal device TA returns the processing to process S27; if an input operation has been accepted, the mobile terminal device TA executes the next process S28.
 In process S28, the mobile terminal device TA executes, by the TA control processing unit 32, appropriate processing according to the content of the input operation, and ends this processing.
 For example, when the mobile terminal device TA accepts, by the TA control processing unit 32, an input operation (for example, a tap operation) on the region enclosed by the person position indicator HP2, which also functions as a button for inputting the enlarged display instruction, the display processing unit 3221 displays the person's region enlarged beyond the size of the person's region currently displayed on the TA display unit 36. For example, the region enclosed by the person position indicator HP2 shown in FIG. 12 is enlarged about fourfold (doubled in each of the vertical and horizontal directions) and displayed as the region enclosed by the person position indicator HPE shown in FIG. 13. The pixel values of the pixels produced by the enlargement are generated, for example, by interpolation from the pixel values of adjacent pixels. Since the person's region is enlarged and displayed in this way, the user (monitoring person) of the mobile terminal device TA can more easily find the monitored person Ob in the image and more easily recognize the state of the monitored person Ob from the image.
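The roughly fourfold enlargement (doubling in each of the vertical and horizontal directions) can be sketched as follows; nearest-neighbour replication is used here as the simplest stand-in for the interpolation from adjacent pixel values mentioned above:

```python
def enlarge_region_2x(region):
    """Double the person's region in each of the vertical and horizontal
    directions (about a fourfold enlargement in area). Each new pixel
    replicates its neighbour; the embodiment instead interpolates the
    new pixel values from adjacent pixels."""
    out = []
    for row in region:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                   # duplicate each row vertically
    return out
```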
 As another example, when the mobile terminal device TA accepts, by the TA control processing unit 32, an input operation on the "Corresponding" button 524 (that is, when it accepts the response intention), it stores the monitoring information of the monitored person Ob currently displayed on the TA display unit 36 in the TA monitoring information storage unit 331 with a note that "Corresponding" has been accepted, and transmits to the management server device SV a communication signal (response acceptance notification communication signal) containing the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 and information indicating that "Corresponding" has been accepted (response acceptance information). The management server device SV that has received this response acceptance notification communication signal transmits, by broadcast to the terminal devices SP and TA, a communication signal (response acceptance announcement communication signal) containing the sensor ID and the response acceptance information contained in the received response acceptance notification communication signal. As a result, the fact that "Corresponding" has been accepted for the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 is synchronized among the terminal devices SP and TA.
 As another example, when the mobile terminal device TA accepts, by the TA control processing unit 32, an input operation on the "Speak" button 525, the call processing unit 323 transmits a communication signal (call request communication signal) containing information such as a request for a voice call to the sensor device SU monitoring the monitored person Ob displayed on the TA display unit 36, and connects with the responding sensor device SU via the network NW so that a voice call can be made. A voice call thereby becomes possible between the mobile terminal device TA and the sensor device SU. When the mobile terminal device TA accepts, by the TA control processing unit 32, an input operation on an "End" button (not shown), which is a button for inputting an instruction to end the voice call, the call processing unit 323 transmits a communication signal (call end communication signal) containing information such as a request to end the voice call to the sensor device SU monitoring the monitored person Ob displayed on the TA display unit 36. The voice call between the mobile terminal device TA and the sensor device SU is thereby ended.
 As another example, when the mobile terminal device TA accepts, by the TA control processing unit 32, an input operation on the "View LIVE" button 526, the TA streaming processing unit 324 transmits a communication signal (video delivery request communication signal) containing information such as a request for live video delivery to the sensor device SU monitoring the monitored person Ob currently displayed on the TA display unit 36, connects with the responding sensor device SU via the network NW so that video can be downloaded, receives the live video delivered from the sensor device SU, and displays the delivered video on the TA display unit 36 by streaming playback. On the monitoring information screen 52 displaying this live video, the video is displayed in the image area 523 in place of the still image, and an "End LIVE" button (not shown) is displayed in place of the "View LIVE" button 526. The live video is thus displayed on the mobile terminal device TA. The "End LIVE" button (not shown) is a button for requesting the end of the video, that is, a button for inputting an instruction to end (stop) the delivery of the video captured by the sensor device SU having the sensor ID and to end (stop) its display. When the mobile terminal device TA accepts, by the TA control processing unit 32, an input operation on the "End LIVE" button, the TA streaming processing unit 324 transmits a communication signal (video delivery end communication signal) containing information such as a request to end the video delivery to the sensor device SU monitoring the monitored person Ob currently displayed on the TA display unit 36, and displays a still image on the TA display unit 36. The mobile terminal device TA thereby ends the display of the live video.
 The mobile terminal device TA operates as described above with respect to each report (each re-report) of a detection result or of nurse call acceptance received from the sensor device SU via the management server device SV.
 As described above, the monitored person monitoring system MS, the terminal devices SP and TA, and the display method implemented in them according to this embodiment include the display processing unit 3221, which displays on the TA display unit 36 the image of the image data contained in the second monitoring information communication signal based on the first monitoring information communication signal received by the TA communication IF unit 31, and displays, in the image displayed on the TA display unit, the person position indicator HP2 based on the person position data contained in the second monitoring information communication signal. Accordingly, in the monitored person monitoring system MS, the terminal devices SP and TA, and the display method described above, the person position indicator HP2 is displayed in the image being displayed, so the user (monitoring person) can find the monitored person Ob in the image using the person position indicator HP2 as a clue, and the monitored person Ob in the image becomes easier to find. Since the sensor device SU in this embodiment contains not only the image but also the person position data (person area data in this embodiment) in the first monitoring information communication signal, the terminal devices SP and TA that receive the second monitoring information communication signal can display the person position indicator HP2 in the displayed image. Accordingly, as described above, the users of the terminal devices SP and TA can find the monitored person Ob in the image using the person position indicator HP2 as a clue, and the monitored person Ob in the image becomes easier to find.
 In the embodiment described above, depending on the position on the target image of the person appearing in the target image, only part of the person's region (in the example shown in FIG. 14, the region within the person position indicator HP2) may be displayed on the TA display unit 36, as shown for example in FIG. 14. Alternatively, when buttons for inputting to the mobile terminal device TA an operation instruction, that is, an instruction to perform a predetermined operation, such as the "Corresponding" button 524, the "Speak" button 525, and the "View LIVE" button 526, are displayed within the image area 523, then depending on the position on the target image of the person appearing in the target image, part or all of the person's region (in the example shown in FIG. 14, the region within the person position indicator HP2) may be displayed on the TA display unit 36 overlapping those buttons, as shown for example in FIG. 14. To resolve such cases, the display processing unit 3221 may be configured as in the following first and second modifications. FIG. 14 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device in the monitored person monitoring system of the embodiment, in which the person's region and the "Corresponding" button overlap. FIG. 15 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device in the monitored person monitoring system of the embodiment, in which the person's region is displayed at the center position of the image display area. FIG. 16 is a diagram showing an example of a monitoring information screen, displayed on the mobile terminal device in the monitored person monitoring system of the embodiment, in which the person's region and the "Corresponding" button are displayed so as not to overlap each other.
 In the first modification, the display processing unit 3221 displays the target image and the person position indicator HP on the TA display unit 36 so that the center position SC of the image area 523 for displaying the target image on the TA display unit 36 and the position HPC, on the target image, of the person shown in it coincide with each other (SC = HPC). The position HPC is, for example, the barycentric position P1 of the moving-object region MB. When the person position data is rectangular person area data, the position HPC is the intersection of the diagonals of the rectangle; when it is circular person area data, the position HPC is the center of the circle; and when it is elliptical person area data, the position HPC is the intersection of the major and minor axes of the ellipse. More specifically, the pixel position (coordinate data) of the center position SC is stored in advance in the TA storage unit 33, and the display processing unit 3221 adjusts the display positions of the target image and the person position indicator HP so that the position HPC (in the example of FIG. 15, the intersection of the diagonals of a rectangle) is located at the pixel position of the center position SC stored in the TA storage unit 33, and displays them on the TA display unit 36. With this arrangement, the user (monitoring person) of the mobile terminal device TA only needs to watch the center position of the image area 523, which makes the monitored person Ob in the image easier to find.
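The first modification above can be sketched in code. This is a minimal illustration, not the patent's implementation: the dictionary layout of the person area data and the function names are assumptions; only the geometric rules (diagonal intersection, circle center, axis intersection, SC = HPC alignment) come from the text.

```python
# Hypothetical sketch of the first modification: align the person's
# position HPC with the center position SC of the image area 523.

def person_center(person_area):
    """Return the position HPC for a person-area record.

    person_area is an assumed dict such as:
      {"shape": "rect",    "x": 40, "y": 60, "w": 120, "h": 200}
      {"shape": "circle",  "cx": 100, "cy": 160, "r": 80}
      {"shape": "ellipse", "cx": 100, "cy": 160, "a": 90, "b": 50}
    """
    shape = person_area["shape"]
    if shape == "rect":
        # Intersection of the rectangle's diagonals.
        return (person_area["x"] + person_area["w"] / 2,
                person_area["y"] + person_area["h"] / 2)
    if shape in ("circle", "ellipse"):
        # Circle center / intersection of the ellipse's major and minor axes.
        return (person_area["cx"], person_area["cy"])
    raise ValueError(f"unknown shape: {shape}")

def display_offset(image_area_size, person_area):
    """Offset at which to draw the target image so that HPC lands on SC,
    the center of the image display area (width, height)."""
    sc = (image_area_size[0] / 2, image_area_size[1] / 2)
    hpc = person_center(person_area)
    return (sc[0] - hpc[0], sc[1] - hpc[1])
```

For example, a 120x200 rectangle at (40, 60) has HPC = (100, 160); in a 640x480 image area, the image would be drawn shifted by (220, 80) so that HPC sits on SC = (320, 240).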
 In the second modification, the display processing unit 3221 displays the buttons for inputting an operation instruction (an instruction to perform a predetermined operation) to the mobile terminal device TA in the remaining area of the display area of the TA display unit 36, excluding the person's area. Preferably, the display processing unit 3221 displays the buttons in the remaining area, excluding the person's area, of the image area 523 for displaying the target image on the TA display unit 36. For example, when the buttons are displayed inside the image area 523 and part of the person's area (in the example of FIG. 14, the area inside the person position indicator HP2) overlaps the "respond" button 524 as shown in FIG. 14, the display processing unit 3221 displays the "respond" button 524 in a remaining area excluding the person's area (in this example, the area inside the person position indicator HP2), for example the area at the upper end of the image area 523, as shown in FIG. 16. Since the display positions of the buttons, such as the "respond" button 524, the "speak" button 525, and the "watch live" button 526, are predetermined, the area where the person's area can overlap the buttons is known in advance. The remaining area excluding the person's area can therefore be predicted and is stored in advance in the TA storage unit 33. When all or part of the person's area overlaps a button, the display processing unit 3221 need only display the button in the remaining area stored in advance in the TA storage unit 33. With this arrangement, the person's area is never hidden by the buttons, and the user (monitoring person) of the mobile terminal device TA can easily find the monitored person Ob in the image.
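The button-relocation logic of the second modification can be sketched as below. This is an illustrative assumption, not the patent's code: the rectangle representation and the idea of a list of precomputed fallback slots (standing in for the remaining area stored in the TA storage unit 33) are choices made for the sketch.

```python
# Hypothetical sketch of the second modification: if a button's default
# position overlaps the person's area, move it to a precomputed
# fallback position in the remaining area.

def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rects are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_button(button_rect, person_rect, fallback_slots):
    """Return the rect at which to draw the button: its default position
    when it does not overlap the person's area, otherwise the first
    fallback slot (e.g. the upper end of image area 523) that avoids
    the overlap."""
    if not overlaps(button_rect, person_rect):
        return button_rect
    for slot in fallback_slots:
        if not overlaps(slot, person_rect):
            return slot
    return button_rect  # no free slot: keep the default position
```

Because the button positions are predetermined, the fallback slots can be computed once and stored, matching the text's point that the remaining area is known in advance.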
 This specification discloses techniques in various aspects as described above, the main ones of which are summarized below.
 A terminal device according to one aspect comprises: a communication unit that receives a monitoring information communication signal containing image data of an image of a monitored person to be monitored and person position data relating to the position, on the image, of a person shown in the image; a display unit that performs display; and a display processing unit that displays the image of the image data contained in the monitoring information communication signal received by the communication unit on the display unit, and displays a person position indicator representing the position of the person, based on the person position data contained in the monitoring information communication signal, in the image displayed on the display unit. Preferably, in the above terminal device, the person position data is contained in the monitoring information communication signal as metadata of the image data. Preferably, the image data and the person position data are contained in an Exif (Exchangeable image file format) file. Preferably, the image data and the person position data are contained as form data in a multipart-format file under HTTP (Hypertext Transfer Protocol). Preferably, the person position data is person area data of the person's area, which includes the position, on the image, of the person shown in it. Preferably, the person position data is person area data relating to the person's area including the position of the person on the image; the person area data is coordinate data representing one of a rectangle, a circle, and an ellipse containing all or part of the person's area; and the person position indicator is an image represented by the person area data. Preferably, the person position data is person area data relating to the person's area including the position of the person on the image; the person area data is coordinate data representing the outline of the person's area; and the person position indicator is an image represented by the person area data.
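One way the HTTP multipart option above could carry both parts is sketched here. This is an assumption for illustration only: the field names (`"image"`, `"person_position"`), the boundary string, and the JSON layout of the position data are invented for the sketch; the patent only states that the two are carried as form data in a multipart-format file.

```python
# Hypothetical sketch: packing the image data and the person position
# data together as form data in an HTTP multipart-format body.
import json

def build_multipart(image_bytes, person_position, boundary="sensor-frame"):
    """Return a multipart/form-data body carrying the image and the
    person position data as two form-data parts."""
    b = boundary.encode()
    position_json = json.dumps(person_position).encode("utf-8")
    body = (
        b"--" + b + b"\r\n"
        b'Content-Disposition: form-data; name="image"; filename="frame.jpg"\r\n'
        b"Content-Type: image/jpeg\r\n\r\n" + image_bytes + b"\r\n"
        b"--" + b + b"\r\n"
        b'Content-Disposition: form-data; name="person_position"\r\n'
        b"Content-Type: application/json\r\n\r\n" + position_json + b"\r\n"
        b"--" + b + b"--\r\n"
    )
    return body
```

A receiving terminal device would split the body on the boundary, decode the `person_position` part, and use it to draw the person position indicator over the decoded image.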
 Such a terminal device comprises a display processing unit that displays the image of the image data contained in the monitoring information communication signal received by the communication unit on the display unit, and displays a person position indicator representing the position of the person, based on the person position data contained in the monitoring information communication signal, in the image displayed on the display unit. Because the person position indicator is shown in the image being displayed, the user of the terminal device can use the person position indicator as a clue to find the monitored person in the image, which makes the monitored person easier to find.
 In another aspect of the above terminal device, the person position data is person area data relating to the person's area including the position, on the image, of the person shown in it; the person position indicator is an image represented by the person area data; and the display processing unit displays the image inside the person's area with a second display method different from the display method for the image. Preferably, in the above terminal device, the second display method displays the image inside the person's area as a mosaic. Preferably, the display method for the image displays the image in color or in monochrome, and the second display method displays the image inside the person's area in monochrome or in color, respectively. Preferably, the display method for the image displays the image as a positive or as a negative, and the second display method displays the image inside the person's area as a negative or as a positive, respectively.
 In such a terminal device, the display processing unit displays the image inside the person's area with a second display method different from the display method for the image (the first display method). The user of the terminal device can therefore easily find the person's area in the image, in other words, the monitored person in the image.
 In another aspect of the above terminal devices, the display processing unit displays the image and the person position indicator on the display unit so that the center position of the image display area for displaying the image on the display unit and the position, on the image, of the person shown in it coincide with each other.
 Such a terminal device displays the person position indicator at the center position of the image display area by means of the display processing unit. The user of the terminal device therefore only needs to watch the center position of the image display area, which makes the monitored person in the image easier to find.
 In another aspect, the above terminal devices further comprise an instruction input unit that accepts an enlarged-display instruction, which is an instruction to display the person shown in the image at a larger size; the person position data is person area data relating to the person's area including the position of the person on the image; the person position indicator is an image represented by the person area data; and when the instruction input unit accepts the enlarged-display instruction, the display processing unit displays the person's area enlarged beyond the size of the person's area currently displayed on the display unit.
 Such a terminal device further comprises an instruction input unit, and when the instruction input unit accepts an enlarged-display instruction, the display processing unit displays the person's area enlarged beyond the size at which it is currently displayed on the display unit. Because the person's area is displayed enlarged, the user of the terminal device can easily find the monitored person in the image.
 In another aspect, the above terminal devices further comprise a second instruction input unit that accepts an operation instruction, which is an instruction to perform a predetermined operation; the second instruction input unit forms a touch panel together with the display unit and is displayed on the display unit as buttons; the person position data is person area data relating to the person's area including the position of the person on the image; the person position indicator is an image represented by the person area data; and the display processing unit displays the buttons in the remaining area of the display area of the display unit, excluding the person's area. Preferably, in the above terminal device, the display processing unit displays the buttons in the remaining area, excluding the person's area, of the image display area for displaying the image on the display unit.
 In such a terminal device, the display processing unit displays the buttons that accept operation instructions in the remaining area excluding the person's area. The person's area is therefore never hidden by the buttons, and the user of the terminal device can easily find the monitored person in the image.
 A display method for a terminal device according to another aspect comprises: a receiving step of receiving a monitoring information communication signal containing image data of an image of a monitored person to be monitored and person position data relating to the position, on the image, of a person shown in the image; and a display processing step of displaying the image of the image data contained in the monitoring information communication signal received in the receiving step on a display unit that performs display, and displaying a person position indicator representing the position of the person, based on the person position data contained in the monitoring information communication signal, in the image displayed on the display unit.
 Such a display method for a terminal device comprises a display processing step of displaying the image of the image data contained in the monitoring information communication signal received in the receiving step on the display unit, and displaying a person position indicator representing the position of the person, based on the person position data contained in the monitoring information communication signal, in the image displayed on the display unit. In this display method, the person position indicator is shown in the image being displayed, so the user of the terminal device can use the person position indicator as a clue to find the monitored person in the image, which makes the monitored person easier to find.
 A sensor device according to another aspect comprises: a communication unit that performs communication; an imaging unit that performs imaging; a person position processing unit that generates person position data relating to the position, in the image captured by the imaging unit, of a person shown in that image; and a monitoring processing unit that causes the communication unit to transmit a monitoring information communication signal containing the image data of the image captured by the imaging unit and the person position data generated by the person position processing unit.
 Because such a sensor device places not only the image but also the person position data in the monitoring information communication signal, a terminal device that receives the monitoring information communication signal can display a person position indicator on the displayed image. The user of the terminal device can therefore find the monitored person in the image using the person position indicator as a clue, which makes the monitored person easier to find.
 A monitored-person monitoring system according to another aspect comprises the above sensor device and any of the above terminal devices.
 Because such a monitored-person monitoring system comprises the above sensor device and any of the above terminal devices, it makes the monitored person in the image easy to find.
 This application is based on Japanese Patent Application No. 2016-25856 filed on February 15, 2016, the contents of which are incorporated herein.
 To express the present invention, it has been described above appropriately and sufficiently through embodiments with reference to the drawings; it should be recognized, however, that those skilled in the art can easily modify and/or improve the embodiments described above. Accordingly, unless a modification or improvement made by a person skilled in the art departs from the scope of rights of the claims set forth in the claims section, that modification or improvement is construed as being encompassed by the scope of those claims.
 According to the present invention, a terminal device, a display method for the terminal device, a sensor device, and a monitored-person monitoring system can be provided.

Claims (8)

  1.  A terminal device comprising:
     a communication unit that receives a monitoring information communication signal containing image data of an image of a monitored person to be monitored and person position data relating to a position, on the image, of a person shown in the image;
     a display unit that performs display; and
     a display processing unit that displays the image of the image data contained in the monitoring information communication signal received by the communication unit on the display unit, and displays a person position indicator representing the position of the person, based on the person position data contained in the monitoring information communication signal, in the image displayed on the display unit.
  2.  The terminal device according to claim 1, wherein
     the person position data is person area data relating to an area of the person including the position, on the image, of the person shown in the image,
     the person position indicator is an image represented by the person area data, and
     the display processing unit displays the image inside the area of the person with a second display method different from the display method for the image.
  3.  The terminal device according to claim 1 or 2, wherein
     the display processing unit displays the image and the person position indicator on the display unit so that a center position of an image display area for displaying the image on the display unit and the position, on the image, of the person shown in the image coincide with each other.
  4.  The terminal device according to any one of claims 1 to 3, further comprising an instruction input unit that accepts an enlarged-display instruction, which is an instruction to display the person shown in the image at a larger size, wherein
     the person position data is person area data relating to an area of the person including the position, on the image, of the person shown in the image,
     the person position indicator is an image represented by the person area data, and
     when the instruction input unit accepts the enlarged-display instruction, the display processing unit displays the area of the person enlarged beyond the size of the area of the person being displayed on the display unit.
  5.  The terminal device according to any one of claims 1 to 4, further comprising a second instruction input unit that accepts an operation instruction, which is an instruction to perform a predetermined operation, wherein
     the second instruction input unit forms a touch panel together with the display unit and is displayed on the display unit as a button,
     the person position data is person area data relating to an area of the person including the position, on the image, of the person shown in the image,
     the person position indicator is an image represented by the person area data, and
     the display processing unit displays the button in a remaining area of a display area of the display unit, excluding the area of the person.
  6.  A display method for a terminal device, comprising:
     a receiving step of receiving a monitoring information communication signal containing image data of an image of a monitored person to be monitored and person position data relating to a position, on the image, of a person shown in the image; and
     a display processing step of displaying the image of the image data contained in the monitoring information communication signal received in the receiving step on a display unit that performs display, and displaying a person position indicator representing the position of the person, based on the person position data contained in the monitoring information communication signal, in the image displayed on the display unit.
  7.  A sensor device comprising:
     a communication unit that performs communication;
     an imaging unit that performs imaging;
     a person position processing unit that generates person position data relating to a position, in an image captured by the imaging unit, of a person shown in the image; and
     a monitoring processing unit that causes the communication unit to transmit a monitoring information communication signal containing image data of the image captured by the imaging unit and the person position data generated by the person position processing unit.
  8.  A monitored-person monitoring system comprising:
     the sensor device according to claim 7; and
     the terminal device according to any one of claims 1 to 5.
PCT/JP2017/002079 2016-02-15 2017-01-23 Terminal device, display method for terminal device, sensor device, and person-to-be-monitored monitoring system WO2017141629A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018500001A JP6804510B2 (en) 2016-02-15 2017-01-23 Detection system and display method of detection system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-025856 2016-02-15
JP2016025856 2016-02-15

Publications (1)

Publication Number Publication Date
WO2017141629A1 true WO2017141629A1 (en) 2017-08-24

Family

ID=59624976

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/002079 WO2017141629A1 (en) 2016-02-15 2017-01-23 Terminal device, display method for terminal device, sensor device, and person-to-be-monitored monitoring system

Country Status (2)

Country Link
JP (2) JP6804510B2 (en)
WO (1) WO2017141629A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102484784B1 (en) * 2022-04-19 2023-01-04 김종환 Breath detection emergency alarm system

Citations (4)

Publication number Priority date Publication date Assignee Title
JP2010157119A (en) * 2008-12-26 2010-07-15 Fujitsu Ltd Monitoring device, monitoring method, and monitoring program
JP2013206012A (en) * 2012-03-28 2013-10-07 Nippon Telegraph & Telephone West Corp Monitoring system and monitoring method
JP2014166197A (en) * 2013-02-28 2014-09-11 Nk Works Co Ltd Information processing equipment, information processing method, and program
WO2015011591A1 (en) * 2013-07-22 2015-01-29 Koninklijke Philips N.V. Automatic continuous patient movement monitoring

Family Cites Families (17)

Publication number Priority date Publication date Assignee Title
JP2001145101A (en) * 1999-11-12 2001-05-25 Mega Chips Corp Human image compressing device
JP3979902B2 (en) * 2001-08-30 2007-09-19 株式会社日立国際電気 Surveillance video delivery system and surveillance video delivery method
JP4718950B2 (en) * 2005-09-26 2011-07-06 Necカシオモバイルコミュニケーションズ株式会社 Image output apparatus and program
JP2007150702A (en) * 2005-11-28 2007-06-14 Keakomu:Kk Nurse call system
JP5270962B2 (en) * 2008-05-29 2013-08-21 京セラ株式会社 Motion detection system
JP2010237873A (en) * 2009-03-30 2010-10-21 Sogo Keibi Hosho Co Ltd Device, method, and program for detecting attitude change
JP2011138178A (en) * 2009-12-25 2011-07-14 Takenaka Komuten Co Ltd Light emitting device, suspicious person detection system and program
JP5672183B2 (en) * 2011-07-15 2015-02-18 富士通株式会社 Information processing apparatus, information processing method, and information processing program
JP5953673B2 (en) * 2011-08-11 2016-07-20 日本電気株式会社 Action identification device, action identification method, and program
JP5812948B2 (en) * 2012-07-13 2015-11-17 アイホン株式会社 Patient recognition device
JP5858940B2 (en) * 2013-02-01 2016-02-10 キング通信工業株式会社 Bed leaving monitoring system
JP2015032125A (en) * 2013-08-02 2015-02-16 アズビル株式会社 Watching device and watching system
JP6281249B2 (en) * 2013-11-09 2018-02-21 富士通株式会社 Information processing apparatus, information processing system, program, and information processing method
JP6417670B2 (en) * 2014-02-21 2018-11-07 オムロン株式会社 Monitoring device, monitoring system, monitoring method, monitoring program, and computer-readable recording medium recording the monitoring program
JP6336309B2 (en) * 2014-03-26 2018-06-06 株式会社ドワンゴ Terminal device, video distribution device, program
JP6413310B2 (en) * 2014-04-10 2018-10-31 富士通株式会社 Monitoring device, display method and program
JP6282923B2 (en) * 2014-05-12 2018-02-21 株式会社ケアコム Portable terminal device for nurse call system and nurse call system

Also Published As

Publication number Publication date
JP2019149172A (en) 2019-09-05
JPWO2017141629A1 (en) 2018-10-25
JP6804510B2 (en) 2020-12-23
JP6895090B2 (en) 2021-06-30

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 17752893
    Country of ref document: EP
    Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2018500001
    Country of ref document: JP
    Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 17752893
    Country of ref document: EP
    Kind code of ref document: A1