WO2017141721A1 - Terminal device, terminal device operation control method, and monitored person monitoring system - Google Patents

Terminal device, terminal device operation control method, and monitored person monitoring system

Info

Publication number
WO2017141721A1
WO2017141721A1 (PCT/JP2017/003833)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
call
terminal device
sensor
request
Prior art date
Application number
PCT/JP2017/003833
Other languages
French (fr)
Japanese (ja)
Inventor
琢哉 村田
山下 雅宣
篤広 野田
雅史 西角
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority to JP2017536361A (JP6245415B1)
Publication of WO2017141721A1


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G - TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G12/00 - Accommodation for nursing, e.g. in hospitals, not covered by groups A61G1/00 - A61G11/00, e.g. trolleys for transport of medicaments or food; Prescription lists
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 - Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B25/04 - Alarm systems in which the location of the alarm condition is signalled to a central station, using a single signalling line, e.g. in a closed loop
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers

Definitions

  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a terminal device capable of talking with a monitored person while viewing an image, an operation control method for the terminal device, and a monitored person monitoring system.
  • FIG. 1 shows the configuration of the monitored person monitoring system in the embodiment. Further figures show the configuration of the sensor device in the monitored person monitoring system, the external appearance of the mobile terminal device in the monitored person monitoring system, the configuration of the mobile terminal device, and a flowchart of the operation of the mobile terminal device.
  • The private branch exchange (line switching machine) CX is connected to the network NW and controls telephone exchange: it carries out extension telephone calls (outgoing calls, incoming calls, and conversations) between the mobile terminal devices TAa, and, via a public telephone network PN such as a fixed telephone network or a mobile telephone network, connects to an external telephone TL such as a fixed telephone or a mobile telephone so as to carry out outside-line telephone calls (outgoing calls, incoming calls, and conversations) between the external telephone TL and the mobile terminal devices TAa.
  • the private branch exchange CX is, for example, a digital exchange or an IP-PBX.
  • The sensor device SU is a device that has a communication function for communicating with the other devices SV, SP, and TA via the network NW, detects a predetermined action of the monitored person Ob and notifies the management server device SV of the detection result, accepts a nurse call and notifies the management server device SV to that effect, makes a voice call with the terminal devices SP and TA, and generates images including moving images and distributes the moving images to the terminal devices SP and TA.
  • The sensor device SU includes an imaging unit 11, a sensor-side sound input/output unit (SU sound input/output unit) 12, a nurse call reception operation unit 13, a sensor-side control processing unit (SU control processing unit) 14, a sensor-side communication interface unit (SU communication IF unit) 15, and a sensor-side storage unit (SU storage unit) 16.
  • The imaging unit 11 is arranged so that it can image, from directly above, a preset planned head position (usually the position where the pillow is disposed) where the head of the monitored person Ob is expected to be located on the bedding (for example, a bed) on which the monitored person Ob lies. The sensor device SU thus uses the imaging unit 11 to acquire an image of the monitored person Ob taken from above the monitored person Ob, preferably an image taken from directly above the planned head position.
  • the SU storage unit 16 is a circuit that is connected to the SU control processing unit 14 and stores various predetermined programs and various predetermined data under the control of the SU control processing unit 14.
  • Examples of the various predetermined programs include an SU control program that controls each unit of the sensor device SU according to its function, and an SU monitoring processing program that executes predetermined information processing related to the monitoring of the monitored person Ob.
  • The SU monitoring processing program includes a behavior detection processing program for detecting a predetermined behavior of the monitored person Ob and notifying the detection result to the predetermined terminal devices SP and TA via the management server device SV, and a program for handling a nurse call accepted by the nurse call reception operation unit 13.
  • the SU control unit 141 controls each part of the sensor device SU according to the function of each part, and governs overall control of the sensor device SU.
  • The behavior detection processing unit 142 determines whether the position of the extracted head is within the region where the bedding BT is located and uses the first threshold Th1 to determine whether the size of the extracted head corresponds to a lying posture. The behavior detection processing unit 142 determines that the monitored person Ob has left the bed when the position of the extracted head changes over time from within the location of the bedding BT to outside of it and the size of the extracted head changes from a size evaluated with the first threshold Th1 to a size evaluated with the second threshold Th2.
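  As a rough illustration of this kind of threshold-based behavior detection, the following Python sketch shows how an extracted head position and head size might be mapped to "getting up" and "getting out of bed" events. The function, the BeddingRegion helper, and the concrete threshold values are assumptions for illustration and do not reproduce the patented detection logic.

```python
# Hypothetical sketch of threshold-based behavior detection from an extracted
# head position and head size. Th1/Th2 follow the description above in spirit;
# the concrete values, units, and decision rules are assumptions.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class BeddingRegion:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, pos: Tuple[float, float]) -> bool:
        x, y = pos
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


TH1_LYING_HEAD_SIZE = 80.0    # first threshold Th1: head size of a lying posture (assumed, in pixels)
TH2_RAISED_HEAD_SIZE = 60.0   # second threshold Th2: head size after rising (assumed, in pixels)


def detect_behavior(prev_pos: Tuple[float, float],
                    cur_pos: Tuple[float, float],
                    head_size: float,
                    bedding: BeddingRegion) -> Optional[str]:
    """Return a detected behavior name, or None when nothing is detected."""
    was_in_bed = bedding.contains(prev_pos)
    is_in_bed = bedding.contains(cur_pos)
    if was_in_bed and not is_in_bed and head_size <= TH2_RAISED_HEAD_SIZE:
        return "getting out of bed"   # head moved from the bedding region to outside it
    if is_in_bed and TH2_RAISED_HEAD_SIZE <= head_size < TH1_LYING_HEAD_SIZE:
        return "getting up"           # head still over the bedding but no longer lying-sized
    return None


if __name__ == "__main__":
    bed = BeddingRegion(100, 50, 300, 250)
    print(detect_behavior((150, 120), (350, 120), 55.0, bed))  # -> getting out of bed
    print(detect_behavior((150, 120), (160, 125), 70.0, bed))  # -> getting up
```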
  • The SU nurse call processing unit 143 notifies the management server device SV when the nurse call reception operation unit 13 accepts a nurse call, and performs a voice call with the terminal devices SP and TAa using the SU sound input/output unit 12 and the like. More specifically, when an input is made on the nurse call reception operation unit 13, the SU nurse call processing unit 143 transmits, to the management server device SV via the SU communication IF unit 15, a first nurse call notification communication signal containing the sensor ID of the own device and nurse call reception information indicating that the nurse call has been accepted. The SU nurse call processing unit 143 then performs a voice call with the terminal devices SP and TAa using the SU sound input/output unit 12 and the like, for example by VoIP (Voice over Internet Protocol).
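  As a concrete illustration of what the sensor side might send, the following Python sketch builds and posts a minimal "first nurse call notification communication signal". The JSON field names and the HTTP transport are assumptions for illustration; the patent does not specify the wire format.

```python
# Minimal sketch of building and sending the "first nurse call notification
# communication signal"; the JSON field names and the HTTP transport are
# assumptions for illustration, not the actual protocol of the system.
import json
import time
from urllib import request


def send_first_nurse_call_notification(server_url: str, sensor_id: str) -> None:
    payload = {
        "signal": "first_nurse_call_notification",
        "sensor_id": sensor_id,          # identifies the notifying sensor device SU
        "nurse_call_accepted": True,     # nurse call reception information
        "timestamp": time.time(),
    }
    req = request.Request(
        server_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:   # hypothetical management server device SV endpoint
        resp.read()
```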
  • When the SU streaming processing unit 144 receives a request from the fixed terminal device SP or the mobile terminal device TAa, it distributes the moving image generated by the imaging unit 11 (for example, a live moving image) to the requesting terminal device via the SU communication IF unit 15 for streaming reproduction.
  • FIG. 1 shows four sensor devices, a first to a fourth sensor device SU-1 to SU-4, as an example. The first sensor device SU-1 is arranged in the room RM-1 (not shown) of Mr. A Ob-1, one of the monitored persons Ob; the second sensor device SU-2 is arranged in the room RM-2 (not shown) of Mr. B Ob-2, one of the monitored persons Ob; the third sensor device SU-3 is arranged in the room RM-3 (not shown) of Mr. C Ob-3, one of the monitored persons Ob; and the fourth sensor device SU-4 is arranged in the room RM-4 (not shown) of Mr. D Ob-4, one of the monitored persons Ob.
  • The sensor device SU corresponds to an example of a sensor device including an imaging unit that performs imaging and a calling unit that performs a call; the imaging unit 11 corresponds to an example of the imaging unit, and the SU sound input/output unit 12 and the SU nurse call processing unit 143 correspond to an example of the calling unit.
  • The management server device SV is a device that has a communication function for communicating with the other devices SU, TAa, and SP via the network NW, receives the detection result and the target image regarding the monitored person Ob from the sensor device SU, manages information (monitoring information) related to the monitoring of the monitored person Ob, and notifies (re-notifies, transmits) the received detection result and target image regarding the monitored person Ob to the predetermined terminal devices SP and TAa. More specifically, the management server device SV stores a notification destination correspondence relationship that associates the notifying sensor device SU (its sensor ID) with the notification destination (re-notification destination) terminal devices SP and TAa (their terminal IDs), as well as the ID of each device SU, SP, and TAa. The terminal ID is an identifier for specifying and identifying the terminal devices SP and TAa.
  • When the management server device SV receives the first nurse call notification communication signal, it stores (records) the information contained in the received signal, in association with the notification source sensor device, as monitoring information of the monitored person Ob. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TAa corresponding to the notification source sensor device of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to those notification destination terminal devices SP and TAa. The second nurse call notification communication signal contains the sensor ID and the nurse call reception information contained in the received first nurse call notification communication signal.
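  The server-side relay step can be pictured with the short Python sketch below: it stores the received first notification as monitoring information, looks up the notification destination correspondence relationship, and forwards a second notification to every registered terminal. The class layout, field names, and the send_to_terminal callback are assumptions for illustration, not the actual implementation.

```python
# Sketch of the management server's re-notification step: store the received
# first nurse call notification as monitoring information, look up the
# notification destination correspondence relationship, and forward a second
# notification. Data structures and send_to_terminal() are assumptions.
from collections import defaultdict
from typing import Callable, Dict, List


class ManagementServer:
    def __init__(self, send_to_terminal: Callable[[str, dict], None]):
        self.notification_destinations: Dict[str, List[str]] = {}      # sensor ID -> terminal IDs
        self.monitoring_info: Dict[str, List[dict]] = defaultdict(list)  # sensor ID -> stored records
        self.send_to_terminal = send_to_terminal

    def register(self, sensor_id: str, terminal_ids: List[str]) -> None:
        self.notification_destinations[sensor_id] = terminal_ids

    def on_first_nurse_call_notification(self, signal: dict) -> None:
        sensor_id = signal["sensor_id"]
        self.monitoring_info[sensor_id].append(signal)                  # store (record) monitoring information
        second_signal = {
            "signal": "second_nurse_call_notification",
            "sensor_id": sensor_id,
            "nurse_call_accepted": signal["nurse_call_accepted"],
        }
        for terminal_id in self.notification_destinations.get(sensor_id, []):
            self.send_to_terminal(terminal_id, second_signal)           # re-notify SP / TAa
```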
  • The fixed terminal device SP includes a communication function for communicating with the other devices SU, SV, and TAa via the network NW, a display function for displaying predetermined information, and an input function for inputting predetermined instructions and data. It functions as a user interface (UI) of the monitored person monitoring system MS by inputting predetermined instructions and data to be given to the management server device SV and the mobile terminal device TAa, by displaying the monitoring information obtained by the sensor device SU, and so on.
  • Such a fixed terminal device SP can be configured by, for example, a computer with a communication function.
  • the fixed terminal device SP as an example of the terminal device operates in the same manner as the mobile terminal device TAa. However, in this specification, the mobile terminal device TAa which is another example of the terminal device will be described.
  • At one end (the upper end in the example shown in FIG. 3) of one main surface of the housing HS, a first electromechanical conversion unit 341 (described later) of the TA sound input/output unit 34 is arranged so that sound can pass to and from the outside, and at the other end (the lower end in the example shown in FIG. 3) a second electromechanical conversion unit 342 (described later) of the TA sound input/output unit 34 is arranged so that sound can pass to and from the outside. A separation/contact sensor unit 38 is disposed in the vicinity of the first electromechanical conversion unit 341 at the one end (the upper end in the example shown in FIG. 3). As described above, the first and second electromechanical conversion units 341 and 342 are disposed at both ends of the display surface of the TA display unit 36 on one main surface of the housing HS.
  • The TA communication IF unit 31, like the SU communication IF unit 15, is a communication circuit that is connected to the TA control processing unit 32a and performs communication under the control of the TA control processing unit 32a.
  • the TA communication IF unit 31 includes, for example, a communication interface circuit that conforms to the IEEE 802.11 standard or the like.
  • The TA display unit 36 is a circuit that is connected to the TA control processing unit 32a and, under its control, displays the predetermined operation content input from the TA input unit 35 and the monitoring information related to the monitoring of the monitored person Ob by the monitored person monitoring system MS (for example, the type of predetermined action detected by the sensor device SU, images (still images and moving images) of the monitored person Ob, and the reception of a nurse call). It is, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL display.
  • the TA input unit 35 and the TA display unit 36 constitute a touch panel.
  • The TA input unit 35 is a position input device of, for example, the resistive film type or the capacitive type, which detects and inputs an operated position. The position input device is provided on the display surface of the TA display unit 36, and one or more candidates of input content that can be input are displayed on the TA display unit 36. When a user (monitor) such as a nurse or a caregiver touches the display position where the input content to be input is displayed, the position is detected by the position input device, and the display content displayed at the detected position is input to the mobile terminal device TAa as the operation input content of the user.
  • the separation / contact sensor unit 38 is a circuit that is connected to the TA control processing unit 32a and detects a person's separation / contact, and is, for example, a capacitive human sensor, an infrared human sensor, or the like.
  • The capacitive human sensor includes a metal detection panel and outputs, as a voltage change, the change in capacitance of the capacitor formed by the panel and the human body; it thus outputs a signal whose level corresponds to the degree of proximity or separation of the human body.
  • the attitude sensor unit 39 is a circuit that is connected to the TA control processing unit 32a and detects the attitude of the mobile terminal device TAa, and is, for example, a gyro sensor that measures angular velocity.
  • the attitude sensor unit 39 outputs the sensor output (second sensor output) to the TA control processing unit 32a.
  • the TA storage unit 33 is a circuit that is connected to the TA control processing unit 32a and stores various predetermined programs and various predetermined data under the control of the TA control processing unit 32a.
  • The various predetermined programs include a TA control program for controlling each unit of the mobile terminal device TAa according to its function; a TA monitoring processing program for storing (recording) the monitoring information related to the monitoring of the monitored person Ob, such as the detection result and the nurse call received from the sensor device SU via the management server device SV, and for displaying the detection result and the nurse call; a TA call processing program for performing a voice call with the sensor device SU using the TA sound input/output unit 34 and the like; a TA streaming processing program for receiving the distribution of a moving image from the sensor device SU and displaying the received moving image on the TA display unit 36 by streaming reproduction; a separation determination program for determining the separation of a person, based on the first sensor output of the separation/contact sensor unit 38, as a request for the call; and a posture determination program for determining, based on the second sensor output of the attitude sensor unit 39, the case where the attitude of the mobile terminal device TAa is such that the extension direction of the line segment LN (see FIG. 3) connecting the first arrangement position of the first electromechanical conversion unit 341 and the second arrangement position of the second electromechanical conversion unit 342 is closer to the horizontal direction than to the vertical direction, as a request for the call.
  • the TA storage unit 33 includes, for example, a ROM and an EEPROM.
  • The TA storage unit 33 includes a RAM serving as a so-called working memory of the TA control processing unit 32a that stores data generated during execution of the predetermined programs.
  • the TA storage unit 33 functionally includes a terminal-side monitoring information storage unit (TA monitoring information storage unit) 331 for storing the monitoring information.
  • The TA control processing unit 32a is a circuit for controlling each unit of the mobile terminal device TAa according to its function, receiving and displaying the monitoring information for the monitored person Ob, and answering a nurse call and making a call.
  • the TA control processing unit 32a includes, for example, a CPU and its peripheral circuits.
  • By executing the control processing program, the TA control processing unit 32a functionally includes a terminal-side control unit (TA control unit) 321a, a terminal-side monitoring processing unit (TA monitoring processing unit) 322, a terminal-side call processing unit (TA call processing unit) 323, a terminal-side streaming processing unit (TA streaming processing unit) 324, a separation/contact determination unit 325, and an attitude determination unit 326.
  • the TA control unit 321a controls each part of the mobile terminal apparatus TAa according to the function of each part, and controls the entire mobile terminal apparatus TAa.
  • When the request for the call is detected, the TA control unit 321a stops the operation of the first electromechanical conversion unit 341 of the TA sound input/output unit 34 and controls the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece.
  • The TA monitoring processing unit 322 stores (records) the monitoring information related to the monitoring of the monitored person Ob, such as the detection result or the nurse call received from the sensor device SU via the management server device SV, and displays the detection result and the nurse call. More specifically, when the TA monitoring processing unit 322 receives the second monitoring information communication signal from the management server device SV, it stores (records) the monitoring information of the monitored person Ob contained in the received second monitoring information communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 also displays, on the TA display unit 36, a screen corresponding to each piece of information contained in the received second monitoring information communication signal.
  • When the TA monitoring processing unit 322 receives the second nurse call notification communication signal from the management server device SV, it stores (records) the monitoring information of the monitored person Ob contained in the received second nurse call notification communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 displays, on the TA display unit 36, a nurse call reception screen stored in advance in the TA storage unit 33 in accordance with the nurse call reception information contained in the received second nurse call notification communication signal. Then, when the TA monitoring processing unit 322 receives a predetermined input operation from the TA input unit 35, it executes a predetermined process corresponding to that input operation.
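  A terminal-side handler for this step could look like the minimal Python sketch below: it records the contained monitoring information and then shows the nurse call reception screen. The class, its storage list, and the show_nurse_call_screen callback are assumptions for illustration only.

```python
# Sketch of the terminal-side handling of a second nurse call notification:
# store the contained monitoring information and show the reception screen.
# The storage list and show_nurse_call_screen() callback are assumptions.
from typing import Callable, Dict, List


class TAMonitoringProcessor:
    def __init__(self, show_nurse_call_screen: Callable[[str], None]):
        self.monitoring_info_storage: List[Dict] = []         # plays the role of storage unit 331
        self.show_nurse_call_screen = show_nurse_call_screen

    def on_second_nurse_call_notification(self, signal: Dict) -> None:
        self.monitoring_info_storage.append(signal)            # store (record) the monitoring information
        if signal.get("nurse_call_accepted"):
            self.show_nurse_call_screen(signal["sensor_id"])    # display the nurse call reception screen
```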
  • The TA call processing unit 323 performs a voice call with the sensor device SU using the TA sound input/output unit 34 and the like. More specifically, using the TA sound input/output unit 34 and the like, the TA call processing unit 323 performs a voice call, for example by VoIP, with the notification source sensor device SU that transmitted the first monitoring information communication signal or the first nurse call notification communication signal to the management server device SV, or with a sensor device SU selected and designated by the user (monitor) of the mobile terminal device TAa.
  • The separation/contact determination unit 325 determines the separation of a person as a request for a call with the sensor device SU based on the first sensor output of the separation/contact sensor unit 38. More specifically, the separation/contact determination unit 325 determines whether or not the person is separated based on the first sensor output of the separation/contact sensor unit 38 and, when the person is determined to be separated, judges this as the request for the call.
  • When the separation/contact sensor unit 38 includes a capacitive human sensor, a relatively large capacitance and voltage are generated in the capacitor formed by the metal panel of the sensor and the human body when a person approaches the sensor, and a relatively high-level signal is output; when the person moves away from the sensor, a relatively small capacitance and voltage are generated, and a relatively low-level signal is output. A threshold value (separation threshold) for discriminating between the proximity and separation of a person is therefore set in advance. The separation/contact determination unit 325 compares the first sensor output from the capacitive human sensor with the separation threshold; if, as a result of this comparison, the first sensor output is greater than or equal to the separation threshold, it determines that the person is close (not separated), and if the first sensor output is less than the separation threshold, it determines that the person is separated, that is, it determines the request for the call.
  • When the separation/contact sensor unit 38 includes an infrared human sensor, the infrared human sensor receives infrared light relatively strongly and outputs a relatively high-level signal when a person approaches it, and receives infrared light relatively weakly and outputs a relatively low-level signal when the person moves away from it. A threshold value (separation threshold) for discriminating between the proximity and separation of a person is likewise set in advance, and the separation/contact determination unit 325 compares the first sensor output from the infrared human sensor with the separation threshold; if, as a result of this comparison, the first sensor output is greater than or equal to the separation threshold, it determines that the person is close (not separated), and if the first sensor output is less than the separation threshold, it determines that the person is separated, that is, it determines the request for the call.
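  In code, this separation determination reduces to a single threshold comparison on the first sensor output, as in the hedged Python sketch below; the threshold value and the normalized sensor level are assumptions.

```python
# Sketch of the separation/contact determination: the first sensor output (a
# voltage-like level from a capacitive or infrared human sensor) is compared
# with a preset separation threshold. The value and units are assumptions.
SEPARATION_THRESHOLD = 0.5  # assumed normalized sensor level


def is_call_requested_by_separation(first_sensor_output: float,
                                    threshold: float = SEPARATION_THRESHOLD) -> bool:
    """Return True when the person is judged to be separated, i.e. a call request."""
    if first_sensor_output >= threshold:
        return False   # person close to the terminal (not separated)
    return True        # person separated -> treated as a request for the call
```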
  • Based on the second sensor output of the posture sensor unit 39, the posture determination unit 326 determines, as the request for the call, the case where the attitude of the mobile terminal device TAa is such that the extension direction of the line segment LN connecting the first arrangement position of the first electromechanical conversion unit 341 and the second arrangement position of the second electromechanical conversion unit 342 is closer to the horizontal direction than to the vertical direction. More specifically, based on the second sensor output of the posture sensor unit 39, the posture determination unit 326 compares the angle formed by the extension direction of the line segment LN and the horizontal direction with a threshold value (posture threshold) for discriminating whether the extension direction is closer to the horizontal direction than to the vertical direction. If, as a result of this comparison, the angle is greater than or equal to the posture threshold, it determines that the extension direction is not closer to the horizontal direction than to the vertical direction (it is closer to the vertical direction); if the angle is less than the posture threshold, it determines that the extension direction is closer to the horizontal direction than to the vertical direction, that is, it determines the request for the call.
  • The posture threshold is set appropriately to, for example, 45 degrees, 30 degrees, or 20 degrees.
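  The posture check is essentially an angle-versus-threshold comparison. The Python sketch below derives the tilt of the line segment LN from a gravity vector, which is an assumption made for illustration (the document obtains the attitude from the gyro-based second sensor output); the threshold value follows the examples above.

```python
# Sketch of the posture determination: the angle between the line segment LN
# (long axis of the handset) and the horizontal plane is compared with a
# posture threshold. Deriving the angle from a gravity vector is an assumption.
import math

POSTURE_THRESHOLD_DEG = 45.0  # may also be set to 30 or 20 degrees


def is_call_requested_by_posture(gravity_along_ln: float, gravity_magnitude: float,
                                 threshold_deg: float = POSTURE_THRESHOLD_DEG) -> bool:
    """True when LN is closer to horizontal than vertical, i.e. a call request."""
    # Angle between LN and the horizontal plane: 0 deg = flat, 90 deg = upright.
    ratio = min(1.0, abs(gravity_along_ln) / max(gravity_magnitude, 1e-9))
    angle_deg = math.degrees(math.asin(ratio))
    return angle_deg < threshold_deg  # closer to horizontal -> request for the call


if __name__ == "__main__":
    print(is_call_requested_by_posture(gravity_along_ln=2.0, gravity_magnitude=9.8))  # True (nearly flat)
    print(is_call_requested_by_posture(gravity_along_ln=9.5, gravity_magnitude=9.8))  # False (upright)
```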
  • the TA input unit 35 and the TA display unit 36 that constitute the touch panel, and the TA control unit 321a correspond to an example of a call detection unit that detects a call request with the call unit of the sensor device.
  • the TA input unit 35 and the TA display unit 36 constituting the touch panel correspond to an example of a call request input unit that receives the call request.
  • the separation sensor unit 38 and the separation determination unit 325 correspond to another example of the call detection unit.
  • the posture sensor unit 39 and the posture determination unit 326 correspond to still another example of the call detection unit.
  • When the power is turned on, each of the devices SU, SV, SP, and TAa initializes each necessary unit and starts its operation.
  • the SU control processing unit 14 is functionally configured with a SU control unit 141, a behavior detection processing unit 142, a SU nurse call processing unit 143, and a SU streaming processing unit 144.
  • By executing the control processing program, the TA control processing unit 32a functionally includes the TA control unit 321a, the TA monitoring processing unit 322, the TA call processing unit 323, the TA streaming processing unit 324, the separation/contact determination unit 325, and the posture determination unit 326.
  • the monitored person monitoring system MS having the above configuration generally monitors each monitored person Ob by the following operation.
  • The sensor device SU operates in the following manner for each frame or every several frames, thereby detecting a predetermined action of the monitored person Ob and determining whether or not a nurse call has been accepted.
  • The sensor device SU acquires an image (image data) for one frame from the imaging unit 11 as a target image by the SU control unit 141 of the SU control processing unit 14, and the behavior detection processing unit 142 detects a predetermined action of the monitored person Ob based on the acquired target image; when the predetermined action is detected, the sensor device SU transmits a first monitoring information communication signal to the management server device SV in order to notify the predetermined terminal devices SP and TAa.
  • The sensor device SU also determines, by the SU nurse call processing unit 143, whether or not the nurse call reception operation unit 13 has accepted a nurse call; when a nurse call has been accepted, the sensor device SU transmits a first nurse call notification communication signal to the management server device SV in order to notify the predetermined terminal devices SP and TAa of the reception of the call.
  • When the management server device SV receives the first monitoring information communication signal from the sensor device SU via the network NW, it stores (records) the sensor ID, the determination result, the target image, and the like contained in the first monitoring information communication signal as monitoring information of the monitored person Ob monitored by the sensor device SU having that sensor ID. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TAa corresponding to the notification source sensor device SU of the received first monitoring information communication signal, and transmits a second monitoring information communication signal to those notification destination terminal devices SP and TAa.
  • Similarly, when the management server device SV receives the first nurse call notification communication signal from the sensor device SU via the network NW, it stores (records) the sensor ID, the nurse call reception information, and the like contained in the first nurse call notification communication signal as monitoring information of the monitored person Ob monitored by the sensor device SU having that sensor ID. The management server device SV then identifies, from the notification destination correspondence relationship, the notification destination terminal devices SP and TAa corresponding to the notification source sensor device SU of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to those notification destination terminal devices SP and TAa.
  • When the fixed terminal device SP and the mobile terminal device TAa receive the second monitoring information communication signal from the management server device SV via the network NW, they display the monitoring information related to the monitoring of the monitored person Ob contained in the second monitoring information communication signal. The operation of displaying the monitoring information by the mobile terminal device TAa is described in detail below.
  • When the second nurse call notification communication signal is received, it is displayed that a nurse call has been received from the monitored person Ob monitored by the sensor device SU having the sensor ID contained in the second nurse call notification communication signal.
  • In this way, the monitored person monitoring system MS roughly detects a predetermined action of each monitored person Ob by means of each sensor device SU, the management server device SV, the fixed terminal device SP, and the mobile terminal devices TAa, and thereby monitors each monitored person Ob.
  • FIG. 5 is a flowchart illustrating an operation related to monitoring information of the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 6 is a flowchart illustrating the operation of the first aspect related to the control of the first and second electromechanical conversion units of the mobile terminal device in the monitored person monitoring system of the embodiment.
  • FIG. 7 is a diagram illustrating an example of a standby screen displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 8 is a diagram illustrating an example of a monitoring information screen displayed on the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 9 is a diagram for explaining a call mode of the mobile terminal device in the monitored person monitoring system according to the embodiment.
  • FIG. 9A is a diagram for explaining the usage mode of the mobile terminal device TAa in the default mode, in which the first electromechanical conversion unit 341 operates as the earpiece and the second electromechanical conversion unit 342 is controlled to operate as the mouthpiece. FIG. 9B is a diagram for explaining the usage mode of the mobile terminal device TAa in the speakerphone mode, in which the operation of the first electromechanical conversion unit 341 is stopped and the second electromechanical conversion unit 342 is controlled to operate as both the mouthpiece and the earpiece.
  • Next, the operation of the terminal devices SP and TAa will be described. Here, the operation of the mobile terminal device TAa is described.
  • When the mobile terminal device TAa accepts a login operation by a monitor (user) such as a nurse or a caregiver, the TA monitoring processing unit 322 displays, on the TA display unit 36, a standby screen for waiting for a communication signal addressed to the own device.
  • the standby screen 51 includes a menu bar area 511 for displaying a menu bar, and a standby main area for displaying a message indicating standby (for example, “no notification”) and an icon.
  • the menu bar area 511 is provided with an off-hook button 5111 for inputting an instruction for an extension call with another mobile terminal device TAa or an outgoing call with an external telephone TL.
  • The mobile terminal device TAa determines, by the TA control unit 321a of the TA control processing unit 32a, whether or not a communication signal has been received by the TA communication IF unit 31 (S11). If, as a result of this determination, no communication signal has been received (No), the mobile terminal device TAa returns the process to S11; if a communication signal has been received (Yes), the mobile terminal device TAa executes the following process S12. That is, the mobile terminal device TAa waits for the reception of a communication signal.
  • The mobile terminal device TAa stores (records), in the TA monitoring information storage unit 331, the monitoring information of the monitored person Ob contained in the second monitoring information communication signal received from the management server device SV in the process S11, by the TA monitoring processing unit 322 of the TA control processing unit 32a. The TA monitoring processing unit 322 then displays, on the TA display unit 36, a screen corresponding to each piece of information contained in the second monitoring information communication signal received in the process S11, for example the monitoring information screen 52 shown in FIG. 8 (S14).
  • the monitoring information screen 52 is a screen for displaying the monitoring information related to the monitoring of the monitored person Ob.
  • The monitoring information screen 52 includes the menu bar area 511; a monitored person name area 521 for displaying the arrangement location of the sensor device SU having the sensor ID contained in the second monitoring information communication signal received in the process S11 and the name of the monitored person Ob monitored by the sensor device SU having that sensor ID; a detection information display area 522 for displaying the reception time of the second monitoring information communication signal received in the process S11 (or the detection time of the predetermined action) and the detection result contained in the second monitoring information communication signal received in the process S11; an image area 523 for displaying the image contained in the second monitoring information communication signal received in the process S11 (that is, the target image captured by the sensor device SU having the sensor ID; here, a still image); a “corresponding” button 524; a “speak” button 525; and a “view LIVE” button 526.
  • In order to display the installation location of the sensor device SU and the name of the monitored person Ob in the monitored person name area 521, the TA storage unit 33 stores in advance the sensor ID, the installation location of the sensor device SU having that sensor ID, and the name of the monitored person Ob monitored by the sensor device SU having that sensor ID in association with one another.
  • In this embodiment, the detection results are named getting up, getting out of bed, falling over, and falling from the bed, and the detection result is displayed with an icon that symbolically represents the detection result.
  • the TA storage unit 33 stores each action and an icon representative of the action in association with each other in advance.
  • the detection information display area 522 displays a wake-up icon that symbolizes wake-up.
  • The “corresponding” button 524 is a button for inputting, to the mobile terminal device TAa, implementation intention information indicating that the user of the mobile terminal device TAa intends to perform a predetermined response (handling, dealing) such as lifesaving, nursing, care, or assistance with respect to the detection result displayed on the monitoring information screen 52.
  • The “speak” button 525 is a button for requesting a voice call, that is, a button for inputting an instruction to connect the sensor device SU having the sensor ID and the mobile terminal device TAa via the network NW so that they can talk.
  • The “view LIVE” button 526 is a button for requesting live video, that is, a button for inputting an instruction to display the moving image captured by the sensor device SU having the sensor ID.
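  The three buttons map naturally onto three terminal-side actions. The Python sketch below shows one hypothetical way to dispatch a touched button to the corresponding handler; the handler names and the dispatcher itself are illustrative assumptions, not part of the patent.

```python
# Sketch of dispatching the monitoring information screen's button input to
# the matching actions ("corresponding", "speak", "view LIVE"); the handler
# names and the dispatcher are assumptions for illustration.
from typing import Callable, Dict


def make_button_dispatcher(send_correspondence_acceptance: Callable[[str], None],
                           start_voice_call: Callable[[str], None],
                           request_live_video: Callable[[str], None]) -> Callable[[str, str], None]:
    handlers: Dict[str, Callable[[str], None]] = {
        "corresponding": send_correspondence_acceptance,  # button 524
        "speak": start_voice_call,                        # button 525
        "view LIVE": request_live_video,                  # button 526
    }

    def on_button(button_name: str, sensor_id: str) -> None:
        handler = handlers.get(button_name)
        if handler is not None:
            handler(sensor_id)

    return on_button


if __name__ == "__main__":
    dispatch = make_button_dispatcher(
        lambda sid: print("correspondence accepted for", sid),
        lambda sid: print("voice call requested to", sid),
        lambda sid: print("live video requested from", sid),
    )
    dispatch("speak", "SU-1")  # -> voice call requested to SU-1
```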
  • On the other hand, the mobile terminal device TAa stores (records), in the TA monitoring information storage unit 331, the monitoring information related to the monitoring of the monitored person Ob contained in the second nurse call notification communication signal received from the management server device SV in the process S11, by the TA monitoring processing unit 322 of the TA control processing unit 32a. The TA monitoring processing unit 322 then displays, on the TA display unit 36, a nurse call acceptance screen (not shown) indicating the acceptance of a nurse call, which is stored in advance in the TA storage unit 33, in accordance with the nurse call reception information contained in the second nurse call notification communication signal received in the process S11 (S16).
  • Next, the mobile terminal device TAa determines, by the TA control processing unit 32a, whether or not an input operation has been accepted on the touch panel constituted by the TA input unit 35 and the TA display unit 36 (S17). If, as a result of this determination, no input operation has been accepted (No), the mobile terminal device TAa returns the process to the process S17; if an input operation has been accepted (Yes), the mobile terminal device TAa executes the next process S18.
  • the portable terminal apparatus TAa performs an appropriate process according to the content of the input operation by the TA control processing unit 32a, and ends this process.
  • For example, when the mobile terminal device TAa accepts an input operation of the “corresponding” button 524 by the TA control processing unit 32a (that is, when it accepts the intention to respond), it adds the fact that “corresponding” has been accepted to the monitoring information of the monitored person Ob currently displayed on the TA display unit 36 and stores this in the TA monitoring information storage unit 331, and transmits to the management server device SV a communication signal (correspondence acceptance notification communication signal) containing the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 and information (correspondence acceptance information) indicating that “corresponding” has been accepted.
  • When the mobile terminal device TAa accepts an input operation of the “speak” button 525, the TA call processing unit 323 transmits a communication signal (call request communication signal) containing information such as a request for a voice call to the sensor device SU that monitors the monitored person Ob displayed on the TA display unit 36, and the mobile terminal device TAa is connected to that sensor device SU via the network NW so that a voice call can be made. As a result, a voice call can be performed between the mobile terminal device TAa and the sensor device SU.
  • the control for the first and second electromechanical converters 341 and 342 executed when the request for a call is received will be described later.
  • When the voice call is to be ended (for example, when an input operation of the “end” button is accepted), the mobile terminal device TAa transmits, by the TA call processing unit 323, a communication signal (call end communication signal) containing information such as a request to end the voice call to the sensor device SU that monitors the monitored person Ob displayed on the TA display unit 36. As a result, the voice call between the mobile terminal device TAa and the sensor device SU is terminated.
  • When the mobile terminal device TAa accepts an input operation of the “view LIVE” button 526, the TA streaming processing unit 324 transmits a communication signal (moving image distribution request communication signal) containing information such as a request for moving image distribution to the sensor device SU that monitors the monitored person Ob currently displayed on the TA display unit 36, is connected to that sensor device SU via the network NW so that the moving image can be downloaded, receives the distribution of the live moving image from the sensor device SU, and displays the distributed moving image on the TA display unit 36 by streaming reproduction. In this display, the moving image is displayed in the image area 523 instead of the still image, and a “live end” button (not shown) is displayed instead of the “view LIVE” button 526. In this way, the live moving image is displayed on the mobile terminal device TAa.
  • The “live end” button (not shown) is a button for requesting the end of the moving image, that is, a button for inputting an instruction to end (stop) the distribution of the moving image captured by the sensor device SU having the sensor ID and to end (stop) its display.
  • When the mobile terminal device TAa accepts an input operation of the “live end” button, the TA streaming processing unit 324 transmits a communication signal (moving image distribution end communication signal) containing information such as a request to end the moving image distribution to the sensor device SU that monitors the monitored person Ob currently displayed on the TA display unit 36, and a still image is displayed on the TA display unit 36. The mobile terminal device TAa thereby ends the live video display.
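  Starting and stopping the live distribution thus comes down to sending a request signal and an end signal to the sensor device. The Python sketch below illustrates this with a plain socket carrying a JSON line; the transport, port, and field names are assumptions for illustration.

```python
# Sketch of starting and ending the live video distribution: a "moving image
# distribution request" is sent to the sensor device, and a "moving image
# distribution end" request stops it. The socket/JSON transport is assumed.
import json
import socket


def request_live_video(sensor_host: str, port: int, start: bool) -> None:
    signal = {
        "signal": "video_distribution_request" if start else "video_distribution_end",
    }
    with socket.create_connection((sensor_host, port), timeout=5.0) as sock:
        sock.sendall((json.dumps(signal) + "\n").encode("utf-8"))

# Usage (hypothetical address and port):
# request_live_video("sensor-su-1.local", 5005, start=True)   # begin streaming
# request_live_video("sensor-su-1.local", 5005, start=False)  # stop streaming
```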
  • The mobile terminal device TAa operates as described above with respect to the detection results and the nurse call reception notifications (re-notifications) received from the sensor device SU via the management server device SV.
  • While operating as described above, the mobile terminal device TAa operates as follows with respect to the control of the first and second electromechanical conversion units 341 and 342.
  • When the TA control processing unit 32a displays an image captured by the imaging unit 11 of the sensor device SU on the TA display unit 36, the mobile terminal device TAa further determines whether a request for a call with the sensor device SU is detected (S21). In this determination, in the present embodiment, the TA control processing unit 32a performs a first determination process in which the TA control unit 321a determines whether or not an input operation of the “speak” button 525 has been accepted, a second determination process in which the separation/contact determination unit 325 determines whether or not the person is separated based on the first sensor output of the separation/contact sensor unit 38, and a third determination process in which the posture determination unit 326 determines, based on the second sensor output of the attitude sensor unit 39, whether or not the attitude of the mobile terminal device TAa is such that the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction.
  • When at least one of the first to third determination processes gives a positive result, that is, when the input operation of the “speak” button 525 has been accepted, when the person is separated, or when the mobile terminal device TAa is in a posture in which the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction, the TA control processing unit 32a determines that the call has been requested (Yes) and executes the next process S22. On the other hand, when the input operation of the “speak” button 525 has not been accepted, the person is not separated, and the mobile terminal device TAa is in a posture in which the extension direction of the line segment LN is closer to the vertical direction than to the horizontal direction, the TA control processing unit 32a determines that there is no request for the call (No) and executes the next process S25.
  • The TA control processing unit 32a controls, by the TA control unit 321a, the first electromechanical conversion unit 341 to stop its operation (S22). The TA control processing unit 32a then controls, by the TA control unit 321a, the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece (S23).
  • the first and second electromechanical converters 341 and 342 operate in a so-called speakerphone mode through the processes S22 and S23.
  • Next, the TA control processing unit 32a determines whether or not the call has ended by the TA call processing unit 323 (S24). More specifically, in the present embodiment, the TA control processing unit 32a determines whether or not the TA call processing unit 323 has accepted an input operation of the “end” button. If, as a result of this determination, an input operation of the “end” button has been accepted, the TA control processing unit 32a determines that the call has ended (Yes) and then executes the process S25. On the other hand, if the input operation of the “end” button has not been accepted, the TA control processing unit 32a determines that the call has not ended (No) and returns the process to the process S24.
  • the process S24 is repeated until the call ends.
  • In the process S24, the TA control processing unit 32a determines whether or not the call has ended, but it may instead determine whether or not the use of the second electromechanical conversion unit 342 as the speakerphone has ended. In this case, the process S24 is repeated until the use as the speakerphone is finished, and when the use as the speakerphone is finished, the process S25 is executed.
  • In the process S25, the TA control processing unit 32a controls, by the TA control unit 321a, the first electromechanical conversion unit 341 to operate as the earpiece. The TA control processing unit 32a then controls, by the TA control unit 321a, the second electromechanical conversion unit 342 to operate as the mouthpiece (S26).
  • the first and second electromechanical converters 341 and 342 operate in the default mode by the processes S25 and S26.
  • the TA control processing unit 32a determines whether or not the operation of the mobile terminal device TAa is finished (S27). As a result of this determination, for example, when the operation is ended due to an operation of turning off the power switch or the like (Yes), the TA control processing unit 32a ends this processing. On the other hand, if the result of the determination is that the operation has not ended (No), the TA control processing unit 32a returns the process to step S21.
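  Put together, the first aspect (processes S21 to S27) is an OR of three call-request checks followed by a mode switch until the call ends. The Python sketch below captures that control flow; the AudioFrontend class and the callback parameters are assumptions standing in for the units 341/342 and the determination units, not the actual firmware.

```python
# Sketch of the first-aspect control flow (S21 to S27): any one of the three
# determination processes counts as a call request and switches the audio to
# speakerphone mode until the call ends. Interfaces are assumed for illustration.
import time
from typing import Callable


class AudioFrontend:
    def enter_speakerphone_mode(self) -> None:
        # stop the first electromechanical conversion unit 341 and operate the
        # second unit 342 as both mouthpiece and earpiece (S22, S23)
        print("speakerphone mode")

    def enter_default_mode(self) -> None:
        # operate unit 341 as the earpiece and unit 342 as the mouthpiece (S25, S26)
        print("default mode")


def control_loop_first_aspect(speak_button_pressed: Callable[[], bool],
                              person_separated: Callable[[], bool],
                              posture_horizontal: Callable[[], bool],
                              call_ended: Callable[[], bool],
                              terminal_running: Callable[[], bool],
                              audio: AudioFrontend) -> None:
    while terminal_running():                        # S27: repeat until the terminal stops
        call_requested = (speak_button_pressed()     # first determination process
                          or person_separated()      # second determination process
                          or posture_horizontal())   # third determination process
        if call_requested:                           # S21: Yes
            audio.enter_speakerphone_mode()          # S22, S23
            while not call_ended():                  # S24: wait for the call to end
                time.sleep(0.1)
        audio.enter_default_mode()                   # S25, S26
        time.sleep(0.1)
```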
  • As described above, the monitored person monitoring system MS, the terminal devices SP and TAa, and the operation control method implemented therein display, on the TA display unit 36, the image captured by the imaging unit 11 of the sensor device SU, and include the TA control unit 321a that, when the call request is detected, controls the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece. Therefore, as shown in FIG. 9B for example, even if the call is not performed with the first electromechanical conversion unit 341 of the terminal devices SP and TAa brought close to the ear and the second electromechanical conversion unit 342 brought close to the mouth, speech can be transmitted and received through the second electromechanical conversion unit 342. The monitored person monitoring system MS, the terminal devices SP and TAa, and the operation control method therefore make it possible to talk with the monitored person Ob while viewing the image.
  • Since the TA input unit 35 constituting the touch panel accepts the call request from the monitor (user), the call request of the monitor can be detected directly.
  • The monitored person monitoring system MS, the terminal devices SP and TAa, and the operation control method include the separation/contact sensor unit 38 and the separation/contact determination unit 325, so the separation/contact determination unit 325 can automatically detect the call request of the monitor.
  • The monitored person monitoring system MS, the terminal devices SP and TAa, and the operation control method also include the posture sensor unit 39 and the posture determination unit 326, so the posture sensor unit 39 and the posture determination unit 326 can automatically detect the call request of the monitor.
  • In the embodiment described above, when the call request is detected, the first electromechanical conversion unit 341 stops its operation and the second electromechanical conversion unit 342 is controlled to operate as both the mouthpiece and the earpiece. Alternatively, only when the determination result of the first determination process is positive and the determination result of at least one of the second and third determination processes is also positive may the first electromechanical conversion unit 341 be controlled to stop its operation and the second electromechanical conversion unit 342 be controlled to operate as both the mouthpiece and the earpiece.
  • The mobile terminal device TAb in such a modification includes a TA communication IF unit 31, a TA control processing unit 32b, a TA storage unit 33, a TA sound input/output unit 34, a TA input unit 35, a TA display unit 36, a TA IF unit 37, a separation/contact sensor unit 38, and an attitude sensor unit 39. The TA communication IF unit 31, the TA storage unit 33, the TA sound input/output unit 34, the TA input unit 35, the TA display unit 36, the TA IF unit 37, the separation/contact sensor unit 38, and the attitude sensor unit 39 are the same as those of the mobile terminal device TAa, respectively, and therefore their description is omitted.
  • The TA control processing unit 32b is a circuit for controlling each unit of the mobile terminal device TAb according to its function, receiving and displaying the monitoring information for the monitored person Ob, displaying a nurse call, and answering and making a call.
  • By executing the control processing program, the TA control processing unit 32b of this modification functionally includes a TA control unit 321b, a TA monitoring processing unit 322, a TA call processing unit 323, a TA streaming processing unit 324, a separation/contact determination unit 325, and a posture determination unit 326.
  • The TA monitoring processing unit 322, the TA call processing unit 323, the TA streaming processing unit 324, the separation/contact determination unit 325, and the posture determination unit 326 of the TA control processing unit 32b of this modification are the same as the TA monitoring processing unit 322, the TA call processing unit 323, the TA streaming processing unit 324, the separation/contact determination unit 325, and the posture determination unit 326 of the TA control processing unit 32a described above, respectively, and therefore their description is omitted.
  • the TA control unit 321b controls each part of the mobile terminal device TAb according to the function of each part, and controls the entire mobile terminal device TAb.
  • When, as a result of the first to third determination processes, the determination result of the first determination process is positive and the determination result of at least one of the second and third determination processes is also positive, the TA control unit 321b stops the operation of the first electromechanical conversion unit 341 and controls the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece. That is, when the TA control unit 321b accepts an input operation of the “speak” button 525 at the TA input unit 35 constituting the touch panel and the call request is further determined by at least one of the separation/contact determination unit 325 and the posture determination unit 326, it determines that the final call request has been detected, stops the operation of the first electromechanical conversion unit 341, and controls the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece.
  • FIG. 10 is a flowchart illustrating the operation of the second aspect regarding the control of the first and second electromechanical conversion units of the portable terminal device in the monitored person monitoring system of the embodiment.
  • When the TA control unit 321b of the TA control processing unit 32b displays an image captured by the imaging unit 11 of the sensor device SU on the TA display unit 36, the mobile terminal device TAb further performs the first determination process of determining whether or not an input operation of the “speak” button 525 has been accepted (S31). The separation/contact determination unit 325 of the TA control processing unit 32b then acquires the first sensor output of the separation/contact sensor unit 38 (S32).
  • the portable terminal device TAb determines whether or not the person is separated based on the first sensor output of the separation / contact sensor unit 38 acquired in step S32 as the second determination process by the separation / contact determination unit 325 ( S33). If the result of this determination is that the person is separated (Yes), the portable terminal device TAb executes the process S36 by the TA control processing unit 32b. On the other hand, when the result of the determination is that the person is not separated (No), the portable terminal device TAb executes the process S34 by the TA control processing unit 32b. Instead of determining whether or not the person is separated, the portable terminal device TAb may determine whether or not the person is close as the second determination process by the separation / contact determination unit 325.
  • In this case, if the person is close (Yes), the mobile terminal device TAb executes the process S34 by the TA control processing unit 32b, and if the person is not close (No), the mobile terminal device TAb executes the process S36 by the TA control processing unit 32b.
  • In the process S34, the mobile terminal device TAb acquires the second sensor output of the attitude sensor unit 39 by the attitude determination unit 326 of the TA control processing unit 32b.
  • Next, as the third determination process by the attitude determination unit 326, the mobile terminal device TAb determines, based on the second sensor output of the attitude sensor unit 39 acquired in the process S34, whether or not the attitude of the mobile terminal device TAb is such that the extension direction of the line segment LN connecting the first arrangement position of the first electromechanical conversion unit 341 and the second arrangement position of the second electromechanical conversion unit 342 is closer to the horizontal direction than to the vertical direction (S35). If, as a result of this determination, the attitude of the mobile terminal device TAb is such that the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction (Yes), the mobile terminal device TAb executes the process S36 by the TA control processing unit 32b. Otherwise (No), the mobile terminal device TAb executes the process S39 by the TA control processing unit 32b.
• similarly to process S22, the TA control processing unit 32b, by the TA control unit 321b, stops the operation of the first electromechanical conversion unit 341 (S36), and then, similarly to process S23, controls the second electromechanical conversion unit 342 to operate as the mouthpiece and the earpiece (S37).
• through processes S36 and S37, the first and second electromechanical conversion units 341 and 342 operate in a so-called speakerphone mode.
• similarly to process S24, the TA control processing unit 32b determines, by the TA call processing unit 323, whether the call has ended (S38). If the result of this determination is that the call has ended (Yes), the TA control processing unit 32b next executes process S39. On the other hand, if the result of the determination is that the call has not ended (No), the TA control processing unit 32b returns to process S38. That is, process S38 is repeated until the call ends.
• similarly to process S25, the TA control processing unit 32b, by the TA control unit 321b, controls the first electromechanical conversion unit 341 to operate as the earpiece (S39), and, similarly to process S26, controls the second electromechanical conversion unit 342 to operate as the mouthpiece (S40).
• through processes S39 and S40, the first and second electromechanical conversion units 341 and 342 operate in the default mode.
• as a modification, processes S32 and S33 may be omitted, process S34 being executed after process S31, with the other processes executed in the same manner as described above.
• as another modification, processes S34 and S35 may be omitted; in that case, when the person is determined not to be separated in process S33 (No), process S39 is executed, with the other processes executed in the same manner as described above.
• as yet another modification, when the person is determined to be separated in process S33 (Yes), process S34 may be executed, and when the person is determined not to be separated (No), process S39 may be executed, with the other processes executed in the same manner as described above.
• according to the monitored person monitoring system MS, the terminal devices SP and TAb, and the operation control method implemented therein in such a modification, when an input operation of the "speak" button 525 is accepted and a voice call is executed, it is possible to automatically distinguish between the case where the user wants to talk while looking at the image on the TA display unit 36 and the case where the user wants to talk without looking at the image on the TA display unit 36 (see, for example, FIG. 9B), and the second electromechanical conversion unit 342 can be controlled to operate as the mouthpiece and the earpiece only when the user wants to talk while viewing the image on the TA display unit 36 (a simplified code sketch of this control flow is given after this summary).
• a terminal device according to one aspect is a terminal device communicably connected to a sensor device including an imaging unit that performs imaging and a calling unit that performs a call, the terminal device including: a display unit that performs display; a first electromechanical conversion unit that converts an electrical signal into a mechanical vibration signal and operates as an earpiece; a second electromechanical conversion unit that converts between an electrical signal and a mechanical vibration signal and operates as a mouthpiece; a call detection unit that detects a request for a call with the call unit of the sensor device; and a control unit that receives the detection result of the call detection unit and controls each of the display unit, the first electromechanical conversion unit, and the second electromechanical conversion unit. When the control unit displays an image captured by the imaging unit of the sensor device on the display unit and the call detection unit further detects the call request, the control unit stops the operation of the first electromechanical conversion unit and controls the second electromechanical conversion unit to operate as the mouthpiece and the earpiece.
• since such a terminal device includes the control unit that controls the second electromechanical conversion unit to operate as the mouthpiece and the earpiece when an image captured by the imaging unit of the sensor device is displayed on the display unit and the call detection unit further detects a call request, the first electromechanical conversion unit does not need to be brought close to the ear and the second electromechanical conversion unit does not need to be brought close to the mouth for the call; the second electromechanical conversion unit alone can send and receive the speech. The terminal device can therefore talk with the monitored person while viewing the image.
  • the call detection unit includes a call request input unit that receives the call request.
  • Such a terminal device can accept a call request by a supervisor (user) at a call request input unit, and can directly detect a call request by a supervisor.
• in the above terminal device, the call detection unit includes a separation/contact sensor unit that detects the separation and approach of a person, and a separation/contact determination unit that determines the separation of the person, based on the first sensor output of the separation/contact sensor unit, as the request for the call.
  • the separation sensor unit is a capacitive human sensor.
  • the separation sensor unit is an infrared human sensor.
• such a terminal device determines the separation or approach of a person based on the first sensor output of the separation/contact sensor unit, and treats the case where the person is determined to be separated as the request for the call. The terminal device can therefore automatically detect the supervisor's call request by means of the separation/contact sensor unit and the separation/contact determination unit.
• in the above terminal device, the call detection unit includes a posture sensor unit that detects the posture of the terminal device, and a posture determination unit that determines, based on the second sensor output of the posture sensor unit, the case where the posture of the terminal device is a posture in which the extending direction of the line segment connecting the first arrangement position of the first electromechanical conversion unit and the second arrangement position of the second electromechanical conversion unit is closer to the horizontal direction than to the vertical direction as the request for the call.
  • the posture sensor unit is a gyro sensor.
• such a terminal device determines the posture of the terminal device based on the second sensor output of the posture sensor unit, and treats the case where the posture of the terminal device is a posture in which the extending direction of the line segment connecting the first arrangement position of the first electromechanical conversion unit and the second arrangement position of the second electromechanical conversion unit is closer to the horizontal direction than to the vertical direction as the request for the call. The terminal device can therefore automatically detect the supervisor's call request by means of the posture sensor unit and the posture determination unit.
• in the above terminal device, the call detection unit includes a call request input unit that receives the call request, and at least one of: a separation/contact sensor unit that detects the separation and approach of a person, together with a separation/contact determination unit that determines the separation of the person, based on the first sensor output of the separation/contact sensor unit, as the request for the call; and a posture sensor unit that detects the posture of the terminal device, together with a posture determination unit that determines, based on the second sensor output of the posture sensor unit, the case where the extending direction of the line segment connecting the first arrangement position of the first electromechanical conversion unit and the second arrangement position of the second electromechanical conversion unit is closer to the horizontal direction than to the vertical direction as the request for the call. When the call request is received at the call request input unit and, further, the call request is detected by the at least one of the separation/contact determination unit and the posture determination unit, the control unit stops the operation of the first electromechanical conversion unit and controls the second electromechanical conversion unit to operate as the mouthpiece and the earpiece.
• such a terminal device can automatically distinguish between the case where the user wants to make a call while looking at the image on the display unit and the case where the user wants to make a call without looking at the image on the display unit, and can control the second electromechanical conversion unit to operate as the mouthpiece and the earpiece only in the case where the user wants to make a call while looking at the image on the display unit.
• an operation control method for a terminal device according to another aspect controls the operation of a terminal device communicably connected to a sensor device including an imaging unit that performs imaging and a calling unit that performs a call. The method includes a call detection step of detecting a request for a call with the call unit of the sensor device, and a control step of controlling a display unit that performs display, a first electromechanical conversion unit that converts an electrical signal into a mechanical vibration signal and operates as an earpiece, and a second electromechanical conversion unit that converts between an electrical signal and a mechanical vibration signal and operates as a mouthpiece. In the control step, when an image captured by the imaging unit of the sensor device is displayed on the display unit and the call request is detected in the call detection step, the second electromechanical conversion unit is controlled to operate as the mouthpiece and the earpiece. With such an operation control method, a call with the monitored person can therefore be made while viewing the image.
• a monitored person monitoring system according to still another aspect includes a sensor device including an imaging unit that performs imaging and a calling unit that performs a call, and a terminal device communicably connected to the sensor device; the system detects a predetermined action of a monitored person, who is the monitoring target, based on an image captured by the imaging unit and notifies the terminal device of the detection result, wherein the terminal device is any of the terminal devices described above.
• since such a monitored person monitoring system includes any of the above-described terminal devices, it is possible to talk with the monitored person while viewing the image.
• according to the present invention, it is possible to provide a terminal device, an operation control method for the terminal device, and a monitored person monitoring system that allow the user to talk with the monitored person while viewing an image.
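The control flow summarized above can be illustrated with a minimal Python sketch. This is not code from the patent: the class names, the 45-degree tilt threshold, and the way the sensor outputs are modeled are all assumptions made for illustration. It only shows the decision of the second aspect, in which the "speak" input together with either the separation determination or the near-horizontal posture determination switches the two electromechanical conversion units into speakerphone mode while an image from the sensor device is being displayed.

```python
# Minimal sketch of the call-mode decision described above (second aspect).
# All class and field names are illustrative; they do not appear in the patent.
import math
from dataclasses import dataclass

@dataclass
class SensorReadings:
    speak_button_pressed: bool   # "speak" button input on the touch panel
    person_is_near: bool         # separation/contact (proximity) sensor output
    gravity: tuple               # (x, y, z) from the posture sensor (e.g. accelerometer)

def line_segment_closer_to_horizontal(gravity, axis=(0.0, 1.0, 0.0)) -> bool:
    """True when the line segment joining the two electromechanical conversion
    units (assumed to lie along the device's long axis `axis`) is closer to the
    horizontal direction than to the vertical direction."""
    gx, gy, gz = gravity
    g_norm = math.sqrt(gx * gx + gy * gy + gz * gz) or 1.0
    # |cos| between the device long axis and gravity: small value -> axis is nearly horizontal.
    cos_to_vertical = abs(gx * axis[0] + gy * axis[1] + gz * axis[2]) / g_norm
    return cos_to_vertical < math.cos(math.radians(45))

def call_request_detected(r: SensorReadings) -> bool:
    # The "speak" input is required; either the person being away from the device
    # or the near-horizontal posture also counts as a detected call request.
    return r.speak_button_pressed and (not r.person_is_near
                                       or line_segment_closer_to_horizontal(r.gravity))

def select_call_mode(displaying_sensor_image: bool, r: SensorReadings) -> str:
    if displaying_sensor_image and call_request_detected(r):
        # Corresponds to processes S36/S37: stop the first conversion unit and let
        # the second one work as both mouthpiece and earpiece (speakerphone mode).
        return "speakerphone"
    # Corresponds to processes S39/S40: first unit = earpiece, second unit = mouthpiece.
    return "default"

if __name__ == "__main__":
    readings = SensorReadings(True, False, (0.1, 0.2, 9.7))  # device held nearly flat
    print(select_call_mode(displaying_sensor_image=True, r=readings))  # -> "speakerphone"
```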

Abstract

In a terminal device, a terminal device operation control method, and a monitored person monitoring system according to the present invention, a first electromechanical transducer unit is controlled to operate as an ear piece and a second electromechanical transducer element is controlled to operate as a mouth piece. When an image captured by a sensor device is being displayed on a display unit, the second electromechanical transducer unit, upon detection of a request for a call, is controlled to operate as the mouth piece and the ear piece.

Description

Terminal device, operation control method of terminal device, and monitored person monitoring system
The present invention relates to a terminal device suitably used in a monitored person monitoring system for monitoring a monitored person who is a monitoring target to be monitored, an operation control method for the terminal device, and the monitored person monitoring system.
Japan has become an aging society, more specifically a super-aging society in which the aging rate, that is, the ratio of the population aged 65 or over to the total population, exceeds 21%, owing to the rise in living standards, the improvement of sanitary conditions, and the advances in medical care that accompanied the post-war period of high economic growth. In 2005, the population aged 65 or over was about 25.56 million out of a total population of about 127.65 million, and it is predicted that in 2020 the elderly population will be about 34.56 million out of a total population of about 124.11 million. In such an aging society, the number of people who need nursing or care because of illness, injury, old age, and the like (hereinafter, persons requiring nursing care and the like) is expected to increase more than in an ordinary, non-aging society. Japan is also a society with a declining birthrate, with a total fertility rate of 1.43 in 2013, for example. As a result, elderly-to-elderly care, in which an elderly family member (a spouse, child, or sibling) cares for an elderly person requiring nursing care, has also arisen.
Persons requiring nursing care and the like enter hospitals or welfare facilities for the elderly (under Japanese law, short-term stay facilities for the elderly, nursing homes for the elderly, special nursing homes for the elderly, and the like) and receive nursing or care there. In such facilities, situations can arise in which a person requiring nursing care is injured, for example by falling from a bed or falling while walking, or gets out of bed and wanders off. It is necessary to respond to such situations as quickly as possible; if they are left unattended, they may develop into even more serious situations. For this reason, in such facilities, nurses, caregivers, and the like confirm the residents' safety and condition by making regular rounds.
However, the increase in the number of nurses and caregivers has not kept up with the increase in the number of persons requiring nursing care, and the nursing and care industries suffer from a chronic shortage of staff. Furthermore, because fewer nurses and caregivers are on duty during the semi-night and night shifts than during the day shift, the workload per person increases, and a reduction of this workload is therefore demanded. The situation of elderly-to-elderly care is no exception in such facilities, and it is often seen that elderly nurses or caregivers look after elderly persons requiring nursing care. In general, physical strength declines with age, so even healthy elderly staff bear a heavier burden than young nurses and caregivers, and their movements and judgments also become slower.
In order to alleviate this shortage of staff and the burden on nurses and caregivers, technologies that supplement nursing and caregiving work are required. For this reason, monitored person monitoring techniques for monitoring a monitored person who is a monitoring target, such as a person requiring nursing care, have been researched and developed in recent years.
One such technique is, for example, the nurse call system disclosed in Patent Document 1. The nurse call system disclosed in Patent Document 1 includes a nurse call slave unit installed on a bed for a patient to call a nurse, and a nurse call master unit installed in a nurse station for answering a call from the nurse call slave unit. The system includes a camera that images the patient on the bed from above the bed, and state determination means that determines, from the image captured by the camera, the occurrence of at least one of a state in which the patient has raised the upper body and a state in which the patient has left the bed, and that outputs an attention state occurrence signal; the nurse call master unit includes notification means that performs a notification operation upon receiving the attention state occurrence signal. The nurse call system further includes a portable terminal carried by a nurse to answer calls from the nurse call slave unit, and communication control means that, upon receiving the attention state occurrence signal, transmits the image captured by the camera to the portable terminal.
On the other hand, from the viewpoint of safety confirmation, a person living alone is in the same position as a person requiring nursing care and the like, and is likewise a person to be monitored.
When the video from the camera is displayed on a terminal device as in the nurse call system disclosed in Patent Document 1, a supervisor such as a nurse can visually grasp the situation of the monitored person, such as a person requiring nursing care, which is convenient. However, when the supervisor talks with the monitored person, the supervisor normally brings the earpiece of the terminal device close to the ear and the mouthpiece close to the mouth, and therefore can no longer see the camera video displayed on the terminal device.
JP 2014-90913 A
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a terminal device capable of talking with a monitored person while viewing an image, an operation control method for the terminal device, and a monitored person monitoring system.
In the terminal device, the operation control method for the terminal device, and the monitored person monitoring system according to the present invention, the first electromechanical conversion unit is controlled to operate as an earpiece and the second electromechanical conversion unit is controlled to operate as a mouthpiece; on the other hand, when an image captured by the sensor device is being displayed on the display unit and a request for a call is further detected, the second electromechanical conversion unit is controlled to operate as the mouthpiece and the earpiece.
The above and other objects, features, and advantages of the present invention will become apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a diagram showing the configuration of the monitored person monitoring system in an embodiment.
FIG. 2 is a diagram showing the configuration of the sensor device in the monitored person monitoring system.
FIG. 3 is a diagram showing the appearance of the portable terminal device in the monitored person monitoring system.
FIG. 4 is a diagram showing the configuration of the portable terminal device in the monitored person monitoring system.
FIG. 5 is a flowchart showing the operation of the portable terminal device in the monitored person monitoring system with respect to monitoring information.
FIG. 6 is a flowchart showing the operation of a first aspect of the control of the first and second electromechanical conversion units of the portable terminal device in the monitored person monitoring system.
FIG. 7 is a diagram showing an example of a standby screen displayed on the portable terminal device in the monitored person monitoring system.
FIG. 8 is a diagram showing an example of a monitoring information screen displayed on the portable terminal device in the monitored person monitoring system.
FIG. 9 is a diagram for explaining the call modes of the portable terminal device in the monitored person monitoring system.
FIG. 10 is a flowchart showing the operation of a second aspect of the control of the first and second electromechanical conversion units of the portable terminal device in the monitored person monitoring system.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. In the drawings, components denoted by the same reference signs are identical, and duplicate description thereof is omitted as appropriate. In this specification, a reference sign without a suffix is used when components are referred to collectively, and a reference sign with a suffix is used when an individual component is referred to.
The monitored person monitoring system in the embodiment monitors a monitored person (watched person) Ob who is a monitoring target (watching target) to be monitored (watched over). The system includes a sensor device having an imaging unit that performs imaging and a calling unit that performs a call, and a terminal device communicably connected to the sensor device; it detects a predetermined action of the monitored person, who is the monitoring target, based on an image captured by the imaging unit, and notifies the terminal device of the detection result. The terminal device in such a monitored person monitoring system includes a display unit that performs display, a first electromechanical conversion unit that converts an electrical signal into a mechanical vibration signal and operates as an earpiece, a second electromechanical conversion unit that converts between an electrical signal and a mechanical vibration signal and operates as a mouthpiece, a call detection unit that detects a request for a call with the call unit of the sensor device, and a control unit that receives the detection result of the call detection unit and controls each of the display unit, the first electromechanical conversion unit, and the second electromechanical conversion unit. When an image captured by the imaging unit of the sensor device is displayed on the display unit and the call detection unit further detects the request for the call, the control unit controls the second electromechanical conversion unit to operate as the mouthpiece and the earpiece. The terminal device may be a single type of device, but in the following description the terminal device comprises two types of devices, a fixed terminal device and a portable terminal device. The main difference between them is that the fixed terminal device is operated in a fixed manner, whereas the portable terminal device is carried and operated by a supervisor (user) such as a nurse or a caregiver. Since the fixed terminal device and the portable terminal device are substantially the same, the portable terminal device will mainly be described below.
FIG. 1 is a diagram showing the configuration of the monitored person monitoring system in the embodiment. FIG. 2 is a diagram showing the configuration of the sensor device in the monitored person monitoring system of the embodiment. FIG. 3 is a diagram showing the appearance of the portable terminal device in the monitored person monitoring system of the embodiment. FIG. 4 is a diagram showing the configuration of the portable terminal device in the monitored person monitoring system of the embodiment.
More specifically, as shown in FIG. 1, the monitored person monitoring system MS includes, for example, one or more sensor devices SU (SU-1 to SU-4), a management server device SV, a fixed terminal device SP, one or more portable terminal devices TAa (TAa-1, TAa-2), and a private branch exchange (PBX) CX, which are communicably connected to one another, by wire or wirelessly, via a network (communication line) NW such as a LAN (Local Area Network). The network NW may include relay devices that relay communication signals, such as repeaters, bridges, and routers. In the example shown in FIG. 1, the plurality of sensor devices SU-1 to SU-4, the management server device SV, the fixed terminal device SP, the plurality of portable terminal devices TAa-1 and TAa-2, and the private branch exchange CX are communicably connected to one another by a mixed wired and wireless LAN (for example, a LAN in accordance with the IEEE 802.11 standard) NW including an L2-switch line concentrator (hub) LS and an access point AP. More specifically, the sensor devices SU-1 to SU-4, the management server device SV, the fixed terminal device SP, and the private branch exchange CX are connected to the line concentrator LS, and the portable terminal devices TAa-1 and TAa-2 are connected to the line concentrator LS via the access point AP. The network NW constitutes a so-called intranet by using an Internet protocol suite such as TCP (Transmission Control Protocol) and IP (Internet Protocol).
The monitored person monitoring system MS is arranged at a location appropriate to the monitored person Ob. The monitored person (person to be watched over) Ob is, for example, a person who needs nursing because of illness or injury, a person who needs care because of a decline in physical ability, or a person living alone. In particular, from the viewpoint of enabling early detection and early response, the monitored person Ob is preferably a person for whom detection is required when a predetermined inconvenient event, such as an abnormal state, occurs. For this reason, the monitored person monitoring system MS is suitably installed in a building such as a hospital, a welfare facility for the elderly, or a dwelling, depending on the type of the monitored person Ob. In the example shown in FIG. 1, the monitored person monitoring system MS is installed in the building of a care facility that includes a plurality of rooms RM occupied by a plurality of monitored persons Ob and a plurality of other rooms such as a nurse station.
The private branch exchange (line switching device) CX is connected to the network NW and controls extension calls between the portable terminal devices TAa, such as outgoing calls, incoming calls, and conversations, thereby providing extension telephony between the portable terminal devices TAa. The private branch exchange CX is also connected to an outside telephone TL, such as a fixed telephone or a mobile telephone, via a public telephone network PN such as a fixed telephone network or a mobile telephone network, and controls outside calls between the outside telephone TL and the portable terminal devices TAa, such as outgoing calls, incoming calls, and conversations, thereby providing outside telephony between the outside telephone TL and the portable terminal devices TAa. The private branch exchange CX is, for example, a digital exchange or an IP-PBX.
The sensor device SU has, among others, a communication function for communicating with the other devices SV, SP, and TA via the network NW. It detects a predetermined action of the monitored person Ob and reports the detection result to the management server device SV, accepts a nurse call and notifies the management server device SV to that effect, makes voice calls with the terminal devices SP and TA, and generates images including moving images and distributes the moving images to the terminal devices SP and TA. As shown in FIG. 2, such a sensor device SU includes, for example, an imaging unit 11, a sensor-side sound input/output unit (SU sound input/output unit) 12, a nurse call acceptance operation unit 13, a sensor-side control processing unit (SU control processing unit) 14, a sensor-side communication interface unit (SU communication IF unit) 15, and a sensor-side storage unit (SU storage unit) 16.
The imaging unit 11 is a device that is connected to the SU control processing unit 14 and, under the control of the SU control processing unit 14, performs imaging and generates images (image data). The images include still images (still image data) and moving images (moving image data). The imaging unit 11 is arranged so as to be able to monitor the space where the monitored person Ob, who is the monitoring target to be monitored, is expected to be located (the location space; in the example shown in FIG. 1, the room RM where the sensor device is installed). It images this location space from above as the imaging target, generates an image (image data) overlooking the imaging target, and outputs the image of the imaging target (target image) to the SU control processing unit 14. Preferably, because this increases the probability that the whole of the monitored person Ob can be imaged, the imaging unit 11 is arranged so that it can image the imaging target from directly above a preset expected head position (usually the position where the pillow is placed), where the head of the monitored person Ob is expected to lie in the bedding (for example, a bed) on which the monitored person Ob lies. Using the imaging unit 11, the sensor device SU acquires an image of the monitored person Ob captured from above the monitored person Ob, preferably an image captured from directly above the expected head position.
Such an imaging unit 11 may be a device that generates a visible-light image, but in the present embodiment it is a device that generates an infrared image so that the monitored person Ob can be monitored even in relative darkness. In the present embodiment, for example, such an imaging unit 11 is a digital infrared camera including an imaging optical system that forms an infrared optical image of the imaging target on a predetermined imaging plane, an image sensor whose light-receiving surface is aligned with the imaging plane and which converts the infrared optical image of the imaging target into an electrical signal, and an image processing unit that generates image data representing the infrared image of the imaging target by processing the output of the image sensor. In the present embodiment, the imaging optical system of the imaging unit 11 is preferably a wide-angle optical system (a so-called wide-angle lens, including a fisheye lens) having an angle of view capable of imaging the whole of the room RM in which it is installed.
The SU sound input/output unit 12 is a circuit that inputs and outputs sound. That is, the SU sound input/output unit 12 is connected to the SU control processing unit 14 and is a circuit for generating and outputting sound corresponding to an electrical signal representing sound under the control of the SU control processing unit 14, and for acquiring external sound and inputting it to the sensor device SU. The SU sound input/output unit 12 includes, for example, a speaker that converts an electrical sound signal (sound data) into a mechanical vibration signal of sound (acoustic signal), and a microphone that converts a mechanical vibration signal of sound in the audible range into an electrical signal. The SU sound input/output unit 12 outputs an electrical signal representing external sound to the SU control processing unit 14, and converts an electrical signal input from the SU control processing unit 14 into a mechanical vibration signal of sound and outputs it.
The nurse call acceptance operation unit 13 is connected to the SU control processing unit 14 and is a switch circuit, such as a push-button switch, for inputting a nurse call to the sensor device SU. The nurse call acceptance operation unit 13 may be connected to the SU control processing unit 14 by wire, or may be connected to the SU control processing unit 14 by short-range wireless communication such as the Bluetooth (registered trademark) standard.
The SU communication IF unit 15 is a communication circuit that is connected to the SU control processing unit 14 and performs communication under the control of the SU control processing unit 14. The SU communication IF unit 15 generates a communication signal containing the data to be transferred that is input from the SU control processing unit 14, in accordance with the communication protocol used in the network NW of the monitored person monitoring system MS, and transmits the generated communication signal to the other devices SV, SP, and TA via the network NW. The SU communication IF unit 15 receives communication signals from the other devices SV, SP, and TA via the network NW, extracts the data from the received communication signals, converts the extracted data into a format that the SU control processing unit 14 can process, and outputs it to the SU control processing unit 14. The SU communication IF unit 15 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
The SU storage unit 16 is a circuit that is connected to the SU control processing unit 14 and stores various predetermined programs and various predetermined data under the control of the SU control processing unit 14. The various predetermined programs include control processing programs such as an SU control program for controlling each part of the sensor device SU according to the function of that part, and an SU monitoring processing program for executing predetermined information processing related to the monitoring of the monitored person Ob. The SU monitoring processing program includes a behavior detection processing program that detects a predetermined action of the monitored person Ob and notifies the detection result to the predetermined terminal devices SP and TA via the management server device SV, an SU nurse call processing program that, when a nurse call is accepted by the nurse call acceptance operation unit 13, notifies the management server device SV to that effect and conducts a voice call with the terminal devices SP and TA by using the SU sound input/output unit 12 and the like, and an SU streaming processing program that distributes, by streaming, the moving image generated by the imaging unit 11 to the terminal device SP or TA that requested the moving image. The various predetermined data include data necessary for executing each program, such as a sensor device identifier (sensor ID), which is an identifier for specifying and identifying the sensor device SU itself, and the communication address of the management server device SV. The SU storage unit 16 includes, for example, a ROM (Read Only Memory), which is a non-volatile storage element, and an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable non-volatile storage element. The SU storage unit 16 also includes a RAM (Random Access Memory) or the like that serves as the working memory of the SU control processing unit 14 and stores data generated during execution of the predetermined programs.
The SU control processing unit 14 is a circuit for controlling each part of the sensor device SU according to the function of that part, detecting a predetermined action of the monitored person Ob and reporting the detection result to the management server device SV, accepting a nurse call and notifying the management server device SV to that effect, conducting voice calls with the terminal devices SP and TA, and generating images including moving images and distributing the moving images to the terminal devices SP and TA. The SU control processing unit 14 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits. By executing the control processing programs, the SU control processing unit 14 functionally includes a sensor-side control unit (SU control unit) 141, a behavior detection processing unit 142, a sensor-side nurse call processing unit (SU nurse call processing unit) 143, and a sensor-side streaming processing unit (SU streaming processing unit) 144.
The SU control unit 141 controls each part of the sensor device SU according to the function of that part and governs the overall control of the sensor device SU.
The behavior detection processing unit 142 detects a predetermined action (state, situation) of the monitored person Ob, set in advance, based on the image, and reports (notifies, transmits) it to the management server device SV together with the image. More specifically, in the present embodiment the predetermined actions are four actions: rising, in which the monitored person Ob gets up; leaving the bed, in which the monitored person Ob moves away from the bedding; falling from the bed, in which the monitored person Ob falls off the bedding; and falling down, in which the monitored person Ob falls over. The behavior detection processing unit 142, for example, detects the head of the monitored person Ob based on the target image captured by the imaging unit 11, and detects the rising, leaving the bed, falling from the bed, and falling down of the monitored person Ob based on the temporal change in the size of the detected head. More specifically, the location area of the bedding BT and first to third thresholds Th1 to Th3 are stored in advance in the SU storage unit 16 as part of the various predetermined data. The first threshold Th1 is a value for distinguishing, within the location area of the bedding BT, the head size of a lying posture from the head size of a sitting posture. The second threshold Th2 is a value for determining whether the head size corresponds to a standing posture in the room RM excluding the location area of the bedding BT. The third threshold Th3 is a value for determining whether the head size corresponds to a lying posture in the room RM excluding the location area of the bedding BT.
The behavior detection processing unit 142 extracts a moving-body region from the target image as the human-body region of the monitored person Ob, for example by the background subtraction method or the frame difference method. Next, the behavior detection processing unit 142 extracts the head region of the monitored person Ob from the extracted moving-body region, for example by a circular or elliptical Hough transform, by pattern matching using a head model prepared in advance, or by a neural network trained for head detection. The behavior detection processing unit 142 then detects rising, leaving the bed, falling from the bed, and falling down from the position and size of the extracted head. For example, when the position of the extracted head is within the location area of the bedding BT and the size of the extracted head changes over time from the size of the lying posture to the size of the sitting posture as determined using the first threshold Th1, the behavior detection processing unit 142 judges that the monitored person has risen and detects the rising. For example, when the position of the extracted head changes over time from inside the location area of the bedding BT to outside it and the size of the extracted head changes over time from a certain size to the size of the standing posture as determined using the second threshold Th2, the behavior detection processing unit 142 judges that the monitored person has left the bed and detects the leaving. For example, when the position of the extracted head changes over time from inside the location area of the bedding BT to outside it and the size of the extracted head changes over time from a certain size to the size of the lying posture as determined using the third threshold Th3, the behavior detection processing unit 142 judges that the monitored person has fallen from the bed and detects the fall. For example, when the position of the extracted head is in the room RM excluding the location area of the bedding BT and the size of the extracted head changes over time to the size of the lying posture as determined using the third threshold Th3, the behavior detection processing unit 142 judges that the monitored person has fallen down and detects the fall. When the predetermined action is detected in this way, the behavior detection processing unit 142 reports the detection result and the target image used to obtain it to the management server device SV through the SU communication IF unit 15. More specifically, the behavior detection processing unit 142 transmits, to the management server device SV via the SU communication IF unit 15, a communication signal (first monitoring information communication signal) containing its own sensor ID, the detection result (in the present embodiment, one or more of rising, leaving the bed, falling from the bed, and falling down), and the target image used for detecting the predetermined action. The image may be at least one of a still image and a moving image; in the present embodiment, as described later, a still image is reported first and a moving image is distributed in response to a user request. Alternatively, a moving image may be distributed first, or a still image and a moving image may be transmitted and displayed on the terminal devices SP and TAa using a split screen.
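As a concrete illustration of the threshold-based classification just described, the following Python sketch classifies a pair of successive head observations. It is only a sketch under stated assumptions: the head extraction itself (background subtraction, Hough transform, or a trained detector) is stubbed out, the threshold values are invented, and it assumes an overhead camera for which a head closer to the camera appears larger; none of the names are taken from the patent.

```python
# Simplified sketch of the threshold-based behaviour classification. It assumes
# the head has already been extracted from the overhead image and that a head
# closer to the ceiling camera appears larger. Thresholds and names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeadObservation:
    size: float          # apparent head size in the image (e.g. radius in pixels)
    in_bed_area: bool    # whether the head position lies inside the stored bedding region

TH1_SITTING = 30.0   # Th1: lying-size vs sitting-size head inside the bedding area
TH2_STANDING = 45.0  # Th2: head size regarded as a standing posture outside the bedding area
TH3_LYING = 15.0     # Th3: head size regarded as a lying posture outside the bedding area

def classify(prev: HeadObservation, curr: HeadObservation) -> Optional[str]:
    """Return 'rise', 'leave_bed', 'fall_from_bed', 'fall_down', or None."""
    # Rising: inside the bed area, head grows from lying size to sitting size (Th1).
    if prev.in_bed_area and curr.in_bed_area and prev.size <= TH1_SITTING < curr.size:
        return "rise"
    # Leaving the bed: head moves out of the bed area and reaches standing size (Th2).
    if prev.in_bed_area and not curr.in_bed_area and curr.size >= TH2_STANDING:
        return "leave_bed"
    # Falling from the bed: head moves out of the bed area and shrinks to lying size (Th3).
    if prev.in_bed_area and not curr.in_bed_area and curr.size <= TH3_LYING:
        return "fall_from_bed"
    # Falling down: head already outside the bed area shrinks to lying size (Th3).
    if not prev.in_bed_area and not curr.in_bed_area and curr.size <= TH3_LYING:
        return "fall_down"
    return None

if __name__ == "__main__":
    before = HeadObservation(size=22.0, in_bed_area=True)   # lying in bed
    after = HeadObservation(size=35.0, in_bed_area=True)    # sat up
    print(classify(before, after))  # -> "rise"
```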
When a nurse call is accepted by the nurse call acceptance operation unit 13, the SU nurse call processing unit 143 notifies the management server device SV to that effect and conducts a voice call with the terminal devices SP and TAa by using the SU sound input/output unit 12 and the like. More specifically, when the nurse call acceptance operation unit 13 is operated, the SU nurse call processing unit 143 transmits, to the management server device SV via the SU communication IF unit 15, a first nurse call notification communication signal containing its own sensor ID and nurse call acceptance information indicating that a nurse call has been accepted. The SU nurse call processing unit 143 then conducts a voice call with the terminal devices SP and TAa, for example by VoIP (Voice over Internet Protocol), using the SU sound input/output unit 12 and the like.
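The notification path from the sensor device to the management server can be pictured with the following small sketch. The use of JSON over a length-prefixed TCP connection, the field names, and the address and port are assumptions made for illustration; the patent only states that the first nurse call notification communication signal carries the sensor ID and the nurse-call acceptance information, and that the subsequent voice call uses VoIP (not shown here).

```python
# Minimal sketch: sending the "first nurse call notification communication
# signal" to the management server. Field names, port number, and JSON over
# TCP are assumptions for illustration only.
import json
import socket

SENSOR_ID = "SU-1"
MANAGEMENT_SERVER = ("192.0.2.10", 5000)  # placeholder address/port

def send_nurse_call_notification() -> None:
    message = {
        "type": "nurse_call_notification",
        "sensor_id": SENSOR_ID,
        "nurse_call_accepted": True,   # nurse-call acceptance information
    }
    payload = json.dumps(message).encode("utf-8")
    with socket.create_connection(MANAGEMENT_SERVER, timeout=5) as sock:
        # Length-prefix the payload so the receiver can frame the message.
        sock.sendall(len(payload).to_bytes(4, "big") + payload)

if __name__ == "__main__":
    send_nurse_call_notification()
    # The subsequent voice call with the terminal device would be carried over
    # VoIP (e.g. SIP/RTP) and is outside the scope of this sketch.
```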
When there is a request for distribution of a moving image from the fixed terminal device SP or the portable terminal device TAa via the SU communication IF unit 15, the SU streaming processing unit 144 distributes the moving image generated by the imaging unit 11 (for example, a live moving image) to the requesting fixed terminal device SP or portable terminal device TAa by streaming via the SU communication IF unit 15.
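On-demand distribution of the moving image can likewise be sketched very simply: a terminal connects, and the sensor device pushes frames until the terminal disconnects. The length-prefixed framing and the camera stub are illustrative assumptions; an actual implementation would more likely use an established streaming protocol such as RTSP/RTP or HTTP-based streaming.

```python
# Very small on-demand streaming sketch: a terminal connects, and the sensor
# device pushes length-prefixed frames until the client disconnects.
import socket
import time

def capture_frame() -> bytes:
    """Stub for the imaging unit; would return one encoded (e.g. JPEG) frame."""
    return b"\xff\xd8...frame bytes...\xff\xd9"

def serve_stream(host: str = "0.0.0.0", port: int = 6000, fps: float = 5.0) -> None:
    with socket.create_server((host, port)) as server:
        conn, addr = server.accept()          # a terminal device requested the video
        with conn:
            while True:
                frame = capture_frame()
                header = len(frame).to_bytes(4, "big")
                try:
                    conn.sendall(header + frame)
                except (BrokenPipeError, ConnectionResetError):
                    break                     # the terminal stopped watching
                time.sleep(1.0 / fps)

if __name__ == "__main__":
    serve_stream()
```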
FIG. 1 shows, as an example, four sensor devices, a first to a fourth sensor device SU-1 to SU-4. The first sensor device SU-1 is installed in the room RM-1 (not shown) of Mr. A, Ob-1, one of the monitored persons Ob; the second sensor device SU-2 is installed in the room RM-2 (not shown) of Mr. B, Ob-2, one of the monitored persons Ob; the third sensor device SU-3 is installed in the room RM-3 (not shown) of Mr. C, Ob-3, one of the monitored persons Ob; and the fourth sensor device SU-4 is installed in the room RM-4 (not shown) of Mr. D, Ob-4, one of the monitored persons Ob.
The sensor device SU corresponds to an example of a sensor device including an imaging unit that performs imaging and a calling unit that performs a call; the imaging unit 11 corresponds to an example of the imaging unit, and the SU sound input/output unit 12 and the SU nurse call processing unit 143 correspond to an example of the calling unit.
The management server device SV has a communication function for communicating with the other devices SU, TAa, and SP via the network NW. It receives the detection result and the target image relating to the monitored person Ob from the sensor device SU, manages information on the monitoring of the monitored person Ob (monitoring information), and reports (re-notifies, transmits) the received detection result and target image relating to the monitored person Ob to the predetermined terminal devices SP and TAa. More specifically, the management server device SV stores in advance the correspondence between the reporting sensor device SU (sensor ID) and the terminal devices SP and TAa (terminal IDs) that are the notification destinations (re-notification destinations) of the report (notification destination correspondence), and the correspondence between each device SU, SP, TAa (each ID) and its communication address (communication address correspondence). A terminal ID is an identifier for specifying and identifying a terminal device SP or TAa. When the management server device SV receives a first monitoring information communication signal, it stores (records), as monitoring information of the monitored person Ob, the sensor device that is the report source (transmission source) of the received first monitoring information communication signal and the data contained in the received signal in association with each other. The management server device SV then identifies, from the notification destination correspondence, the notification destination terminal devices SP and TAa corresponding to the report-source sensor device SU of the received first monitoring information communication signal, and transmits a second monitoring information communication signal to these notification destination terminal devices SP and TAa.
The second monitoring information communication signal contains the sensor ID, the detection result, and the target image contained in the received first monitoring information communication signal, and, as the download destination of the moving image, the communication address corresponding to the sensor device SU having the sensor ID contained in the received first monitoring information communication signal. The communication address is obtained from the communication address correspondence. Similarly, when the management server device SV receives a first nurse call notification communication signal, it stores (records), as monitoring information of the monitored person Ob, the report-source sensor device of the received first nurse call notification communication signal and the data contained in the received signal in association with each other. The management server device SV then identifies, from the notification destination correspondence, the notification destination terminal devices SP and TAa corresponding to the report-source sensor device of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to these notification destination terminal devices SP and TAa. The second nurse call notification communication signal contains the sensor ID and the nurse call acceptance information contained in the received first nurse call notification communication signal. The second nurse call notification communication signal may also contain, as the download destination of the moving image, the communication address corresponding to the sensor device SU having the sensor ID contained in the received first nurse call notification communication signal. The management server device SV provides clients (in the present embodiment, the terminal devices SP, TAa, and the like) with data in response to their requests. Such a management server device SV can be configured, for example, by a computer with a communication function.
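The re-notification performed by the management server can be sketched as a lookup in the two stored correspondences followed by construction of the second monitoring information communication signal. The dictionaries, field names, and the injected send function are illustrative assumptions; the patent specifies only which pieces of information the signal carries.

```python
# Sketch of how the management server might re-notify terminals using the two
# stored correspondences. Dictionaries and field names are illustrative.
NOTIFY_DESTINATIONS = {          # notification destination correspondence (sensor ID -> terminal IDs)
    "SU-1": ["SP-1", "TA-1", "TA-2"],
}
COMM_ADDRESSES = {               # communication address correspondence (device ID -> address)
    "SU-1": "192.0.2.21",
    "SP-1": "192.0.2.31",
    "TA-1": "192.0.2.41",
    "TA-2": "192.0.2.42",
}

def build_second_monitoring_signal(first_signal: dict) -> dict:
    sensor_id = first_signal["sensor_id"]
    return {
        "sensor_id": sensor_id,
        "detection_result": first_signal["detection_result"],  # e.g. "rise", "fall_down"
        "target_image": first_signal["target_image"],
        # Download destination for the live video: the reporting sensor device's address.
        "video_download_address": COMM_ADDRESSES[sensor_id],
    }

def forward(first_signal: dict, send) -> None:
    """Re-notify every terminal mapped to the reporting sensor
    (recording of the data as monitoring information is omitted here)."""
    second_signal = build_second_monitoring_signal(first_signal)
    for terminal_id in NOTIFY_DESTINATIONS.get(first_signal["sensor_id"], []):
        send(COMM_ADDRESSES[terminal_id], second_signal)

if __name__ == "__main__":
    forward(
        {"sensor_id": "SU-1", "detection_result": "rise", "target_image": b"..."},
        send=lambda addr, msg: print("->", addr, msg["detection_result"]),
    )
```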
 The fixed terminal device SP has a communication function for communicating with the other devices SU, SV, and TAa via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, and so on. It functions as a user interface (UI) of the monitored person monitoring system MS by, for example, inputting predetermined instructions and data to be given to the management server device SV and the portable terminal device TAa, and displaying the monitoring information obtained by the sensor devices SU. Such a fixed terminal device SP can be configured by, for example, a computer with a communication function. The fixed terminal device SP, which is one example of the terminal device, operates in the same manner as the portable terminal device TAa; in this specification, however, the portable terminal device TAa, which is another example of the terminal device, is described.
 The portable terminal device TAa has a communication function for communicating with the other devices SV, SP, and SU via the network NW, a display function for displaying predetermined information, an input function for inputting predetermined instructions and data, a call function for performing voice calls, and so on. It is a device for inputting predetermined instructions and data to be given to the management server device SV and the sensor devices SU, for displaying, in response to a report from the management server device SV, the monitoring information (including moving images) obtained by the sensor devices SU, and for answering a nurse call or speaking to the monitored person Ob by a voice call with the sensor device SU.
 In the present embodiment, as shown in FIG. 4 for example, such a portable terminal device TAa includes a terminal-side communication interface unit (TA communication IF unit) 31, a terminal-side control processing unit (TA control processing unit) 32a, a terminal-side storage unit (TA storage unit) 33, a terminal-side sound input/output unit (TA sound input/output unit) 34, a terminal-side input unit (TA input unit) 35, a terminal-side display unit (TA display unit) 36, a terminal-side interface unit (TAIF unit) 37, a separation/contact sensor unit 38, and a posture sensor unit 39. As shown in FIG. 3 for example, the TA communication IF unit 31, the TA control processing unit 32a, the TA storage unit 33, the TA sound input/output unit 34, the TA input unit 35, the TA display unit 36, the TAIF unit 37, the separation/contact sensor unit 38, and the posture sensor unit 39 are housed in a thin rectangular-parallelepiped housing HS. On one main surface of the housing HS, the display surface of the TA display unit 36 is arranged at a substantially central position so as to face the outside. At one end with respect to the display surface of the TA display unit 36 (the upper end in the example shown in FIG. 3), a first electromechanical conversion unit 341, described later, of the TA sound input/output unit 34 is arranged so that sound can pass to and from the outside, and at the other end (the lower end in the example shown in FIG. 3), a second electromechanical conversion unit 342, described later, of the TA sound input/output unit 34 is arranged so that sound can pass to and from the outside. The separation/contact sensor unit 38 is arranged near the first electromechanical conversion unit 341 at the one end (the upper end in the example shown in FIG. 3). Thus, the first and second electromechanical conversion units 341 and 342 are disposed at the respective ends of the display surface of the TA display unit 36 on the one main surface of the housing HS. For this reason, in a usage mode in which the first electromechanical conversion unit 341 is placed against or near the ear and the second electromechanical conversion unit 342 is brought close to the mouth, the user (monitoring person) of the portable terminal device TAa cannot see the image displayed on the TA display unit 36.
 Like the SU communication IF unit 15, the TA communication IF unit 31 is a communication circuit that is connected to the TA control processing unit 32a and performs communication under the control of the TA control processing unit 32a. The TA communication IF unit 31 includes, for example, a communication interface circuit conforming to the IEEE 802.11 standard or the like.
 Like the SU sound input/output unit 12, the TA sound input/output unit 34 is a circuit that is connected to the TA control processing unit 32a, generates and outputs a sound corresponding to an electric signal representing sound under the control of the TA control processing unit 32a, and acquires external sound and inputs it to the portable terminal device TAa. More specifically, the TA sound input/output unit 34 includes the first and second electromechanical conversion units 341 and 342. The first electromechanical conversion unit 341 is a circuit that converts an electric signal of sound (sound data) into a mechanical vibration signal (acoustic signal) under the control of the TA control processing unit 32a and operates as an earpiece; it is, for example, a speaker such as a voice coil speaker or a piezoelectric speaker. The second electromechanical conversion unit 342 is a circuit that converts between an electric signal and a mechanical vibration signal under the control of the TA control processing unit 32a and operates as a mouthpiece. In the present embodiment, as described later, the second electromechanical conversion unit 342 also operates as both a mouthpiece and an earpiece according to the control of the TA control processing unit 32a. For this reason, in one aspect, the second electromechanical conversion unit 342 includes two devices: a microphone such as a moving coil microphone or a piezoelectric microphone, and a speaker such as a voice coil speaker or a piezoelectric speaker. In another aspect, since a dynamic device such as a voice coil speaker or a moving coil microphone functions as both a speaker and a microphone, the second electromechanical conversion unit 342 may be configured with such a single device.
 The TA input unit 35 is connected to the TA control processing unit 32a and is a circuit that accepts predetermined operations and inputs them to the portable terminal device TAa; it is, for example, a plurality of input switches to which predetermined functions are assigned. The predetermined operations include various operations necessary for monitoring, such as an operation of inputting an ID for logging in, an operation of requesting a voice call and an operation of ending it, an operation of requesting a live moving image and an operation of ending it, and an operation of inputting an indication that the user intends to carry out a response (handling, attending) such as lifesaving, nursing, caregiving, or assistance ("respond") for the reported monitored person Ob. The TA display unit 36 is connected to the TA control processing unit 32a and is a circuit that, under the control of the TA control processing unit 32a, displays the predetermined operation content input from the TA input unit 35 and the monitoring information concerning the monitoring of the monitored person Ob monitored by the monitored person monitoring system MS (for example, the type of predetermined behavior detected by the sensor device SU, images (still images and moving images) of the monitored person Ob, acceptance of a nurse call, and the like); it is a display device such as an LCD (liquid crystal display) or an organic EL display. In the present embodiment, the TA input unit 35 and the TA display unit 36 constitute a touch panel. In this case, the TA input unit 35 is a position input device that detects and inputs an operated position by, for example, a resistive film method or a capacitance method. In this touch panel, the position input device is provided on the display surface of the TA display unit 36, and one or more candidates of input content that can be input are displayed on the TA display unit 36; when a user (monitoring person) such as a nurse or a caregiver touches the display position of the input content he or she wishes to input, the position is detected by the position input device, and the display content displayed at the detected position is input to the portable terminal device TAa as the user's operation input content.
 The TAIF unit 37 is a circuit that is connected to the TA control processing unit 32a and performs input and output of data with external devices under the control of the TA control processing unit 32a; it is, for example, an interface circuit using the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA standard, or an interface circuit using the USB standard.
 The separation/contact sensor unit 38 is a circuit that is connected to the TA control processing unit 32a and detects the approach and separation of a person; it is, for example, a capacitance-type human detection sensor or an infrared-type human detection sensor. A capacitance-type human detection sensor has a metal detection panel, outputs a change in the capacitance of the capacitor formed by the panel and the human body as a voltage change, and outputs a signal whose level corresponds to the degree of approach or separation of the human body. An infrared-type human detection sensor has, for example, a pyroelectric element, receives infrared rays radiated from a person, and outputs a signal whose level corresponds to the intensity of the received infrared rays. The separation/contact sensor unit 38 outputs its sensor output (first sensor output) to the TA control processing unit 32a.
 The posture sensor unit 39 is a circuit that is connected to the TA control processing unit 32a and detects the posture of the portable terminal device TAa; it is, for example, a gyro sensor that measures angular velocity. The posture sensor unit 39 outputs its sensor output (second sensor output) to the TA control processing unit 32a.
 The TA storage unit 33 is a circuit that is connected to the TA control processing unit 32a and stores various predetermined programs and various predetermined data under the control of the TA control processing unit 32a. The various predetermined programs include control processing programs such as: a TA control program for controlling each unit of the portable terminal device TAa according to the function of the unit; a TA monitoring processing program for storing (recording) the monitoring information concerning the monitoring of the monitored person Ob, such as the detection result and the nurse call received from the sensor device SU via the management server device SV, and for displaying the detection result and the nurse call; a TA call processing program for performing a voice call with the sensor device SU by using the TA sound input/output unit 34 and the like; a streaming processing program for receiving delivery of a moving image from the sensor device SU and displaying the delivered moving image on the TA display unit 36 by streaming reproduction; a separation/contact determination program for determining, on the basis of the first sensor output of the separation/contact sensor unit 38, the separation of a person as a request for the call; and a posture determination program for determining, on the basis of the second sensor output of the posture sensor unit 39, the case where the posture of the portable terminal device TAa is such that the extension direction of the line segment LN (see FIG. 3) connecting the first disposition position of the first electromechanical conversion unit 341 and the second disposition position of the second electromechanical conversion unit 342 is closer to the horizontal direction than to the vertical direction, as a request for the call. The various predetermined data include data necessary for executing each of these programs, such as the terminal ID of the device itself, screen information displayed on the TA display unit 36, and the monitoring information concerning the monitoring of the monitored person Ob. The TA storage unit 33 includes, for example, a ROM and an EEPROM. The TA storage unit 33 also includes a RAM or the like serving as a so-called working memory of the TA control processing unit 32a, which stores data generated during the execution of the predetermined programs. The TA storage unit 33 functionally includes a terminal-side monitoring information storage unit (TA monitoring information storage unit) 331 for storing the monitoring information.
 The TA monitoring information storage unit 331 stores the monitoring information of the monitored person Ob transmitted to and received from each of the devices SV, SP, and SU. More specifically, in the present embodiment, the TA monitoring information storage unit 331 stores, as the monitoring information, the sensor ID, the detection result, the target image, and the communication address of the sensor device SU serving as the download destination for the moving image contained in the second monitoring information communication signal received from the management server device SV, together with the reception time of the second monitoring information communication signal and the like, in association with one another, and stores the sensor ID and the nurse call acceptance information contained in the second nurse call notification communication signal received from the management server device SV, together with the reception time of the second nurse call notification communication signal and the like, in association with one another.
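As a sketch of what one entry of the TA monitoring information storage unit 331 might hold, the following data class groups the fields listed above. The class and field names are assumptions introduced only for illustration; they are not terms defined in the specification.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MonitoringRecord:
    """One stored entry of the (assumed) TA monitoring information storage unit 331."""
    sensor_id: str                                 # sensor device that reported
    received_at: datetime                          # reception time of the second signal
    detection_result: Optional[str] = None         # e.g. getting up / leaving bed / fall
    target_image: Optional[bytes] = None           # still image from the sensor device
    video_download_address: Optional[str] = None   # where to stream the live video from
    nurse_call: bool = False                       # True for a nurse call acceptance entry

# a detection entry and a nurse call entry stored side by side
records = [
    MonitoringRecord("SU-001", datetime.now(), "getting up", b"...", "192.168.1.10"),
    MonitoringRecord("SU-001", datetime.now(), nurse_call=True),
]
print(records)
```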
 The TA control processing unit 32a is a circuit for controlling each unit of the portable terminal device TAa according to the function of the unit, receiving and displaying the monitoring information concerning the monitored person Ob, and answering a nurse call or speaking to the monitored person. The TA control processing unit 32a includes, for example, a CPU and its peripheral circuits. By executing the control processing programs, the TA control processing unit 32a functionally includes a terminal-side control unit (TA control unit) 321a, a terminal-side monitoring processing unit (TA monitoring processing unit) 322, a terminal-side call processing unit (TA call processing unit) 323, a terminal-side streaming processing unit (TA streaming processing unit) 324, a separation/contact determination unit 325, and a posture determination unit 326.
 The TA control unit 321a controls each unit of the portable terminal device TAa according to the function of the unit and governs the overall control of the portable terminal device TAa. In the present embodiment, when an image captured by the imaging unit 11 of the sensor device SU is displayed on the TA display unit 36 and, in addition, a request for a call with the sensor device SU is detected, the TA control unit 321a stops the operation of the first electromechanical conversion unit 341 of the TA sound input/output unit 34 and controls the second electromechanical conversion unit 342 to operate as both a mouthpiece and an earpiece.
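A minimal sketch of this switching behavior follows, assuming a very simple audio-routing interface. The Earpiece and SpeakerMic classes and the function name are invented placeholders for the first and second electromechanical conversion units 341 and 342; they are not part of the specification.

```python
class Earpiece:
    """Placeholder for the first electromechanical conversion unit 341 (receiver only)."""
    def __init__(self):
        self.enabled = True
    def stop(self):
        self.enabled = False

class SpeakerMic:
    """Placeholder for the second electromechanical conversion unit 342."""
    def __init__(self):
        self.mode = "mic_only"          # default: mouthpiece only
    def use_as_speaker_and_mic(self):
        self.mode = "speaker_and_mic"   # speakerphone-like mode

def on_state_change(showing_sensor_image: bool, call_requested: bool,
                    earpiece: Earpiece, speaker_mic: SpeakerMic):
    """Switch the audio routing when a call request is detected while the
    sensor image is being displayed, as described for the TA control unit 321a."""
    if showing_sensor_image and call_requested:
        earpiece.stop()                       # stop unit 341
        speaker_mic.use_as_speaker_and_mic()  # unit 342 serves as both mouthpiece and earpiece

e, s = Earpiece(), SpeakerMic()
on_state_change(showing_sensor_image=True, call_requested=True, earpiece=e, speaker_mic=s)
print(e.enabled, s.mode)   # False speaker_and_mic
```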
 The TA monitoring processing unit 322 stores (records) the monitoring information concerning the monitoring of the monitored person Ob, such as the detection result and the nurse call received from the sensor device SU via the management server device SV, and displays the detection result and the nurse call. More specifically, upon receiving a second monitoring information communication signal from the management server device SV, the TA monitoring processing unit 322 stores (records) the monitoring information of the monitored person Ob contained in the received second monitoring information communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 displays, on the TA display unit 36, a screen corresponding to each piece of information contained in the received second monitoring information communication signal. Upon receiving a second nurse call notification communication signal from the management server device SV, the TA monitoring processing unit 322 stores (records) the monitoring information of the monitored person Ob contained in the received second nurse call notification communication signal in the TA monitoring information storage unit 331. The TA monitoring processing unit 322 displays, on the TA display unit 36, a nurse call acceptance screen stored in advance in the TA storage unit 33, in accordance with the nurse call acceptance information contained in the received second nurse call notification communication signal. Then, upon accepting a predetermined input operation from the TA input unit 35, the TA monitoring processing unit 322 executes predetermined processing corresponding to the input operation.
 The TA call processing unit 323 performs a voice call with a sensor device SU by using the TA sound input/output unit 34 and the like. More specifically, using the TA sound input/output unit 34 and the like, the TA call processing unit 323 performs a voice call, for example by VoIP, with the notifying sensor device SU that transmitted the first monitoring information communication signal or the first nurse call notification communication signal to the management server device SV, or with a sensor device SU selected and designated by the user (monitoring person) of the portable terminal device TAa.
 The TA streaming processing unit 324 receives delivery of a moving image from a sensor device SU and displays the delivered moving image on the TA display unit 36 by streaming reproduction.
 The separation/contact determination unit 325 determines, on the basis of the first sensor output of the separation/contact sensor unit 38, the separation of a person as a request for a call with the sensor device SU. More specifically, the separation/contact determination unit 325 determines, on the basis of the first sensor output of the separation/contact sensor unit 38, whether or not a person has moved away, and when it determines as a result that a person has moved away, it determines that the call is requested. For example, when the separation/contact sensor unit 38 is configured with a capacitance-type human detection sensor, a relatively large capacitance and voltage are generated in the capacitor formed by the metal panel of the sensor and the human body when a person approaches the sensor, and the sensor outputs a relatively high-level signal; when a person moves away from the sensor, a relatively small capacitance and voltage are generated, and the sensor outputs a relatively low-level signal. Therefore, a threshold value (separation/contact threshold value) for discriminating between the approach and the separation of a person is set in advance, and the separation/contact determination unit 325 compares the first sensor output of the capacitance-type human detection sensor with the separation/contact threshold value; as a result of this comparison, when the first sensor output is equal to or greater than the threshold value, it determines that a person is in proximity (not separated), and when the first sensor output is less than the separation/contact threshold value, it determines that a person has moved away, that is, it determines that the call is requested. Also, for example, when the separation/contact sensor unit 38 is configured with an infrared-type human detection sensor, the sensor receives relatively strong infrared rays and outputs a relatively high-level signal when a person approaches, and receives relatively weak infrared rays and outputs a relatively low-level signal when a person moves away. Therefore, a threshold value (separation/contact threshold value) for discriminating between the approach and the separation of a person is set in advance, and the separation/contact determination unit 325 compares the first sensor output of the infrared-type human detection sensor with the separation/contact threshold value; as a result of this comparison, when the first sensor output is equal to or greater than the threshold value, it determines that a person is in proximity (not separated), and when the first sensor output is less than the separation/contact threshold value, it determines that a person has moved away, that is, it determines that the call is requested.
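The threshold comparison performed by the separation/contact determination unit 325 is the same for both sensor types: a first sensor output at or above the separation/contact threshold means proximity, and a value below it means separation, i.e. a call request. A minimal sketch follows; the concrete threshold value and the function name are assumptions for illustration.

```python
SEPARATION_THRESHOLD = 0.5   # assumed separation/contact threshold (sensor-dependent in practice)

def is_call_requested_by_separation(first_sensor_output: float,
                                    threshold: float = SEPARATION_THRESHOLD) -> bool:
    """Return True when the person has moved away from the terminal,
    which unit 325 treats as a request for the call."""
    # output >= threshold  -> person in proximity (not a call request)
    # output <  threshold  -> person separated    (call request)
    return first_sensor_output < threshold

print(is_call_requested_by_separation(0.8))  # False: person close to the sensor panel
print(is_call_requested_by_separation(0.2))  # True: person has moved the terminal away
```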
 The posture determination unit 326 determines, on the basis of the second sensor output of the posture sensor unit 39, the case where the posture of the portable terminal device TAa is such that the extension direction of the line segment LN connecting the first disposition position of the first electromechanical conversion unit 341 and the second disposition position of the second electromechanical conversion unit 342 is closer to the horizontal direction than to the vertical direction, as a request for the call. More specifically, the posture determination unit 326 determines, on the basis of the second sensor output of the posture sensor unit 39, whether or not the posture of the portable terminal device TAa is such that the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction, and when it determines as a result that the extension direction of the line segment is closer to the horizontal direction than to the vertical direction, it determines that the call is requested. More specifically, when the horizontal direction is defined as 0 degrees and the vertical direction is defined as 90 degrees, the posture determination unit 326 compares the angle between the extension direction of the line segment LN and the horizontal direction, obtained on the basis of the second sensor output of the posture sensor unit 39, with a threshold value (posture threshold value) for discriminating whether the direction is closer to the horizontal direction than to the vertical direction; as a result of this comparison, when the angle between the extension direction of the line segment LN and the horizontal direction is equal to or greater than the threshold value, it determines that the direction is not closer to the horizontal direction than to the vertical direction (it is closer to the vertical direction than to the horizontal direction), and when the angle between the extension direction of the line segment LN and the horizontal direction is less than the threshold value, it determines that the direction is closer to the horizontal direction than to the vertical direction, that is, it determines that the call is requested. The posture threshold value is appropriately set to, for example, 45 degrees, 30 degrees, or 20 degrees.
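The posture check reduces to comparing the angle between the line segment LN and the horizontal direction with the posture threshold (e.g. 45, 30, or 20 degrees). A minimal sketch of that comparison follows; how the angle is derived from the gyro sensor output is omitted and assumed to be given, and the function name is an assumption.

```python
POSTURE_THRESHOLD_DEG = 45.0   # posture threshold; 30 or 20 degrees are also possible settings

def is_call_requested_by_posture(angle_from_horizontal_deg: float,
                                 threshold_deg: float = POSTURE_THRESHOLD_DEG) -> bool:
    """Return True when the line segment LN (earpiece-to-mouthpiece axis) lies
    closer to horizontal than to vertical, which unit 326 treats as a call request.
    Horizontal is defined as 0 degrees, vertical as 90 degrees."""
    # angle >= threshold -> closer to vertical   (no call request)
    # angle <  threshold -> closer to horizontal (call request)
    return angle_from_horizontal_deg < threshold_deg

print(is_call_requested_by_posture(70.0))  # False: device held upright against the ear
print(is_call_requested_by_posture(15.0))  # True: device held flat, screen viewable
```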
 Such a portable terminal device TAa can be configured by a portable communication terminal device such as a so-called tablet computer, a smartphone, or a mobile phone.
 The TA input unit 35 and the TA display unit 36 constituting the touch panel, together with the TA control unit 321a, correspond to one example of a call detection unit that detects a request for a call with the call unit of the sensor device, and the TA input unit 35 and the TA display unit 36 constituting the touch panel correspond to one example of a call request input unit that accepts the request for the call. The separation/contact sensor unit 38 and the separation/contact determination unit 325 correspond to another example of the call detection unit. The posture sensor unit 39 and the posture determination unit 326 correspond to still another example of the call detection unit.
 Next, the operation of the present embodiment will be described. In the monitored person monitoring system MS configured as described above, each of the devices SU, SV, SP, and TAa, when powered on, executes the necessary initialization of its units and starts operating. In the sensor device SU, by execution of its control processing program, the SU control unit 141, the behavior detection processing unit 142, the SU nurse call processing unit 143, and the SU streaming processing unit 144 are functionally configured in the SU control processing unit 14. In the portable terminal device TAa, by execution of its control processing program, the TA control unit 321a, the TA monitoring processing unit 322, the TA call processing unit 323, the TA streaming processing unit 324, the separation/contact determination unit 325, and the posture determination unit 326 are functionally configured in the TA control processing unit 32a.
 The monitored person monitoring system MS configured as described above generally monitors each monitored person Ob by the following operations.
 The sensor device SU detects a predetermined behavior of the monitored person Ob and determines whether or not a nurse call has been accepted by operating as follows for each frame, or every several frames. First, the sensor device SU acquires, by the SU control unit 141 of the SU control processing unit 14, an image (image data) of one frame from the imaging unit 11 as a target image, and detects, by the behavior detection processing unit 142, a predetermined behavior of the monitored person Ob on the basis of the acquired target image; when the predetermined behavior is detected, the sensor device SU transmits a first monitoring information communication signal to the management server device SV in order to report the detection result to the predetermined terminal devices SP and TAa. While operating in this way, the sensor device SU determines, by the SU nurse call processing unit 143, whether or not the nurse call acceptance operation unit 13 has accepted a nurse call; when a nurse call is accepted, the sensor device SU transmits a first nurse call notification communication signal to the management server device SV in order to report the acceptance of the nurse call to the predetermined terminal devices SP and TAa.
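A minimal sketch of this per-frame loop of the sensor device SU follows. The callbacks detect_behavior, nurse_call_pressed, and send_to_server are placeholders standing in for the behavior detection processing unit 142, the nurse call acceptance operation unit 13, and the communication path to the management server SV; they are assumptions made only for illustration.

```python
def sensor_frame_step(frame_image, detect_behavior, nurse_call_pressed, send_to_server,
                      sensor_id="SU-001"):
    """One iteration of the sensor device SU: behavior detection on the target image,
    then a nurse call acceptance check, each reported via the management server SV."""
    behavior = detect_behavior(frame_image)       # e.g. "getting up", or None
    if behavior is not None:
        send_to_server({"type": "first_monitoring_info",
                        "sensor_id": sensor_id,
                        "detection_result": behavior,
                        "target_image": frame_image})
    if nurse_call_pressed():
        send_to_server({"type": "first_nurse_call",
                        "sensor_id": sensor_id,
                        "nurse_call": True})

# toy usage with stub callbacks
sensor_frame_step(b"frame-bytes",
                  detect_behavior=lambda img: "getting up",
                  nurse_call_pressed=lambda: False,
                  send_to_server=print)
```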
 Upon receiving the first monitoring information communication signal from the sensor device SU via the network NW, the management server device SV stores (records) the sensor ID, the determination result, the target image, and the like contained in the first monitoring information communication signal as the monitoring information of the monitored person Ob monitored by the sensor device SU having the sensor ID. The management server device SV then identifies, from the notification destination correspondence relationship, the notified terminal devices SP and TAa corresponding to the notifying sensor device SU of the received first monitoring information communication signal, and transmits a second monitoring information communication signal to those notified terminal devices SP and TAa. On the other hand, upon receiving the first nurse call notification communication signal from the sensor device SU via the network NW, the management server device SV stores (records) the sensor ID, the nurse call acceptance information, and the like contained in the first nurse call notification communication signal as the monitoring information of the monitored person Ob monitored by the sensor device SU having the sensor ID. The management server device SV then identifies, from the notification destination correspondence relationship, the notified terminal devices SP and TAa corresponding to the notifying sensor device SU of the received first nurse call notification communication signal, and transmits a second nurse call notification communication signal to those notified terminal devices SP and TAa.
 Upon receiving the second monitoring information communication signal from the management server device SV via the network NW, the fixed terminal device SP and the portable terminal device TAa display the monitoring information concerning the monitoring of the monitored person Ob contained in the second monitoring information communication signal. The operation of displaying this monitoring information by the portable terminal device TAa will be described in detail below. Upon receiving the second nurse call notification communication signal from the management server device SV via the network NW, the fixed terminal device SP and the portable terminal device TAa display that a nurse call has been accepted from the monitored person Ob monitored by the sensor device SU having the sensor ID contained in the second nurse call notification communication signal. Through these operations, the monitored person monitoring system MS generally detects a predetermined behavior of each monitored person Ob and monitors each monitored person Ob by means of each sensor device SU, the management server device SV, the fixed terminal device SP, and the portable terminal device TAa.
 Next, the operation of displaying the monitoring information concerning the monitoring of the monitored person Ob in the monitored person monitoring system MS, and the operations related thereto, will be described. FIG. 5 is a flowchart showing the operation of the portable terminal device concerning the monitoring information in the monitored person monitoring system of the embodiment. FIG. 6 is a flowchart showing the operation of a first aspect concerning the control of the first and second electromechanical conversion units of the portable terminal device in the monitored person monitoring system of the embodiment. FIG. 7 is a diagram showing an example of a standby screen displayed on the portable terminal device in the monitored person monitoring system of the embodiment. FIG. 8 is a diagram showing an example of a monitoring information screen displayed on the portable terminal device in the monitored person monitoring system of the embodiment. FIG. 9 is a diagram for explaining call modes of the portable terminal device in the monitored person monitoring system of the embodiment. FIG. 9A is a diagram for explaining a usage mode of the portable terminal device TAa in a default mode in which the first electromechanical conversion unit 341 is controlled to operate as an earpiece and the second electromechanical conversion unit 342 is controlled to operate as a mouthpiece. FIG. 9B is a diagram for explaining a usage mode of the portable terminal device TAa in a speakerphone mode in which the operation of the first electromechanical conversion unit 341 is stopped and the second electromechanical conversion unit 342 is controlled to operate as both a mouthpiece and an earpiece.
 Next, the operation of the terminal devices SP and TAa will be described. Here, the operation of the portable terminal device TAa will be described as a representative. As described above, when the power is turned on and the device starts operating, the portable terminal device TAa accepts a login operation by a monitoring person (user) such as a nurse or a caregiver, and the TA monitoring processing unit 322 displays, on the TA display unit 36, a standby screen for waiting for a communication signal addressed to the device itself. As shown in FIG. 7 for example, this standby screen 51 includes a menu bar area 511 for displaying a menu bar, a standby main area 512 for displaying a message indicating that the device is on standby (for example, "No notifications") and an icon, a time area 513 for displaying the current time, a date area 514 for displaying today's date and day of the week, and a user name area 515 for displaying the name of the user currently logged in to the portable terminal device TAa. The menu bar area 511 is provided with an off-hook button 5111 for inputting an instruction to make an extension call with another portable terminal device TAa or an outside call with the external telephone TL.
 In FIG. 5, the portable terminal device TAa determines, by the TA control unit 321a of the TA control processing unit 32a, whether or not a communication signal has been received by the TA communication IF unit 31 (S11). As a result of this determination, when no communication signal has been received (No), the portable terminal device TAa returns the processing to S11, and when a communication signal has been received (Yes), the portable terminal device TAa executes the next process S12. That is, the portable terminal device TAa waits for reception of a communication signal.
 In process S12, the portable terminal device TAa determines, by the TA control unit 321a, the type of the received communication signal. As a result of this determination, when the received communication signal is a second monitoring information communication signal (second monitoring information), the portable terminal device TAa sequentially executes the following processes S13 and S14 and then executes process S17; when the received communication signal is a second nurse call notification communication signal (second NC notification), it sequentially executes the following processes S15 and S16 and then executes process S17; and when the received communication signal is neither a second monitoring information communication signal nor a second nurse call notification communication signal (other), it executes process S19, in which appropriate processing corresponding to the communication signal received in process S11 is performed, and then ends this processing.
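A minimal sketch of the branch in process S12 follows. The received signal is represented as a dict with an assumed "type" field, and the handler names mirror processes S13 to S19 only for illustration; none of these names come from the specification.

```python
def handle_received_signal(signal: dict):
    """Dispatch of process S12: branch on the type of the received communication signal."""
    kind = signal.get("type")
    if kind == "second_monitoring_info":
        store_monitoring_info(signal)    # process S13
        show_monitoring_screen(signal)   # process S14
        wait_for_input_operation()       # process S17 (and S18)
    elif kind == "second_nurse_call":
        store_monitoring_info(signal)    # process S15
        show_nurse_call_screen(signal)   # process S16
        wait_for_input_operation()       # process S17 (and S18)
    else:
        handle_other_signal(signal)      # process S19

# stub handlers so the sketch runs end to end
def store_monitoring_info(s): print("stored", s["type"])
def show_monitoring_screen(s): print("monitoring information screen 52")
def show_nurse_call_screen(s): print("nurse call acceptance screen")
def wait_for_input_operation(): print("waiting for touch input")
def handle_other_signal(s): print("other signal")

handle_received_signal({"type": "second_monitoring_info", "sensor_id": "SU-001"})
```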
 In process S13, the portable terminal device TAa stores (records), by the TA monitoring processing unit 322 of the TA control processing unit 32a, the monitoring information concerning the monitoring of the monitored person Ob contained in the second monitoring information communication signal received from the management server device SV in process S11 into the TA monitoring information storage unit 331.
 Following this process S13, the TA monitoring processing unit 322 displays, on the TA display unit 36, a screen corresponding to each piece of information contained in the second monitoring information communication signal received in process S11, for example the monitoring information screen 52 shown in FIG. 8 (S14).
 This monitoring information screen 52 is a screen for displaying the monitoring information concerning the monitoring of the monitored person Ob. As shown in FIG. 8 for example, the monitoring information screen 52 includes: the menu bar area 511; a monitored person name area 521 for displaying the installation location of the sensor device SU having the sensor ID contained in the second monitoring information communication signal received in process S11 and the name of the monitored person Ob monitored by the sensor device SU having the sensor ID; a detection information display area 522 for displaying the elapsed time since the reception time of the second monitoring information communication signal received in process S11 (or since the detection time of the predetermined behavior) and the detection result contained in the second monitoring information communication signal received in process S11; an image area 523 for displaying the image contained in the second monitoring information communication signal received in process S11 (that is, the target image captured by the sensor device SU having the sensor ID) (here, a still image); a "Respond" button 524; a "Speak" button 525; and a "View LIVE" button 526.
 In order to display the installation location of the sensor device SU and the name of the monitored person Ob in the monitored person name area 521, the TA storage unit 33 stores in advance the sensor ID, the installation location of the sensor device SU having the sensor ID, and the name of the monitored person Ob monitored by the sensor device SU having the sensor ID in association with one another. In the detection information display area 522, the detection result contained in the second monitoring information communication signal received in process S11 (in the present embodiment, the respective names of getting up, leaving the bed, falling from the bed, and falling down) may be displayed as it is; in the present embodiment, however, it is displayed as an icon symbolically representing the detection result. In order to display this icon, the TA storage unit 33 stores in advance each behavior and an icon symbolically representing the behavior in association with each other. In the example shown in FIG. 8, a getting-up icon symbolically representing getting up is displayed in the detection information display area 522. On the monitoring information screen 52, the "Respond" button 524 is a button for inputting, to the portable terminal device TAa, intention-to-respond information indicating that the user of the portable terminal device TAa intends to carry out a predetermined response (attending, handling) such as lifesaving, nursing, caregiving, or assistance for the detection result displayed on the monitoring information screen 52. The "Speak" button 525 is a button for requesting a voice call, that is, a button for inputting an instruction to connect the sensor device SU of the sensor ID and the portable terminal device TAa via the network NW so that they can talk with each other. The "View LIVE" button 526 is a button for requesting a live moving image, that is, a button for inputting an instruction to display the moving image captured by the sensor device SU of the sensor ID.
 Returning to FIG. 5, in process S15, on the other hand, the portable terminal device TAa stores (records), by the TA monitoring processing unit 322 of the TA control processing unit 32a, the monitoring information concerning the monitoring of the monitored person Ob contained in the second nurse call notification communication signal received from the management server device SV in process S11 into the TA monitoring information storage unit 331.
 Following this process S15, the TA monitoring processing unit 322 displays, on the TA display unit 36, a nurse call acceptance screen (not shown) indicating that a nurse call has been accepted, which is stored in advance in the TA storage unit 33, in accordance with the nurse call acceptance information contained in the second nurse call notification communication signal received in process S11 (S16).
 In process S17, which is executed after each of processes S14 and S16, the portable terminal device TAa determines, by the TA control processing unit 32a, whether or not an input operation has been accepted on the touch panel constituted by the TA input unit 35 and the TA display unit 36. As a result of this determination, when no input operation has been accepted (No), the portable terminal device TAa returns the processing to process S17, and when an input operation has been accepted, the portable terminal device TAa executes the next process S18.
 In this process S18, the portable terminal device TAa executes, by the TA control processing unit 32a, appropriate processing corresponding to the content of the input operation, and ends this processing.
 For example, when the portable terminal device TAa accepts, by the TA control processing unit 32a, an input operation of the "Respond" button 524 (that is, when it accepts the intention to respond), it stores in the TA monitoring information storage unit 331 the monitoring information of the monitored person Ob currently displayed on the TA display unit 36 with an indication that "Respond" has been accepted, and transmits to the management server device SV a communication signal (response acceptance notification communication signal) containing the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 and information indicating that "Respond" has been accepted (response acceptance information). The management server device SV that has received this response acceptance notification communication signal transmits, by broadcast, a communication signal (response acceptance announcement communication signal) containing the sensor ID and the response acceptance information contained in the received response acceptance notification communication signal to the terminal devices SP and TAa. As a result, the fact that "Respond" has been accepted for the sensor ID corresponding to the monitoring information of the monitored person Ob displayed on the TA display unit 36 is synchronized among the terminal devices SP and TAa.
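A minimal sketch of this "Respond" synchronization flow follows, with the server broadcast reduced to a loop over a list of connected terminals. All class, function, and ID names here are illustrative assumptions, not elements defined in the specification.

```python
class Terminal:
    """Very small stand-in for a terminal device SP/TAa."""
    def __init__(self, terminal_id):
        self.terminal_id = terminal_id
        self.responded_sensors = set()
    def on_response_announcement(self, sensor_id):
        # mark that some terminal has already accepted responsibility for this report
        self.responded_sensors.add(sensor_id)

def accept_respond_button(sensor_id, server_broadcast):
    """The pressing terminal sends a response acceptance notification; the server
    then announces it to every terminal so their screens stay in sync."""
    server_broadcast({"sensor_id": sensor_id, "response_accepted": True})

terminals = [Terminal("TA-01"), Terminal("TA-02"), Terminal("SP-01")]

def broadcast(msg):
    for t in terminals:
        t.on_response_announcement(msg["sensor_id"])

accept_respond_button("SU-001", broadcast)
print([sorted(t.responded_sensors) for t in terminals])  # every terminal now shows "responded"
```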
 Also, for example, when the portable terminal device TAa accepts, by the TA control processing unit 32a, an input operation of the "Speak" button 525, the TA call processing unit 323 transmits a communication signal (call request communication signal) containing information such as a request for a voice call to the sensor device SU monitoring the monitored person Ob displayed on the TA display unit 36, and connects, via the network NW, to the sensor device SU that responds to this request so that a voice call is possible. As a result, a voice call becomes possible between the portable terminal device TAa and the sensor device SU. The control of the first and second electromechanical conversion units 341 and 342 executed when this request for a call is accepted will be described later. When the portable terminal device TAa accepts, by the TA control processing unit 32a, an input operation of an "End" button (not shown), which is a button for inputting an instruction to end the voice call, the TA call processing unit 323 transmits a communication signal (call end communication signal) containing information such as a request to end the voice call to the sensor device SU monitoring the monitored person Ob displayed on the TA display unit 36. As a result, the voice call between the portable terminal device TAa and the sensor device SU is ended.
 Also, for example, when the portable terminal device TAa accepts, by the TA control processing unit 32a, an input operation of the "View LIVE" button 526, the TA streaming processing unit 324 transmits a communication signal (moving image delivery request communication signal) containing information such as a request for delivery of a live moving image to the sensor device SU monitoring the monitored person Ob currently displayed on the TA display unit 36, connects, via the network NW, to the sensor device SU that responds to this request so that the moving image can be downloaded, receives delivery of the live moving image from the sensor device SU, and displays the delivered moving image on the TA display unit 36 by streaming reproduction. On the monitoring information screen 52 displaying this live moving image, the moving image is displayed in the image area 523 in place of the still image, and an "End LIVE" button (not shown) is displayed in place of the "View LIVE" button 526. As a result, the live moving image is displayed on the portable terminal device TAa. The "End LIVE" button (not shown) is a button for requesting the end of the moving image, that is, a button for inputting an instruction to end (stop) the delivery of the moving image captured by the sensor device SU of the sensor ID and to end (stop) its display. When the portable terminal device TAa accepts, by the TA control processing unit 32a, an input operation of the "End LIVE" button, the TA streaming processing unit 324 transmits a communication signal (moving image delivery end communication signal) containing information such as a request to end the moving image delivery to the sensor device SU monitoring the monitored person Ob currently displayed on the TA display unit 36, and displays a still image on the TA display unit 36. As a result, the portable terminal device TAa ends the display of the live moving image.
 センサ装置SUから管理サーバ装置SVを介して報知を受けた検知結果やナースコール受付の各報知(各再報知)に関し、携帯端末装置TAaは、以上のように動作している。 The mobile terminal device TAa operates as described above with respect to the detection results received from the sensor device SU via the management server device SV and the notifications (re-notification of each nurse call reception).
 前記検知結果や前記ナースコール受付の各報知(各再報知)に関し、上述のように動作している間に、第1および第2電気機械変換部341、342の制御に関し、携帯端末装置TAaは、次のように動作している。 While operating as described above with respect to the notifications (re-notifications) of the detection results and the nurse call acceptances, the mobile terminal device TAa operates as follows with respect to the control of the first and second electromechanical conversion units 341 and 342.
 図6において、携帯端末装置TAaは、TA制御処理部32aによって、センサ装置SUの撮像部11で撮像した画像をTA表示部36に表示している場合に、さらに、前記センサ装置SUとの間における通話の要求を検出したか否かを判定する(S21)。この判定では、本実施形態では、TA制御処理部32aは、TA制御部321aによって「話す」ボタン525の入力操作を受け付けたか否かを判定する第1判定処理、離接判定部325によって離接センサ部38の第1センサ出力に基づいて人の離間であるか否かを判定する第2判定処理、および、姿勢判定部326によって姿勢センサ部39の第2センサ出力に基づいて当該携帯端末装置TAaの姿勢が、前記線分LNの延長方向が垂直方向よりも水平方向に近い姿勢であるか否かを判定する第3判定処理それぞれを実行する。これら第1ないし第3判定処理それぞれの結果、TA制御処理部32aは、少なくともいずれか1つの判定結果が肯定であった場合、前記通話の要求が有ったと判定し(Yes)、次の処理S22を実行し、一方、全て判定結果が否定であった場合、前記通話の要求が無かったと判定し(No)、次の処理S25を実行する。すなわち、前記第1判定処理の結果、「話す」ボタン525の入力操作を受け付けていた場合、肯定的な判定結果として、TA制御処理部32aは、次の処理S22を実行する。前記第2判定処理の結果、人の離間である場合、肯定的な判定結果として、TA制御処理部32aは、次の処理S22を実行する。前記第3判定処理の結果、当該携帯端末装置TAaの姿勢が、前記線分LNの延長方向が垂直方向よりも水平方向に近い姿勢である場合、肯定的な判定結果として、TA制御処理部32aは、次の処理S22を実行する。一方、「話す」ボタン525の入力操作を受け付けてなく、かつ、人の離間ではなく、かつ、当該携帯端末装置TAaの姿勢が、前記線分LNの延長方向が垂直方向よりも水平方向に近くない姿勢である場合、否定的な判定結果として、TA制御処理部32aは、次の処理S25を実行する。 In FIG. 6, when the mobile terminal device TAa is displaying, by the TA control processing unit 32a, an image captured by the imaging unit 11 of the sensor device SU on the TA display unit 36, it further determines whether a request for a call with the sensor device SU has been detected (S21). In this determination, in the present embodiment, the TA control processing unit 32a executes a first determination process in which the TA control unit 321a determines whether an input operation of the "speak" button 525 has been accepted, a second determination process in which the separation/contact determination unit 325 determines, based on the first sensor output of the separation/contact sensor unit 38, whether the person has moved away, and a third determination process in which the posture determination unit 326 determines, based on the second sensor output of the posture sensor unit 39, whether the posture of the mobile terminal device TAa is a posture in which the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction. If at least one of the results of the first to third determination processes is affirmative, the TA control processing unit 32a determines that the call has been requested (Yes) and executes the next process S22; if all of the results are negative, it determines that the call has not been requested (No) and executes the next process S25. That is, when the input operation of the "speak" button 525 has been accepted as a result of the first determination process, the TA control processing unit 32a takes this as an affirmative determination and executes the next process S22. When the result of the second determination process indicates that the person has moved away, the TA control processing unit 32a takes this as an affirmative determination and executes the next process S22. When the result of the third determination process indicates that the posture of the mobile terminal device TAa is a posture in which the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction, the TA control processing unit 32a takes this as an affirmative determination and executes the next process S22. On the other hand, when the input operation of the "speak" button 525 has not been accepted, the person has not moved away, and the posture of the mobile terminal device TAa is not a posture in which the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction, the TA control processing unit 32a takes this as a negative determination and executes the next process S25.
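 As a non-authoritative sketch of the determination logic of S21, the following Python fragment combines the three determination processes with a logical OR. The proximity threshold, the use of a gravity-vector reading for the posture test (the specification names a gyro sensor for the posture sensor unit 39), and all function names are assumptions introduced for illustration only.

    import math

    def first_determination(speak_button_pressed: bool) -> bool:
        # S21, first determination: the "speak" button 525 was operated.
        return speak_button_pressed

    def second_determination(proximity_reading: float, threshold: float = 0.5) -> bool:
        # S21, second determination: the separation/contact sensor output
        # indicates the user has moved the device away from the face.
        # (Reading and threshold are hypothetical, normalized values.)
        return proximity_reading > threshold

    def third_determination(gravity_along_ln: float, g: float = 9.81) -> bool:
        # S21, third determination: the line segment LN (earpiece-to-mouthpiece
        # axis) is closer to horizontal than to vertical, i.e. the angle between
        # LN and the vertical exceeds 45 degrees, so the gravity component
        # along LN is smaller than g*cos(45 deg).
        return abs(gravity_along_ln) < g * math.cos(math.radians(45.0))

    def call_request_detected(pressed: bool, proximity: float, g_ln: float) -> bool:
        # Affirmative if at least one of the three determinations holds (S21: Yes).
        return (first_determination(pressed)
                or second_determination(proximity)
                or third_determination(g_ln))

    print(call_request_detected(False, 0.8, 9.7))  # True: the user moved away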
 処理S22では、TA制御処理部32aは、TA制御部321aによって第1電気機械変換部341の動作を停止するように制御する。 In process S22, the TA control processing unit 32a controls the TA control unit 321a to stop the operation of the first electromechanical conversion unit 341.
 この処理S22の次に、TA制御処理部32aは、TA制御部321aによって第2電気機械変換部342を送話口および受話口として動作するように制御する(S23)。 Following process S22, the TA control processing unit 32a controls, by the TA control unit 321a, the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece (S23).
 これら処理S22および処理S23の各処理によって、第1および第2電気機械変換部341、342は、いわゆるスピーカフォンのモードで動作する。 The first and second electromechanical converters 341 and 342 operate in a so-called speakerphone mode through the processes S22 and S23.
 この処理S23の次に、TA制御処理部32aは、TA通話処理部323によって通話の終了であるか否かを判定する(S24)。より具体的には、本実施形態では、TA制御処理部32aは、TA通話処理部323によって前記「終了」ボタンの入力操作を受け付けたか否かを判定する。この判定の結果、前記「終了」ボタンの入力操作を受け付けている場合には、TA制御処理部32aは、通話の終了であると判定し(Yes)、次に処理S25を実行する。一方、前記判定の結果、前記「終了」ボタンの入力操作を受け付けていない場合には、TA制御処理部32aは、通話の終了ではないと判定し(No)、処理を処理S24に戻す。すなわち、通話が終了するまで、処理S24が繰り返される。なお、上述では、TA制御処理部32aは、通話の終了であるか否かを判定したが、第2電気機械変換部342の前記スピーカフォンとしての使用の終了であるか否かを判定しても良い。この場合では、前記スピーカフォンとしての使用が終了するまで、処理S24が繰り返され、前記スピーカフォンとしての使用が終了すると、処理S25が実行される。 After this process S23, the TA control processing unit 32a determines whether or not the call is ended by the TA call processing unit 323 (S24). More specifically, in the present embodiment, the TA control processing unit 32 a determines whether or not the TA call processing unit 323 has accepted an input operation of the “end” button. If the result of this determination is that an input operation for the “end” button has been accepted, the TA control processing section 32a determines that the call has ended (Yes), and then executes step S25. On the other hand, if the input operation of the “end” button is not accepted as a result of the determination, the TA control processing unit 32a determines that the call is not ended (No), and returns the process to step S24. That is, the process S24 is repeated until the call ends. In the above description, the TA control processing unit 32a determines whether or not the call is ended, but determines whether or not the use of the second electromechanical conversion unit 342 as the speakerphone is ended. Also good. In this case, the process S24 is repeated until the use as the speakerphone is finished, and when the use as the speakerphone is finished, the process S25 is executed.
 処理S25では、TA制御処理部32aは、TA制御部321aによって第1電気機械変換部341を受話口として動作するように制御する。 In process S25, the TA control processing unit 32a controls the TA control unit 321a to operate using the first electromechanical conversion unit 341 as an earpiece.
 この処理S25の次に、TA制御処理部32aは、TA制御部321aによって第2電気機械変換部342を送話口として動作するように制御する(S26)。 Following process S25, the TA control processing unit 32a controls, by the TA control unit 321a, the second electromechanical conversion unit 342 to operate as the mouthpiece (S26).
 これら処理S25および処理S26の各処理によって、第1および第2電気機械変換部341、342は、デフォルトのモードで動作する。 The first and second electromechanical converters 341 and 342 operate in the default mode by the processes S25 and S26.
 この処理S26の次に、TA制御処理部32aは、当該携帯端末装置TAaの稼働の終了であるか否かを判定する(S27)。この判定の結果、例えば電源スイッチのオフ操作等によって稼働の終了である場合(Yes)には、TA制御処理部32aは、本処理を終了する。一方、前記判定の結果、稼働の終了ではない場合(No)には、TA制御処理部32aは、処理を処理S21に戻す。 Next to this processing S26, the TA control processing unit 32a determines whether or not the operation of the mobile terminal device TAa is finished (S27). As a result of this determination, for example, when the operation is ended due to an operation of turning off the power switch or the like (Yes), the TA control processing unit 32a ends this processing. On the other hand, if the result of the determination is that the operation has not ended (No), the TA control processing unit 32a returns the process to step S21.
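 The following Python sketch, offered only as an informal illustration, restates the loop of processes S21 to S27 described above. The event representation and the converter-control stubs are assumptions; they stand in for the TA control unit 321a and the first and second electromechanical conversion units 341 and 342 and are not part of the disclosed implementation.

    def stop_first_converter():
        print("first converter 341: stopped")           # S22

    def set_first_converter(mode: str):
        print("first converter 341:", mode)             # S25

    def set_second_converter(mode: str):
        print("second converter 342:", mode)            # S23 / S26

    def run_control_loop(event_source):
        # event_source yields dicts such as {"call_request": True},
        # {"call_ended": True} or {"shutdown": True}.
        while True:
            ev = next(event_source, {"shutdown": True})
            if ev.get("call_request"):                   # S21: Yes
                stop_first_converter()                   # S22
                set_second_converter("speakerphone")     # S23: tx and rx
                while not ev.get("call_ended"):          # S24: wait for "end"
                    ev = next(event_source, {"call_ended": True})
            set_first_converter("earpiece")              # S25: default rx
            set_second_converter("mouthpiece")           # S26: default tx
            if ev.get("shutdown"):                       # S27: Yes -> stop
                break

    run_control_loop(iter([{"call_request": True}, {"call_ended": True},
                           {"shutdown": True}]))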
 以上説明したように、本実施形態における被監視者監視システムMS、端末装置SP、TAaおよびこれに実装された動作制御方法は、センサ装置SUの撮像部11で撮像した画像をTA表示部36に表示している場合に、さらに、通話の要求を検出した場合、第2電気機械変換部342を送話口および受話口として動作するように制御するTA制御部321aを備えるので、例えば図9Aに示すように、通話のために、端末装置SP、TAの第1電気機械変換部341を耳元に近づけ、その第2電気機械変換部342を口元に近づけることを、例えば図9Bに示すように、行わなくても、第2電気機械変換部342で送話でき、受話できる。したがって、上記被監視者監視システムMS、上記端末装置SP、TAaおよび上記動作制御方法は、画像を見ながら被監視者Obと通話できる。 As described above, the monitored person monitoring system MS, the terminal devices SP and TAa, and the operation control method implemented therein according to the present embodiment include the TA control unit 321a that, when an image captured by the imaging unit 11 of the sensor device SU is displayed on the TA display unit 36 and, further, a request for a call is detected, controls the second electromechanical conversion unit 342 to operate as the mouthpiece and the earpiece. Therefore, even without bringing the first electromechanical conversion unit 341 of the terminal device SP, TAa close to the ear and its second electromechanical conversion unit 342 close to the mouth for the call as shown, for example, in FIG. 9A, the user can, as shown, for example, in FIG. 9B, transmit and receive speech through the second electromechanical conversion unit 342. Accordingly, the monitored person monitoring system MS, the terminal devices SP and TAa, and the operation control method allow a call with the monitored person Ob while viewing the image.
 上記被監視者監視システムMS、上記端末装置SP、TAaおよび上記動作制御方法は、「話す」ボタン525を備えるので、タッチパネルを構成するTA入力部35で監視者(ユーザ)による通話の要求を受け付けることができ、監視者における通話の要求を直接的に検出できる。 Since the monitored person monitoring system MS, the terminal device SP, TAa, and the operation control method include the “speak” button 525, the TA input unit 35 constituting the touch panel accepts a call request from the supervisor (user). It is possible to detect the call request in the supervisor directly.
 上記被監視者監視システムMS、上記端末装置SP、TAaおよび上記動作制御方法は、離接センサ部38および離接判定部325を備えるので、監視者における通話の要求を離接センサ部38および離接判定部325によって自動的に検出できる。 Since the above monitored person monitoring system MS, terminal devices SP and TAa, and operation control method include the separation/contact sensor unit 38 and the separation/contact determination unit 325, a request for a call by the supervisor can be detected automatically by the separation/contact sensor unit 38 and the separation/contact determination unit 325.
 上記被監視者監視システムMS、上記端末装置SP、TAaおよび上記動作制御方法は、姿勢センサ部39および姿勢判定部326を備えるので、監視者における通話の要求を姿勢センサ部39および姿勢判定部326によって自動的に検出できる。 Since the above monitored person monitoring system MS, terminal devices SP and TAa, and operation control method include the posture sensor unit 39 and the posture determination unit 326, a request for a call by the supervisor can be detected automatically by the posture sensor unit 39 and the posture determination unit 326.
 なお、上述の実施形態では、第1ないし第3判定処理それぞれの結果、少なくともいずれか1つの判定結果が肯定であった場合に、第1電気機械変換部341は、その動作を停止するように制御され、第2電気機械変換部342は、送話口および受話口として動作するように制御されたが、第1ないし第3判定処理それぞれの結果、第1判定処理の判定結果が肯定であって、さらに、第2および第3判定処理それぞれの各判定結果のうちの少なくとも一方の判定結果が肯定であった場合に、第1電気機械変換部341は、その動作を停止するように制御され、第2電気機械変換部342は、送話口および受話口として動作するように制御されても良い。 In the above-described embodiment, when at least any one determination result is affirmative as a result of each of the first to third determination processes, the first electromechanical conversion unit 341 stops its operation. The second electromechanical conversion unit 342 is controlled to operate as a mouthpiece and a mouthpiece. However, as a result of each of the first to third determination processes, the determination result of the first determination process is positive. In addition, when at least one of the determination results of the second and third determination processes is affirmative, the first electromechanical conversion unit 341 is controlled to stop its operation. The second electromechanical conversion unit 342 may be controlled to operate as a mouthpiece and a mouthpiece.
 このような変形形態における携帯端末装置TAbは、図4に示すように、TA通信IF部31と、TA制御処理部32bと、TA記憶部33と、TA音入出力部34と、TA入力部35と、TA表示部36と、TAIF部37と、離接センサ部38と、姿勢センサ部39とを備える。これら変形形態の携帯端末装置TAbにおけるTA通信IF部31、TA記憶部33、TA音入出力部34、TA入力部35、TA表示部36、TAIF部37、離接センサ部38および姿勢センサ部39は、それぞれ、上述の携帯端末装置TAaにおけるTA通信IF部31、TA記憶部33、TA音入出力部34、TA入力部35、TA表示部36、TAIF部37、離接センサ部38および姿勢センサ部39と同様であるので、その説明を省略する。 As shown in FIG. 4, the mobile terminal device TAb in such a modified form includes a TA communication IF unit 31, a TA control processing unit 32b, a TA storage unit 33, a TA sound input / output unit 34, and a TA input unit. 35, a TA display unit 36, a TAIF unit 37, a separation sensor unit 38, and an attitude sensor unit 39. The TA communication IF unit 31, the TA storage unit 33, the TA sound input / output unit 34, the TA input unit 35, the TA display unit 36, the TAIF unit 37, the separation sensor unit 38, and the attitude sensor unit in the mobile terminal device TAb of these modified forms. 39 is a TA communication IF unit 31, a TA storage unit 33, a TA sound input / output unit 34, a TA input unit 35, a TA display unit 36, a TAIF unit 37, a disconnection sensor unit 38, and a mobile terminal device TAa, respectively. Since it is the same as the attitude sensor unit 39, the description thereof is omitted.
 TA制御処理部32bは、TA制御処理部32aと同様に、携帯端末装置TAbの各部を当該各部の機能に応じてそれぞれ制御し、被監視者Obに対する前記監視情報を受け付けて表示し、ナースコールの応答や声かけするための回路である。この変形形態のTA制御処理部32bは、制御処理プログラムが実行されることによって、TA制御部321b、TA監視処理部322、TA通話処理部323、TAストリーミング処理部324、離接判定部325および姿勢判定部326を機能的に備える。これら変形形態のTA制御処理部32bにおけるTA監視処理部322、TA通話処理部323、TAストリーミング処理部324、離接判定部325および姿勢判定部326は、それぞれ、上述のTA制御処理部32aにおけるTA監視処理部322、TA通話処理部323、TAストリーミング処理部324、離接判定部325および姿勢判定部326と同様であるので、その説明を省略する。 Similar to the TA control processing unit 32a, the TA control processing unit 32b controls each unit of the mobile terminal device TAb according to the function of each unit, accepts and displays the monitoring information for the monitored person Ob, and displays a nurse call. It is a circuit for answering and calling out. The TA control processing unit 32b of this modified form is configured such that a TA control unit 321b, a TA monitoring processing unit 322, a TA call processing unit 323, a TA streaming processing unit 324, a connection / disconnection determination unit 325, and a control processing program are executed. A posture determination unit 326 is functionally provided. The TA monitoring processing unit 322, the TA call processing unit 323, the TA streaming processing unit 324, the disconnection determination unit 325, and the posture determination unit 326 in the TA control processing unit 32b of these modified forms are respectively in the above-described TA control processing unit 32a. Since it is similar to the TA monitoring processing unit 322, the TA call processing unit 323, the TA streaming processing unit 324, the disconnection determination unit 325, and the posture determination unit 326, description thereof is omitted.
 TA制御部321bは、携帯端末装置TAbの各部を当該各部の機能に応じてそれぞれ制御し、携帯端末装置TAbの全体制御を司るものである。この変形形態では、TA制御部321bは、第1ないし第3判定処理それぞれの結果、第1判定処理の判定結果が肯定であって、さらに、第2および第3判定処理それぞれの各判定結果のうちの少なくとも一方の判定結果が肯定であった場合に、第1電気機械変換部341の動作を停止し、第2電気機械変換部342を送話口および受話口として動作するように制御する。すなわち、TA制御部321bは、タッチパネルを構成するTA入力部35で「話す」ボタン525の入力操作を受け付け、さらに、離接判定部325および姿勢判定部326のうちの少なくとも一方によって前記通話の要求を判定した場合を、最終的な前記通話の要求を検出した場合とし、第1電気機械変換部341の動作を停止し、第2電気機械変換部342を送話口および受話口として動作するように制御する。 The TA control unit 321b controls each part of the mobile terminal device TAb according to the function of each part, and controls the entire mobile terminal device TAb. In this modification, the TA control unit 321b determines that the determination result of the first determination process is positive as a result of each of the first to third determination processes, and each determination result of each of the second and third determination processes is further positive. When at least one of the determination results is affirmative, the operation of the first electromechanical conversion unit 341 is stopped, and the second electromechanical conversion unit 342 is controlled to operate as the mouthpiece and the earpiece. That is, the TA control unit 321b accepts an input operation of the “speak” button 525 at the TA input unit 35 constituting the touch panel, and further, the call request is made by at least one of the separation determination unit 325 and the posture determination unit 326. Is determined to be when the final call request is detected, the operation of the first electromechanical conversion unit 341 is stopped, and the second electromechanical conversion unit 342 is operated as the mouthpiece and the earpiece. To control.
 このような変形形態の携帯端末装置TAbにおける第1および第2電気機械変換部341、342の制御について説明する。図10は、実施形態の被監視者監視システムにおける携帯端末装置の第1および第2電気機械変換部の制御に関する第2態様の動作を示すフローチャートである。 The control of the first and second electromechanical conversion units 341 and 342 in the mobile terminal device TAb having such a modification will be described. FIG. 10 is a flowchart illustrating the operation of the second aspect regarding the control of the first and second electromechanical conversion units of the portable terminal device in the monitored person monitoring system of the embodiment.
 図10において、携帯端末装置TAbは、TA制御処理部32bのTA制御部321bによって、センサ装置SUの撮像部11で撮像した画像をTA表示部36に表示している場合に、さらに、第1判定処理を実行し、その結果、タッチパネルを構成するTA入力部35で「話す」ボタン525の入力操作を受け付けていると(S31)、TA制御処理部32bの離接判定部325によって、離接センサ部38の第1センサ出力を取得する(S32)。 In FIG. 10, when the mobile terminal device TAb is displaying, by the TA control unit 321b of the TA control processing unit 32b, an image captured by the imaging unit 11 of the sensor device SU on the TA display unit 36, it further executes the first determination process; as a result, when the input operation of the "speak" button 525 has been accepted by the TA input unit 35 constituting the touch panel (S31), the separation/contact determination unit 325 of the TA control processing unit 32b acquires the first sensor output of the separation/contact sensor unit 38 (S32).
 次に、携帯端末装置TAbは、離接判定部325によって、第2判定処理として、処理S32で取得した離接センサ部38の第1センサ出力に基づいて人の離間か否かを判定する(S33)。この判定の結果、人の離間である場合(Yes)には、携帯端末装置TAbは、TA制御処理部32bによって処理S36を実行する。一方、前記判定の結果、人の離間ではない場合(No)には、携帯端末装置TAbは、TA制御処理部32bによって処理S34を実行する。なお、人の離間か否かの判定に代え、携帯端末装置TAbは、離接判定部325によって、第2判定処理として、人の近接か否かを判定しても良い。この場合では、その判定の結果、人の近接である場合(Yes)には、携帯端末装置TAbは、TA制御処理部32bによって処理S34を実行し、人の近接ではない場合(No)には、携帯端末装置TAbは、TA制御処理部32bによって処理S36を実行する。 Next, the portable terminal device TAb determines whether or not the person is separated based on the first sensor output of the separation / contact sensor unit 38 acquired in step S32 as the second determination process by the separation / contact determination unit 325 ( S33). If the result of this determination is that the person is separated (Yes), the portable terminal device TAb executes the process S36 by the TA control processing unit 32b. On the other hand, when the result of the determination is that the person is not separated (No), the portable terminal device TAb executes the process S34 by the TA control processing unit 32b. Instead of determining whether or not the person is separated, the portable terminal device TAb may determine whether or not the person is close as the second determination process by the separation / contact determination unit 325. In this case, if the result of the determination is that the person is close (Yes), the portable terminal device TAb executes the process S34 by the TA control processing unit 32b, and if the person is not close (No). The portable terminal device TAb executes the process S36 by the TA control processing unit 32b.
 処理S34では、携帯端末装置TAbは、TA制御処理部32bの姿勢判定部326によって、姿勢センサ部39の第2センサ出力を取得する。 In process S34, the portable terminal device TAb acquires the second sensor output of the attitude sensor unit 39 by the attitude determination unit 326 of the TA control processing unit 32b.
 この処理S34の次に、携帯端末装置TAbは、姿勢判定部326によって、第3判定処理として、処理S34で取得した姿勢センサ部39の第2センサ出力に基づいて、当該携帯端末装置TAbの姿勢が、第1電気機械変換部341の第1配設位置と第2電気機械変換部342の第2配設位置とを結ぶ前記線分LNの延長方向が垂直方向よりも水平方向に近い姿勢であるか否かを判定する(S35)。この判定の結果、当該携帯端末装置TAbの姿勢が、前記線分LNの延長方向が垂直方向よりも水平方向に近い姿勢である場合(Yes)には、携帯端末装置TAbは、TA制御処理部32bによって処理S36を実行する。一方、前記判定の結果、当該携帯端末装置TAbの姿勢が、前記線分LNの延長方向が垂直方向よりも水平方向に近い姿勢ではない場合(No)には、携帯端末装置TAbは、TA制御処理部32bによって処理S39を実行する。 Following process S34, the mobile terminal device TAb determines, by the posture determination unit 326 as the third determination process and based on the second sensor output of the posture sensor unit 39 acquired in process S34, whether the posture of the mobile terminal device TAb is a posture in which the extension direction of the line segment LN connecting the first arrangement position of the first electromechanical conversion unit 341 and the second arrangement position of the second electromechanical conversion unit 342 is closer to the horizontal direction than to the vertical direction (S35). If, as a result of this determination, the posture of the mobile terminal device TAb is a posture in which the extension direction of the line segment LN is closer to the horizontal direction than to the vertical direction (Yes), the mobile terminal device TAb executes process S36 by the TA control processing unit 32b. On the other hand, if the posture of the mobile terminal device TAb is not such a posture (No), the mobile terminal device TAb executes process S39 by the TA control processing unit 32b.
 処理S36では、TA制御処理部32bは、処理S22と同様に、TA制御部321bによって第1電気機械変換部341の動作を停止するように制御する。 In the process S36, the TA control processing unit 32b controls the TA control unit 321b to stop the operation of the first electromechanical conversion unit 341, similarly to the process S22.
 この処理S36の次に、TA制御処理部32bは、処理S23と同様に、TA制御部321bによって第2電気機械変換部342を送話口および受話口として動作するように制御する(S37)。 Following process S36, the TA control processing unit 32b, similarly to process S23, controls, by the TA control unit 321b, the second electromechanical conversion unit 342 to operate as both the mouthpiece and the earpiece (S37).
 これら処理S36および処理S37の各処理によって、第1および第2電気機械変換部341、342は、いわゆるスピーカフォンのモードで動作する。 The first and second electromechanical converters 341 and 342 operate in a so-called speakerphone mode by the processes S36 and S37.
 この処理S37の次に、TA制御処理部32bは、処理S24と同様に、TA通話処理部323によって通話の終了であるか否かを判定する(S38)。この判定の結果、通話の終了である場合(Yes)には、TA制御処理部32bは、次に処理S39を実行する。一方、前記判定の結果、通話の終了ではない場合(No)には、TA制御処理部32bは、処理を処理S38に戻す。すなわち、通話が終了するまで、処理S38が繰り返される。 After this process S37, the TA control processing unit 32b determines whether the call is ended by the TA call processing unit 323, similarly to the process S24 (S38). If the result of this determination is that the call has ended (Yes), the TA control processing section 32b next executes processing S39. On the other hand, if the result of the determination is that the call has not ended (No), the TA control processing unit 32b returns the process to step S38. That is, the process S38 is repeated until the call ends.
 処理S39では、TA制御処理部32bは、処理S25と同様に、TA制御部321bによって第1電気機械変換部341を受話口として動作するように制御する。 In the process S39, the TA control processing unit 32b controls the TA control unit 321b to operate the first electromechanical conversion unit 341 as the earpiece, similarly to the process S25.
 この処理S39の次に、TA制御処理部32bは、処理S26と同様に、TA制御部321bによって第2電気機械変換部342を送話口として動作するように制御する(S40)。 Following process S39, the TA control processing unit 32b, similarly to process S26, controls, by the TA control unit 321b, the second electromechanical conversion unit 342 to operate as the mouthpiece (S40).
 これら処理S39および処理S40の各処理によって、第1および第2電気機械変換部341、342は、デフォルトのモードで動作する。 The first and second electromechanical converters 341 and 342 operate in the default mode by the processes S39 and S40.
 なお、上述において、処理S32および処理S33が省略され、処理S31の次に処理S34が実行され、その他の処理は、上述と同様に実行されても良い。また、上述において、処理S34および処理S35が省略され、処理S33で、人の離間ではない場合(No)には、処理S39が実行され、その他の処理は、上述と同様に実行されても良い。また、上述において、処理S33で、人の離間である場合(Yes)には、処理S34が実行され、人の離間ではない場合(No)には、処理S39が実行され、その他の処理は、上述と同様に実行されても良い。 In the above, processes S32 and S33 may be omitted so that process S34 is executed next after process S31, with the other processes executed as described above. Alternatively, processes S34 and S35 may be omitted so that, when the person has not moved away in process S33 (No), process S39 is executed, with the other processes executed as described above. Alternatively, when the person has moved away in process S33 (Yes), process S34 may be executed, and when the person has not moved away (No), process S39 may be executed, with the other processes executed as described above.
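 For illustration only, the gating logic of FIG. 10 (processes S31 to S35) can be sketched as the following Python function, in which the sensors are read only when needed; the callable parameters are hypothetical stand-ins for the separation/contact determination (S32/S33) and the posture determination (S34/S35) and do not reflect an implementation described in the specification.

    def variant_call_request_detected(speak_pressed, person_separated, ln_near_horizontal):
        # speak_pressed: bool result of S31 ("speak" button 525 accepted).
        # person_separated(): callable, True when S33 is affirmative.
        # ln_near_horizontal(): callable, True when S35 is affirmative.
        if not speak_pressed:          # S31 is a precondition in this variant
            return False
        if person_separated():         # S33: Yes -> speakerphone mode (S36/S37)
            return True
        return ln_near_horizontal()    # S35 decides between S36 and S39

    # Example: button pressed, user not detected as separated, but the device
    # is held flat to view the screen -> speakerphone mode is selected.
    print(variant_call_request_detected(True, lambda: False, lambda: True))  # True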
 このような変形形態における被監視者監視システムMS、端末装置SP、TAbおよびこれに実装された動作制御方法は、「話す」ボタン525の入力操作を受け付けて音声通話を実行する場合に、例えば図9Bに示すようにTA表示部36の画像を見ながら通話したい場合と、例えば図9Aに示すようにTA表示部36の画像を見ずに通話したい場合とを自動的に切り分けることができ、TA表示部36の画像を見ながら通話したい場合のみ、第2電気機械変換部342を送話口および受話口として動作するように制御できる。 The monitored person monitoring system MS, the terminal device SP, TAb, and the operation control method implemented therein in such a modification form, for example, when receiving an input operation of the “speak” button 525 and executing a voice call, for example, FIG. As shown in FIG. 9B, it is possible to automatically distinguish between a case where the user wants to talk while looking at the image on the TA display unit 36 and a case where the user wants to talk without looking at the image on the TA display unit 36 as shown in FIG. Only when it is desired to make a call while viewing the image on the display unit 36, the second electromechanical conversion unit 342 can be controlled to operate as a mouthpiece and a mouthpiece.
 本明細書は、上記のように様々な態様の技術を開示しているが、そのうち主な技術を以下に纏める。 This specification discloses various modes of technology as described above, and the main technologies are summarized below.
 一態様にかかる端末装置は、撮像を行う撮像部と通話を行う通話部とを備えるセンサ装置と通信可能に接続される端末装置であって、表示を行う表示部と、電気信号を機械振動信号に変換し、受話口として動作する第1電気機械変換部と、電気信号と機械振動信号との間で相互に変換し、送話口として動作する第2電気機械変換部と、前記センサ装置の前記通話部との間における通話の要求を検出する通話検出部と、前記通話検出部の検出結果を受け、前記表示部、前記第1電気機械変換部および前記第2電気機械変換部それぞれを制御する制御部とを備える。そして、前記制御部は、前記センサ装置の前記撮像部で撮像した画像を前記表示部に表示している場合に、さらに、前記通話検出部が前記通話の要求を検出した場合、前記第2電気機械変換部を前記送話口および前記受話口として動作するように制御する。好ましくは、上述の端末装置において、前記制御部は、前記センサ装置の前記撮像部で撮像した画像を前記表示部に表示している場合に、さらに、前記通話検出部が前記通話の要求を検出した場合、前記第1電気機械変換部の動作を停止し、前記第2電気機械変換部を前記送話口および前記受話口として動作するように制御する。 A terminal device according to one aspect is a terminal device communicably connected to a sensor device including an imaging unit that performs imaging and a calling unit that performs a call, and a display unit that performs display, and an electrical signal as a mechanical vibration signal A first electromechanical converter that operates as a mouthpiece, a second electromechanical converter that converts between an electric signal and a mechanical vibration signal, and operates as a mouthpiece, and the sensor device A call detection unit that detects a request for a call with the call unit, and a detection result of the call detection unit, and controls each of the display unit, the first electromechanical conversion unit, and the second electromechanical conversion unit A control unit. And when the said control part is displaying the image imaged by the said imaging part of the said sensor apparatus on the said display part, and also when the said call detection part detects the request | requirement of the said call, said 2nd electricity The machine conversion unit is controlled to operate as the mouthpiece and the earpiece. Preferably, in the above terminal device, when the control unit displays an image captured by the imaging unit of the sensor device on the display unit, the call detection unit further detects the call request. In this case, the operation of the first electromechanical conversion unit is stopped, and the second electromechanical conversion unit is controlled to operate as the mouthpiece and the earpiece.
 このような端末装置は、センサ装置の撮像部で撮像した画像を表示部に表示している場合に、さらに、通話検出部が通話の要求を検出した場合、第2電気機械変換部を送話口および受話口として動作するように制御する制御部を備えるので、通話のために、端末装置の第1電気機械変換部を耳元に近づけ、その第2電気機械変換部を口元に近づけなくても、第2電気機械変換部で送話でき、受話できる。したがって、上記端末装置は、画像を見ながら被監視者と通話できる。 In such a terminal device, when an image captured by the imaging unit of the sensor device is displayed on the display unit, and further when the call detection unit detects a call request, the terminal device transmits the second electromechanical conversion unit. Since the control unit that controls to operate as the mouth and the earpiece is provided, the first electromechanical conversion unit of the terminal device does not need to be close to the ear and the second electromechanical conversion unit does not need to be close to the mouth for communication. The second electromechanical converter can send and receive speech. Therefore, the terminal device can talk with the monitored person while viewing the image.
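 A minimal structural sketch of the terminal device summarized above is given below in Python; the class layout, attribute names, and method names are invented for illustration and are not limitations of the claimed device.

    from dataclasses import dataclass, field

    @dataclass
    class Converter:
        role: str = "idle"   # "earpiece", "mouthpiece", "speakerphone", "stopped"

    @dataclass
    class TerminalDevice:
        displaying_sensor_image: bool = False
        first: Converter = field(default_factory=Converter)    # receiver side (341)
        second: Converter = field(default_factory=Converter)   # transmitter side (342)

        def on_call_detection(self, call_requested: bool) -> None:
            # Summarized behaviour: when an image from the sensor device is shown
            # and a call request is detected, the second converter operates as
            # both mouthpiece and earpiece (and, preferably, the first stops).
            if self.displaying_sensor_image and call_requested:
                self.first.role = "stopped"
                self.second.role = "speakerphone"
            else:
                self.first.role = "earpiece"
                self.second.role = "mouthpiece"

    t = TerminalDevice(displaying_sensor_image=True)
    t.on_call_detection(True)
    print(t.first.role, t.second.role)   # stopped speakerphone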
 他の一態様では、上述の端末装置において、前記通話検出部は、前記通話の要求を受け付ける通話要求入力部を含む。 In another aspect, in the above-described terminal device, the call detection unit includes a call request input unit that receives the call request.
 このような端末装置は、通話要求入力部で監視者(ユーザ)による通話の要求を受け付けることができ、監視者における通話の要求を直接的に検出できる。 Such a terminal device can accept a call request by a supervisor (user) at a call request input unit, and can directly detect a call request by a supervisor.
 他の一態様では、これら上述の端末装置において、前記通話検出部は、人の離接を検知する離接センサ部と、前記離接センサ部の第1センサ出力に基づいて、人の離間を前記通話の要求として判定する離接判定部とを含むことを特徴とする。好ましくは、上述の端末装置において、前記離接センサ部は、静電容量式の人感センサである。また好ましくは、上述の端末装置において、前記離接センサ部は、赤外線式の人感センサである。 In another aspect, in the above-described terminal devices, the call detection unit detects the separation of the person based on the separation sensor unit that detects the separation of the person and the first sensor output of the separation sensor unit. And a connection / disconnection determination unit that determines the call request. Preferably, in the above-described terminal device, the separation sensor unit is a capacitive human sensor. Preferably, in the terminal device described above, the separation sensor unit is an infrared human sensor.
 このような端末装置は、離接センサ部の第1センサ出力に基づいて人の離接を判定し、人の離間と判定した場合を前記通話の要求として判定する。したがって、上記端末装置は、監視者における通話の要求を離接センサ部および離接判定部によって自動的に検出できる。 Such a terminal device determines a person's separation / contact based on the first sensor output of the separation / contact sensor unit, and determines a case where it is determined that the person is separated as a request for the call. Therefore, the terminal device can automatically detect a call request of the supervisor by the separation sensor unit and the separation determination unit.
 他の一態様では、これら上述の端末装置において、前記通話検出部は、当該端末装置の姿勢を検出する姿勢センサ部と、前記姿勢センサ部の第2センサ出力に基づいて、当該端末装置の姿勢が、前記第1電気機械変換部の第1配設位置と前記第2電気機械変換部の第2配設位置とを結ぶ線分の延長方向が垂直方向よりも水平方向に近い姿勢である場合を前記通話の要求として判定する姿勢判定部とを含む。好ましくは、上述の端末装置において、前記姿勢センサ部は、ジャイロセンサである。 In another aspect, in the above-described terminal devices, the call detection unit is configured to detect a posture of the terminal device based on a posture sensor unit that detects a posture of the terminal device and a second sensor output of the posture sensor unit. However, when the extending direction of the line segment connecting the first arrangement position of the first electromechanical conversion unit and the second arrangement position of the second electromechanical conversion unit is a posture closer to the horizontal direction than the vertical direction. And an attitude determination unit that determines as a call request. Preferably, in the terminal device described above, the posture sensor unit is a gyro sensor.
 このような端末装置は、姿勢センサ部の第2センサ出力に基づいて当該端末装置の姿勢を判定し、当該端末装置の姿勢が、第1電気機械変換部の第1配設位置と第2電気機械変換部の第2配設位置とを結ぶ線分の延長方向が垂直方向よりも水平方向に近い姿勢であると判定した場合を前記通話の要求として判定する。したがって、上記端末装置は、監視者における通話の要求を姿勢センサ部および姿勢判定部によって自動的に検出できる。 Such a terminal device determines the posture of the terminal device based on the second sensor output of the posture sensor unit, and the posture of the terminal device is determined based on the first arrangement position of the first electromechanical converter and the second electric device. The case where it is determined that the extension direction of the line connecting the second arrangement position of the machine conversion unit is in a posture closer to the horizontal direction than the vertical direction is determined as the request for the call. Therefore, the terminal device can automatically detect a call request from the supervisor by the attitude sensor unit and the attitude determination unit.
 他の一態様では、上述の端末装置において、前記通話検出部は、前記通話の要求を受け付ける通話要求入力部と、人の離接を検知する離接センサ部および前記離接センサ部の第1センサ出力に基づいて人の離間を前記通話の要求として判定する離接判定部、ならびに、当該端末装置の姿勢を検出する姿勢センサ部および前記姿勢センサ部の第2センサ出力に基づいて当該端末装置の姿勢が、前記第1電気機械変換部の第1配設位置と前記第2電気機械変換部の第2配設位置とを結ぶ線分の延長方向が垂直方向よりも水平方向に近い姿勢である場合を前記通話の要求として判定する姿勢判定部、のうちの少なくとも一方とを含み、前記制御部は、前記通話要求入力部で前記通話の要求を受け付け、さらに、前記少なくとも一方によって前記通話の要求を判定した場合を、前記通話検出部が前記通話の要求を検出した場合とする。 In another aspect, in the terminal device described above, the call detection unit includes a call request input unit that receives the call request, a separation sensor unit that detects a person's separation and contact, and a first of the separation sensor unit. A separation / contact determination unit that determines a person's separation as a request for the call based on a sensor output, a posture sensor unit that detects a posture of the terminal device, and a second sensor output of the posture sensor unit. The orientation of the line segment connecting the first placement position of the first electromechanical transducer and the second placement position of the second electromechanical transducer is closer to the horizontal direction than the vertical direction. At least one of attitude determination units that determine a certain case as a request for the call, and the control unit receives the call request at the call request input unit, and further, the communication request is received by the at least one. When determining the request, and if the call detecting unit detects the request for the call.
 このような端末装置は、表示部の画像を見ながら通話したい場合と、表示部の画像を見ずに通話したい場合とを切り分けることができ、前記表示部の画像を見ながら通話したい場合のみ、第2電気機械変換部を送話口および受話口として動作するように制御できる。 Such a terminal device can separate a case where it is desired to make a call while looking at the image on the display unit and a case where it is desired to make a call without looking at the image on the display unit. The second electromechanical converter can be controlled to operate as a mouthpiece and a mouthpiece.
 他の一態様にかかる端末装置の動作制御方法は、撮像を行う撮像部と通話を行う通話部とを備えるセンサ装置と通信可能に接続される端末装置の動作を制御する端末装置の動作制御方法であって、前記センサ装置の前記通話部との間における通話の要求を検出する通話検出工程と、表示を行う表示部、電気信号を機械振動信号に変換し、受話口として動作する第1電気機械変換部、および、電気信号と機械振動信号との間で相互に変換し、送話口として動作する第2電気機械変換部それぞれを制御する制御工程とを備える。そして、前記制御工程は、前記センサ装置の前記撮像部で撮像した画像を前記表示部に表示している場合に、さらに、前記通話検出工程で前記通話の要求を検出した場合、前記第2電気機械変換部を前記送話口および前記受話口として動作するように制御する。 According to another aspect of the present invention, there is provided an operation control method for a terminal device that controls an operation of a terminal device that is communicably connected to a sensor device including an imaging unit that performs imaging and a communication unit that performs a call. A call detection step for detecting a request for a call with the call unit of the sensor device, a display unit for displaying, and a first electric device that converts an electrical signal into a mechanical vibration signal and operates as a receiver. A mechanical conversion unit, and a control step of controlling each of the second electromechanical conversion units that convert each other between an electric signal and a mechanical vibration signal and operate as a mouthpiece. In the control step, when the image picked up by the image pickup unit of the sensor device is displayed on the display unit, and when the call request is detected in the call detection step, the second electric The machine conversion unit is controlled to operate as the mouthpiece and the earpiece.
 このような端末装置の動作制御方法は、センサ装置の撮像部で撮像した画像を表示部に表示している場合に、さらに、通話検出工程で通話の要求を検出した場合、第2電気機械変換部を送話口および前記受話口として動作するように制御する制御工程を備えるので、通話のために、端末装置の第1電気機械変換部を耳元に近づけ、その第2電気機械変換部を口元に近づけなくても、第2電気機械変換部で送話でき、受話できる。したがって、上記端末装置の動作制御方法は、画像を見ながら被監視者と通話できる。 In such a terminal device operation control method, when the image picked up by the image pickup unit of the sensor device is displayed on the display unit, and further when a call request is detected in the call detection step, the second electromechanical conversion is performed. A control step of controlling the unit to operate as a mouthpiece and the earpiece, so that for a call, the first electromechanical conversion unit of the terminal device is brought close to the ear and the second electromechanical conversion unit is moved to the mouth Even if it is not close to, the second electromechanical converter can send and receive. Therefore, the operation control method of the terminal device can make a call with the monitored person while viewing the image.
 他の一態様にかかる被監視者監視システムは、撮像を行う撮像部と通話を行う通話部とを備えるセンサ装置と、前記センサ装置と通信可能に接続される端末装置とを備え、前記撮像部で撮像された画像に基づいて監視対象である被監視者における所定の行動を検知して検知結果を前記端末装置に報知する被監視者監視システムであって、前記端末装置は、これら上述のいずれかの端末装置である。 A monitored person monitoring system according to another aspect includes a sensor device including an imaging unit that performs imaging and a calling unit that performs a call, and a terminal device that is connected to be communicable with the sensor device, and the imaging unit A monitored person monitoring system that detects a predetermined action in a monitored person that is a monitoring target based on an image captured in step (b) and notifies the terminal device of a detection result, wherein the terminal apparatus Is a terminal device.
 このような被監視者監視システムは、これら上述のいずれかの端末装置を備えるので、画像を見ながら被監視者と通話できる。 Since such a monitored person monitoring system includes any of the above-described terminal devices, it is possible to talk with the monitored person while viewing the image.
 この出願は、2016年2月16日に出願された日本国特許出願特願2016-26662を基礎とするものであり、その内容は、本願に含まれるものである。 This application is based on Japanese Patent Application No. 2016-26662 filed on Feb. 16, 2016, the contents of which are included in the present application.
 本発明を表現するために、上述において図面を参照しながら実施形態を通して本発明を適切且つ十分に説明したが、当業者であれば上述の実施形態を変更および/または改良することは容易に為し得ることであると認識すべきである。したがって、当業者が実施する変更形態または改良形態が、請求の範囲に記載された請求項の権利範囲を離脱するレベルのものでない限り、当該変更形態または当該改良形態は、当該請求項の権利範囲に包括されると解釈される。 In order to express the present invention, the present invention has been properly and fully described through the embodiments with reference to the drawings. However, those skilled in the art can easily change and / or improve the above-described embodiments. It should be recognized that this is possible. Therefore, unless the modifications or improvements implemented by those skilled in the art are at a level that departs from the scope of the claims recited in the claims, the modifications or improvements are not covered by the claims. To be construed as inclusive.
 本発明によれば、端末装置および端末装置の動作制御方法ならびに被監視者監視システムが提供できる。 According to the present invention, it is possible to provide a terminal device, an operation control method for the terminal device, and a monitored person monitoring system.

Claims (7)

  1.  撮像を行う撮像部と通話を行う通話部とを備えるセンサ装置と通信可能に接続される端末装置であって、
     表示を行う表示部と、
     電気信号を機械振動信号に変換し、受話口として動作する第1電気機械変換部と、
     電気信号と機械振動信号との間で相互に変換し、送話口として動作する第2電気機械変換部と、
     前記センサ装置の前記通話部との間における通話の要求を検出する通話検出部と、
     前記通話検出部の検出結果を受け、前記表示部、前記第1電気機械変換部および前記第2電気機械変換部それぞれを制御する制御部とを備え、
     前記制御部は、前記センサ装置の前記撮像部で撮像した画像を前記表示部に表示している場合に、さらに、前記通話検出部が前記通話の要求を検出した場合、前記第2電気機械変換部を前記送話口および前記受話口として動作するように制御する、
     端末装置。
    A terminal device communicably connected to a sensor device including an imaging unit that performs imaging and a calling unit that performs a call,
    A display unit for displaying, and
    A first electromechanical converter that converts an electrical signal into a mechanical vibration signal and operates as an earpiece;
    A second electromechanical converter that converts between the electrical signal and the mechanical vibration signal and operates as a mouthpiece;
    A call detection unit for detecting a request for a call with the call unit of the sensor device;
    A control unit that receives the detection result of the call detection unit and controls each of the display unit, the first electromechanical conversion unit, and the second electromechanical conversion unit;
     the control unit, when an image captured by the imaging unit of the sensor device is displayed on the display unit and, further, the call detection unit detects the request for the call, controls the second electromechanical conversion unit to operate as the mouthpiece and the earpiece,
    Terminal device.
  2.  前記通話検出部は、前記通話の要求を受け付ける通話要求入力部を含む、
     請求項1に記載の端末装置。
    The call detection unit includes a call request input unit that receives a request for the call.
    The terminal device according to claim 1.
  3.  前記通話検出部は、人の離接を検知する離接センサ部と、前記離接センサ部の第1センサ出力に基づいて、人の離間を前記通話の要求として判定する離接判定部とを含む、
     請求項1または請求項2に記載の端末装置。
     The call detection unit includes a separation/contact sensor unit that detects separation/contact of a person, and a separation/contact determination unit that determines, based on a first sensor output of the separation/contact sensor unit, separation of the person as the request for the call,
    The terminal device according to claim 1 or 2.
  4.  前記通話検出部は、当該端末装置の姿勢を検出する姿勢センサ部と、前記姿勢センサ部の第2センサ出力に基づいて、当該端末装置の姿勢が、前記第1電気機械変換部の第1配設位置と前記第2電気機械変換部の第2配設位置とを結ぶ線分の延長方向が垂直方向よりも水平方向に近い姿勢である場合を前記通話の要求として判定する姿勢判定部とを含む、
      請求項1ないし請求項3のいずれか1項に記載の端末装置。
     The call detection unit includes a posture sensor unit that detects a posture of the terminal device, and a posture determination unit that determines, as the request for the call, a case in which, based on a second sensor output of the posture sensor unit, the posture of the terminal device is a posture in which an extension direction of a line segment connecting a first arrangement position of the first electromechanical conversion unit and a second arrangement position of the second electromechanical conversion unit is closer to the horizontal direction than to the vertical direction,
     The terminal device according to any one of claims 1 to 3.
  5.  前記通話検出部は、
     前記通話の要求を受け付ける通話要求入力部と、
     人の離接を検知する離接センサ部および前記離接センサ部の第1センサ出力に基づいて人の離間を前記通話の要求として判定する離接判定部、ならびに、当該端末装置の姿勢を検出する姿勢センサ部および前記姿勢センサ部の第2センサ出力に基づいて当該端末装置の姿勢が、前記第1電気機械変換部の第1配設位置と前記第2電気機械変換部の第2配設位置とを結ぶ線分の延長方向が垂直方向よりも水平方向に近い姿勢である場合を前記通話の要求として判定する姿勢判定部、のうちの少なくとも一方とを含み、
     前記制御部は、前記通話要求入力部で前記通話の要求を受け付け、さらに、前記少なくとも一方によって前記通話の要求を判定した場合を、前記通話検出部が前記通話の要求を検出した場合とする、
     請求項1に記載の端末装置。
    The call detection unit
    A call request input unit for receiving the call request;
     at least one of (i) a separation/contact sensor unit that detects separation/contact of a person and a separation/contact determination unit that determines, based on a first sensor output of the separation/contact sensor unit, separation of the person as the request for the call, and (ii) a posture sensor unit that detects a posture of the terminal device and a posture determination unit that determines, as the request for the call, a case in which, based on a second sensor output of the posture sensor unit, the posture of the terminal device is a posture in which an extension direction of a line segment connecting a first arrangement position of the first electromechanical conversion unit and a second arrangement position of the second electromechanical conversion unit is closer to the horizontal direction than to the vertical direction,
     The control unit treats, as the case where the call detection unit detects the request for the call, a case where the call request input unit accepts the request for the call and, further, the at least one determines the request for the call,
    The terminal device according to claim 1.
  6.  撮像を行う撮像部と通話を行う通話部とを備えるセンサ装置と通信可能に接続される端末装置の動作を制御する端末装置の動作制御方法であって、
     前記センサ装置の前記通話部との間における通話の要求を検出する通話検出工程と、
     表示を行う表示部、電気信号を機械振動信号に変換し、受話口として動作する第1電気機械変換部、および、電気信号と機械振動信号との間で相互に変換し、送話口として動作する第2電気機械変換部それぞれを制御する制御工程とを備え、
     前記制御工程は、前記センサ装置の前記撮像部で撮像した画像を前記表示部に表示している場合に、さらに、前記通話検出工程で前記通話の要求を検出した場合、前記第2電気機械変換部を前記送話口および前記受話口として動作するように制御する、
     端末装置の動作制御方法。
    An operation control method for a terminal device that controls the operation of a terminal device that is communicably connected to a sensor device that includes an imaging unit that performs imaging and a communication unit that performs a call,
    A call detection step of detecting a request for a call with the call unit of the sensor device;
     a control step of controlling each of a display unit that performs display, a first electromechanical conversion unit that converts an electric signal into a mechanical vibration signal and operates as an earpiece, and a second electromechanical conversion unit that converts between an electric signal and a mechanical vibration signal and operates as a mouthpiece,
     In the control step, when an image captured by the imaging unit of the sensor device is displayed on the display unit and, further, the request for the call is detected in the call detection step, the second electromechanical conversion unit is controlled to operate as the mouthpiece and the earpiece,
    Operation control method of terminal device.
  7.  撮像を行う撮像部と通話を行う通話部とを備えるセンサ装置と、前記センサ装置と通信可能に接続される端末装置とを備え、前記撮像部で撮像された画像に基づいて監視対象である被監視者における所定の行動を検知して検知結果を前記端末装置に報知する被監視者監視システムであって、
     前記端末装置は、請求項1ないし請求項5のいずれか1項に記載の端末装置である、
     被監視者監視システム。
     A monitored person monitoring system comprising a sensor device including an imaging unit that performs imaging and a call unit that performs a call, and a terminal device communicably connected to the sensor device, the monitored person monitoring system detecting a predetermined action of a monitored person who is a monitoring target based on an image captured by the imaging unit and notifying the terminal device of a detection result,
    The terminal device is the terminal device according to any one of claims 1 to 5.
    Monitored person monitoring system.
PCT/JP2017/003833 2016-02-16 2017-02-02 Terminal device, terminal device operation control method, and monitored person monitoring system WO2017141721A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017536361A JP6245415B1 (en) 2016-02-16 2017-02-02 Terminal device, operation control method of terminal device, and monitored person monitoring system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016026662 2016-02-16
JP2016-026662 2016-02-16

Publications (1)

Publication Number Publication Date
WO2017141721A1 true WO2017141721A1 (en) 2017-08-24

Family

ID=59625122

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/003833 WO2017141721A1 (en) 2016-02-16 2017-02-02 Terminal device, terminal device operation contol method, and monitored person monitoring system

Country Status (2)

Country Link
JP (1) JP6245415B1 (en)
WO (1) WO2017141721A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0658967A (en) * 1992-04-01 1994-03-04 American Teleph & Telegr Co <Att> Capacitive proximity sensor
JP2002540680A (en) * 1999-03-19 2002-11-26 エリクソン インコーポレイテッド Communication device and method for operating according to communication device orientation determined with reference to gravity sensor
JP2004221638A (en) * 2003-01-09 2004-08-05 Nappu Enterprise Kk Circuit shared for transmission and reception
JP2005020435A (en) * 2003-06-26 2005-01-20 Nec Commun Syst Ltd Mobile phone with loudspeaking function, loudspeaking method and its program
JP2011114734A (en) * 2009-11-30 2011-06-09 Aiphone Co Ltd Television intercom device
JP2015061228A (en) * 2013-09-19 2015-03-30 アイホン株式会社 Nurse call system

Also Published As

Publication number Publication date
JP6245415B1 (en) 2017-12-13
JPWO2017141721A1 (en) 2018-02-22

Similar Documents

Publication Publication Date Title
WO2017146012A1 (en) Monitored-person monitoring device, method and system
WO2017209094A1 (en) Monitoring system
JP7044060B2 (en) Observer monitoring device, method and system
JP6226110B1 (en) Monitored person monitoring apparatus, method and system
JP2017148504A (en) Device for monitoring person to be monitored, method and system thereof
JP6740633B2 (en) Central processing unit and central processing method of monitored person monitoring system, and monitored person monitoring system
JP6504255B2 (en) Care support system and care support method
WO2017179605A1 (en) Watching system and management server
JP6150025B1 (en) Display device and display method of monitored person monitoring system, and monitored person monitoring system
JP6245415B1 (en) Terminal device, operation control method of terminal device, and monitored person monitoring system
JP6895090B2 (en) Detection system and display processing method of detection system
JP6172424B1 (en) Terminal device, terminal device control method, and monitored person monitoring system
JP6292363B2 (en) Terminal device, terminal device display method, and monitored person monitoring system
JP2017151676A (en) Monitored person monitor device, method of monitoring monitored person and program thereof
JP6187732B1 (en) Terminal device, terminal device operation control method, and monitored person monitoring system
WO2017145832A1 (en) Device, method, and system for monitoring persons to be monitored
WO2017188156A1 (en) Terminal apparatus of monitored person monitoring system, terminal apparatus control method, and monitored person monitoring system
JP7234931B2 (en) Sensor Device of Monitored Person Monitoring Support System, Processing Method of Sensor Device, and Monitored Person Monitoring Support System
WO2017130684A1 (en) Monitored-person monitoring device, method thereof, and system thereof
WO2017145520A1 (en) Terminal device, control method for terminal device, and system for monitoring persons to be monitored
JP2020188487A (en) Central processing device, monitored person monitoring method, and monitored person monitoring system
JP6150026B1 (en) Central processing unit and method of monitored person monitoring system, and monitored person monitoring system

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017536361

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17752984

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17752984

Country of ref document: EP

Kind code of ref document: A1