WO2019013105A1 - Watching support system and control method thereof - Google Patents

Watching support system and control method thereof

Info

Publication number
WO2019013105A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
watching
area
watched
image
Prior art date
Application number
PCT/JP2018/025596
Other languages
English (en)
Japanese (ja)
Inventor
信二 高橋
田中 清明
純平 松永
達哉 村上
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社 filed Critical オムロン株式会社
Publication of WO2019013105A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02: Alarms for ensuring the safety of persons
    • G08B 21/18: Status alarms
    • G08B 21/22: Status alarms responsive to presence or absence of persons
    • G08B 25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01: Alarm systems characterised by the transmission medium
    • G08B 25/04: Alarm systems using a single signalling line, e.g. in a closed loop

Definitions

  • The present invention relates to a technique for supporting the watching over of a subject on a bed.
  • Patent Document 1 proposes a system which detects a patient from an image photographed by a camera (a captured image), determines the patient's movements based on the detection result, and issues a notification according to the judgment result.
  • The present invention has been made in view of the above circumstances, and its object is to provide a technology capable of suppressing unnecessary notifications and thereby obtaining effects such as reducing the burden on system users and protecting privacy.
  • To this end, the present invention adopts a method of omitting the notification when it is determined that the subject is being watched over by another person.
  • A first aspect of the present invention is a watching support system for watching over a target person on a bed, comprising: an image acquisition unit that acquires an image photographed by an imaging device; a watching determination unit that determines whether or not the subject is being watched over by another person, based on the image of a determination region that is a partial region of the photographed image and includes the bed; a state determination unit that determines the state of the subject based on the photographed image; and an output unit that issues a notification according to the determination result of the state determination unit, wherein the output unit omits the notification when the watching determination unit determines that the subject is being watched over by another person.
  • With this configuration, whether or not the subject is being watched over by another person is determined from an image of the bed area and its surroundings.
  • In hospitals and care facilities, a nurse, a caregiver, or another person watches over a target person such as a patient or a care recipient.
  • In such a situation, the subject is likely to be in the bed and the other person is likely to be near it. The above configuration can therefore determine with high accuracy whether the target person is being watched over by another person.
  • A notification is needed to watch over the subject indirectly while no one is watching directly, and is unnecessary when the subject is being watched over directly by another person.
  • Accordingly, the notification is omitted when it is determined that the target person is being watched over by another person.
  • Unnecessary notifications can thus be suppressed, yielding effects such as a reduced burden on system users and better privacy protection. For example, the time and effort that a system user would spend confirming unnecessary notifications is saved.
  • Unnecessary notification of images of persons not involved in using the system, such as visitors, can also be suppressed, protecting their privacy.
  • The watching determination unit may determine that the target person is being watched over when two or more persons exist in the determination area, because two or more persons are highly likely to include the target person and another person watching over him or her.
  • Alternatively, when a first person present on the bed and a second person facing the first person exist in the determination area, the watching determination unit may determine that the target person is being watched over by another person. The first person present on the bed is likely to be the target person, and the second person facing the first person is likely to be someone watching over the target person.
  • The watching determination unit may determine a person to be the second person facing the first person when the angle between that person's face-direction vector and the vector directed toward the first person is equal to or less than a threshold, because such a person is likely to be facing the first person.
  • The watching determination unit may also determine that the target person is being watched over by another person when a person performing a watched action and a person performing a watching action exist in the determination area. A person performing a watched action is highly likely to be a target person being watched over, and a person performing a watching action is highly likely to be another person watching over the target person.
  • The present invention can be understood as a watching support system having at least a part of the above configuration or functions.
  • The present invention can also be regarded as a watching support method or a control method of a watching support system including at least a part of the above processing, as a program for causing a computer to execute these methods, or as a computer-readable recording medium on which such a program is non-transitorily recorded.
  • FIG. 1 is a block diagram schematically showing a hardware configuration and a functional configuration of a watching support system.
  • FIG. 2 is a view showing an installation example of the imaging device.
  • FIG. 3A is an example of a photographed image.
  • FIGS. 3B and 3D are examples of a watching determination area.
  • FIG. 3C is an example of an action determination area.
  • FIG. 4 is a flowchart of the state monitoring process.
  • FIG. 5 is a flowchart of the watching determination process.
  • FIGS. 6A and 6B are examples of photographed images.
  • FIG. 7 is a flowchart of the watching determination process (first modification).
  • FIGS. 8A and 8B are examples of photographed images.
  • FIG. 9 is a flowchart of the watching determination process (second modification).
  • The present invention relates to a technique for supporting the watching over of a subject on a bed.
  • This technology can be applied to a system that automatically detects the getting-up and bed-leaving behavior of patients or care recipients in hospitals, nursing facilities, and the like, and issues a necessary notification when a dangerous state occurs.
  • This system can be preferably used, for example, for watching over elderly people, patients with dementia, and children.
  • FIG. 1 is a block diagram schematically showing the hardware configuration and the functional configuration of the watching support system 1, and FIG. 2 shows an installation example of the imaging device.
  • The watching support system 1 includes an imaging device 10 and an information processing device 11 as its main hardware.
  • The imaging device 10 and the information processing device 11 are connected by wire or wirelessly. Although only one imaging device 10 is shown in FIG. 1, a plurality of imaging devices 10 may be connected to the information processing device 11.
  • The imaging device 10 is a device for imaging the subject on the bed and capturing image data.
  • A monochrome or color visible-light camera, an infrared camera, a three-dimensional camera, or the like can be used as the imaging device 10.
  • In this embodiment, an imaging device 10 composed of infrared LED illumination 100 and a near-infrared camera 101 is adopted so that the subject can be watched over even at night (even when the room is dark).
  • The imaging device 10 is installed so as to look over the entire bed 20 from the head side toward the foot side.
  • The imaging device 10 captures images at a predetermined time interval (for example, 30 fps), and the image data are sequentially taken into the information processing device 11.
  • The orientation of the imaging device 10 is not particularly limited: it may be installed facing the bed 20 from the foot side, or from the side of the bed 20.
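  • A minimal sketch of this frame intake, assuming the imaging device is reachable as an OpenCV video source; the device index is an assumption, while the 30 fps rate is the example from the text.

      import cv2

      cap = cv2.VideoCapture(0)          # hypothetical camera index
      cap.set(cv2.CAP_PROP_FPS, 30)      # example capture rate of 30 fps

      ok, frame = cap.read()             # one frame, taken in sequentially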
  • The information processing device 11 has a function of analyzing the image data taken in from the imaging device 10 in real time, automatically detecting the rising or bed-leaving state of the target person 21 on the bed 20, and notifying when necessary.
  • The information processing device 11 includes, as specific functions, an image acquisition unit 110, a detection unit 111, a watching determination unit 112, an action determination unit 113, an output unit 114, an area setting unit 115, and a storage unit 116.
  • The information processing device 11 according to this embodiment is a general-purpose computer comprising a CPU (processor), memory, storage (HDD, SSD, etc.), input devices (keyboard, mouse, touch panel, etc.), output devices (display, speaker, etc.), a communication interface, and the like.
  • Each function of the information processing device 11 described above is realized by the CPU executing a program stored in the storage or the memory.
  • The configuration of the information processing device 11 is not limited to this example: distributed computing may be performed by a plurality of computers, a part of the above functions may be performed by a cloud server, or a part of the above functions may be performed by a circuit such as an ASIC or an FPGA.
  • The image acquisition unit 110 is a function for acquiring the image (captured image) photographed by the imaging device 10. The input image data are temporarily recorded in the memory or storage and provided to the processing of the detection unit 111, the watching determination unit 112, the action determination unit 113, and so on.
  • The detection unit 111 analyzes the captured image acquired by the image acquisition unit 110 and detects the watching target person 21 and other persons (nurses, caregivers, etc.) from it. For example, the detection unit 111 detects a person's whole body or a part of it (head, face, upper body, etc.). Any method may be used to detect a human body or its parts from a photographed image; for example, an object-detection algorithm using a classifier based on classical Haar-like features or HoG features, or a recent method such as Faster R-CNN, can preferably be used.
  • The detection unit 111 of this embodiment detects the head (the part above the neck; for example, the head 22 of the target person 21) with a classifier using Haar-like features, and outputs the position (x, y) and size (numbers of vertical and horizontal pixels) of the head as the detection result.
  • The head position (x, y) is represented, for example, by the image coordinates of the center point of a rectangular frame surrounding the head.
  • While the detection unit 111 of this embodiment outputs the detection result as a position and size in the image coordinate system, it may instead convert the image coordinate system into a spatial coordinate system and output the three-dimensional position and size of the person in that spatial coordinate system.
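  • A minimal sketch of this detection step, assuming OpenCV: the text describes a head detector based on Haar-like features, and here OpenCV's bundled frontal-face Haar cascade stands in for such a detector.

      import cv2

      cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

      def detect_heads(frame):
          # Returns a list of ((x, y), (w, h)): the image coordinates of the
          # center of the rectangular frame surrounding each detection plus
          # its size, matching the output described for detection unit 111.
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          rects = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
          return [((x + w // 2, y + h // 2), (w, h)) for (x, y, w, h) in rects]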
  • The watching determination unit 112 is a function for determining whether or not the target person 21 is being watched over by another person, based on the image of the watching determination area, which is a partial area of the photographed image acquired by the image acquisition unit 110 and includes the bed 20. In the present embodiment, the watching determination unit 112 makes this determination based on the detection result of the detection unit 111 within the watching determination area. The processing of the watching determination unit 112 (the watching determination process) will be described later.
  • The action determination unit 113 is a function for determining the state or action of the target person 21 based on the captured image acquired by the image acquisition unit 110. In the present embodiment, the action determination unit 113 performs wake-up determination, bed-leaving determination, and the like based on the detection result of the detection unit 111 for the entire captured image.
  • The output unit 114 is a function for issuing a notification according to the determination result of the action determination unit 113. The output unit 114 issues a necessary notification when the action determination unit 113 detects a wake-up or bed-leaving action of the target person 21.
  • Depending on the degree of danger of the target person 21's action, the output unit 114 can switch whether a notification is necessary (for example, notifying only in a dangerous state), the content of the notification (for example, the message text), the notification means (for example, voice, e-mail, buzzer, or warning light), the notification destination (for example, nurse or doctor), the frequency of notification, and so on.
  • The output unit 114 omits the notification when the watching determination unit 112 determines that the target person 21 is being watched over by another person.
  • The area setting unit 115 is a function for setting determination areas (the watching determination area and the action determination areas) on the image captured by the imaging device 10. The watching support system 1 sets the determination areas based on the area of the bed 20 in the captured image.
  • The determination areas may be set manually or automatically. For manual setting, the area setting unit 115 may provide a user interface that lets the user input the bed area, or the determination areas themselves, in the captured image. For automatic setting, the area setting unit 115 may detect the bed area from the captured image by object recognition processing.
  • FIG. 3A is an example of a photographed image, FIG. 3B is an example of the watching determination area set for the photographed image of FIG. 3A, and FIG. 3C is an example of the action determination areas set for the same image.
  • The area setting unit 115 sets the watching determination area A0 and the action determination areas A1 to A3 on the basis of the bed area 30.
  • The watching determination area A0 is an area including the bed area 30 and its periphery, and corresponds to the range in which the target person 21 and another person may exist while the target person 21 is being watched over.
  • The action determination area A1 is an area set on the head side of the bed 20 and corresponds to the range in which the head 22 of the subject 21 may exist at bedtime (when the subject 21 is lying on the bed 20); it is hereinafter called the sleeping area A1.
  • The action determination area A2 is an area set on the foot side of the bed 20 and corresponds to the range in which the head 22 of the subject 21 may exist when getting up (when the subject 21 raises the upper body); it is hereinafter called the rising area A2.
  • The relative positions and sizes of the determination areas A0 to A2 with respect to the bed area 30 are determined in advance, so once the bed area 30 is specified, the range of each of these areas is determined by calculation.
  • The action determination area A3 is the area other than the action determination areas A1 and A2. When the target person 21 has left the bed 20, his or her head 22 exists in this area, hereinafter called the bed-leaving area A3.
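  • A minimal sketch of this area setting, assuming the bed area 30 is given as an axis-aligned rectangle (x, y, w, h) in image coordinates; the margin and head/foot split ratios are illustrative, since the text only states that the relative geometry is predetermined.

      def set_determination_areas(bed, img_w, img_h):
          # bed: the bed area 30 as (x, y, w, h) in image coordinates.
          bx, by, bw, bh = bed
          m = int(0.25 * bw)  # assumed margin around the bed for area A0
          a0 = (max(bx - m, 0), max(by - m, 0),
                min(bw + 2 * m, img_w), min(bh + 2 * m, img_h))
          a1 = (bx, by, bw, bh // 2)                 # sleeping area A1 (head side)
          a2 = (bx, by + bh // 2, bw, bh - bh // 2)  # rising area A2 (foot side)
          return a0, a1, a2    # the bed-leaving area A3 is everything outside A1/A2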
  • The watching determination unit 112 determines whether or not the detected positions of two or more heads belong to the watching determination area A0. If they do, it determines that the target person 21 is being watched over by another person; otherwise, it determines that the target person 21 is not being watched over.
  • The action determination unit 113 determines which of the action determination areas A1 to A3 the detected head position belongs to, and classifies the state of the target person 21 accordingly: the case where the head is detected in the sleeping area A1 is called the "sleeping state", the case where it is detected in the rising area A2 the "rising state", and the case where it is detected in the bed-leaving area A3 the "bed-leaving state".
  • The action determination unit 113 detects a change from the "sleeping state" to the "rising state" as a wake-up action, and a change from the "rising state" to the "bed-leaving state" as a bed-leaving action.
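  • A minimal sketch of this classification and transition detection, assuming the rectangular areas of the earlier sketch and head positions in image coordinates; the state labels encode the three states just described.

      def contains(area, pt):
          x, y, w, h = area
          return x <= pt[0] < x + w and y <= pt[1] < y + h

      def classify_state(head_pos, a1, a2):
          if contains(a1, head_pos):
              return "sleeping"      # head in sleeping area A1
          if contains(a2, head_pos):
              return "rising"        # head in rising area A2
          return "bed-leaving"       # head in bed-leaving area A3

      def detect_action(prev_state, state):
          if (prev_state, state) == ("sleeping", "rising"):
              return "wake-up"
          if (prev_state, state) == ("rising", "bed-leaving"):
              return "bed-leaving"
          return None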
  • The storage unit 116 is a function for storing various data used by the watching support system 1 in its processing. It provides a storage area for at least the various parameters (such as thresholds) used for wake-up determination and bed-leaving determination, the determination-area setting information, and the image data or detection results of a plurality of past frames (used to calculate movement speed and direction).
  • In step S40 of the state monitoring process of FIG. 4, the image acquisition unit 110 acquires a captured image of one frame from the imaging device 10; the acquired image is temporarily recorded in the storage unit 116.
  • In step S41, the detection unit 111 detects heads from the captured image acquired in step S40. Information on each detected head position (x-y coordinates) is recorded in the storage unit 116 in association with the photographing time of the image or its frame number.
  • In step S42, the watching determination unit 112 performs the watching determination process using the detection result of step S41.
  • In step S43, the process branches according to the determination result of step S42: if it is determined that the target person 21 is being watched over by another person, the processes of steps S44 and S45 are omitted and this flow ends; otherwise, the process proceeds to step S44.
  • In step S44, the action determination unit 113 determines the state or action of the target person 21 using the detection result of step S41, and in step S45 the output unit 114 issues a notification according to that determination result.
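  • The flow of steps S40 to S45 can be sketched as follows, reusing the helper functions from the sketches above; notify() stands in for the output unit 114 and is a hypothetical callback, and the watching determination of step S42 is represented here by the simple two-head count of this embodiment (the modifications below refine it).

      def monitor_step(frame, a0, a1, a2, prev_state, notify):
          heads = detect_heads(frame)                       # step S41
          in_a0 = [pos for pos, _ in heads if contains(a0, pos)]
          if len(in_a0) >= 2:                               # steps S42-S43: watched,
              return prev_state                             # so S44/S45 are omitted
          if in_a0:                                         # step S44
              state = classify_state(in_a0[0], a1, a2)
              action = detect_action(prev_state, state)
              if action is not None:
                  notify(action)                            # step S45
              return state
          return prev_state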
  • FIG. 5 is a flowchart of the watching determination process executed by the watching determination unit 112, and FIGS. 6A and 6B are examples of photographed images acquired by the image acquisition unit 110.
  • In step S50, the watching determination unit 112 determines whether or not two or more heads have been detected in the watching determination area A0. If two or more heads are detected there, the process proceeds to step S51; if only one head is detected there, the process proceeds to step S52.
  • In step S51, the watching determination unit 112 determines that the target person 21 is being watched over by another person (determination of watching); in step S52, it determines that the target person 21 is not being watched over (determination of no watching).
  • Consider the case where the captured image 60 of FIG. 6A is acquired in step S40. In step S41, only the head 61 of the target person 21 is detected in the watching determination area A0, so it is determined that the target person 21 is not being watched over by another person (step S52), and a notification is issued according to the state or action of the target person 21 (steps S44 and S45).
  • Next, consider the case where the captured image 62 of FIG. 6B is acquired. In step S41, the head 61 of the target person 21 and the head 63 of another person watching over the target person 21 are detected in the watching determination area A0, so it is determined that the target person 21 is being watched over by another person (step S51), and the notification of step S45 is omitted.
  • As described above, in the present embodiment it is determined from an image of the bed area and its surroundings whether or not the subject is being watched over by another person.
  • In hospitals and care facilities, a nurse, a caregiver, or another person watches over a target person such as a patient or a care recipient; the subject is likely to be in the bed and the other person near it, so the above configuration can determine with high accuracy whether the target person is being watched over by another person.
  • In the present embodiment, when two or more persons exist in the watching determination area, it is determined that the target person is being watched over by another person. Since two or more persons are highly likely to include the target person and another person watching over him or her, the watched state can be detected with high accuracy. Furthermore, when two or more persons appear in the photographed image, the state of another person could be erroneously detected as the state of the target person; the present embodiment also suppresses such erroneous detection.
  • A notification is needed to watch over the subject indirectly while no one is watching directly, and is unnecessary when the subject is being watched over directly by another person; accordingly, when it is determined that the target person is being watched over by another person, the notification is omitted.
  • Unnecessary notifications can thus be suppressed, yielding effects such as a reduced burden on system users and better privacy protection. For example, the time and effort that a system user would spend confirming unnecessary notifications is saved, unnecessary notification of images of other persons is suppressed, and their privacy is protected.
  • The above description of the embodiment merely illustrates the present invention; the invention is not limited to the specific embodiment above, and various modifications are possible within the scope of its technical idea.
  • For example, the shape and size of the watching determination area A0 are not particularly limited: a hexagonal watching determination area A0 may be set (see FIG. 3D).
  • Likewise, the watching determination process is not limited to the flowchart of FIG. 5. Modified examples of the watching determination process are described below.
  • FIG. 7 is a flowchart of the watching determination process (first modification) executed by the watching determination unit 112, and FIGS. 8A and 8B are examples of photographed images acquired by the image acquisition unit 110.
  • In step S70, the watching determination unit 112 determines whether or not two or more heads have been detected in the watching determination area A0. If two or more heads are detected there, the process proceeds to step S71; if only one head is detected there, the process proceeds to step S75.
  • In step S71, the watching determination unit 112 detects faces from the watching determination area A0 of the captured image acquired by the image acquisition unit 110. A face is detected within an area where a head has been detected; however, for a head seen from behind, no face is detected, so a head may be detected without a corresponding face.
  • In step S72, the watching determination unit 112 determines whether one or more faces were detected in step S71. If one or more faces were detected, the process proceeds to step S73; if no face was detected, the process proceeds to step S75.
  • In step S73, based on the detection results of steps S41 and S71, the watching determination unit 112 determines whether a first person present on the bed 20 and a second person facing the first person both exist in the watching determination area A0. Since the first person is considered to be watched over by the second person, the first person is hereinafter called the "watched person" and the second person the "watcher". If both a watched person and a watcher exist, the process proceeds to step S74; otherwise, it proceeds to step S75.
  • In step S74, the watching determination unit 112 determines that the target person 21 is being watched over by another person (determination of watching); in step S75, it determines that the target person 21 is not being watched over (determination of no watching).
  • Consider the case where the captured image 80 of FIG. 8A is acquired in step S40. In step S41, the head 81 of the target person 21 and the head 82 of another person who is not watching over the target person 21 are detected in the watching determination area A0, and in step S71 the face 83 of the other person is detected.
  • In step S73, the following processing is performed. First, the watching determination unit 112 determines the head 81 detected on the bed 20 to be the head of the watched person. Next, it calculates the vector 84 of the direction of the face 83 and the vector 85 directed from the face 83 (head 82) toward the head 81, and judges whether the angle θ1 (< 180 degrees) formed by the vectors 84 and 85 is equal to or less than a threshold. The threshold is an angle for determining whether a person is facing the watched person, and is, for example, 45 degrees.
  • Since the angle θ1 exceeds the threshold, the watching determination unit 112 determines that the person with the face 83 is not a watcher but a person not facing the watched person. That is, it determines that a watched person exists in the watching determination area A0 but no watcher does. Therefore, it is determined that the target person 21 is not being watched over by another person (step S75), and a notification is issued according to the state or action of the target person 21 (steps S44 and S45).
  • Next, consider the case where the captured image 86 of FIG. 8B is acquired in step S40. In step S41, the head 81 of the target person 21 and the head 87 of another person watching over the target person 21 are detected in the watching determination area A0, and in step S71 the face 88 of the other person is detected.
  • In step S73, the following processing is performed. First, the watching determination unit 112 determines the head 81 detected on the bed 20 to be the head of the watched person. Next, it calculates the vector 89 of the direction of the face 88 and the vector 90 directed from the face 88 (head 87) toward the head 81, and judges whether the angle θ2 (< 180 degrees) formed by the vectors 89 and 90 is equal to or less than the threshold.
  • Since the angle θ2 is equal to or less than the threshold, the watching determination unit 112 determines that the person with the face 88 is a watcher. That is, it determines that both a watched person and a watcher exist in the watching determination area A0. Therefore, it is determined that the target person 21 is being watched over by another person (step S74), and the notification of step S45 is omitted.
  • As described above, in the first modification it is determined that the target person is being watched over by another person only when both the watched person (first person) present on the bed and a watcher (second person) facing the watched person exist in the watching determination area.
  • The watched person present on the bed is likely to be the target person, and a watcher facing the watched person is likely to be someone watching over the target person; conversely, even if another person is beside the target person, that person can hardly be said to be watching over the target person without facing him or her. This configuration can therefore determine with high accuracy whether the target person is being watched over by another person.
  • In the first modification, a person is determined to be a watcher when the angle between that person's face-direction vector and the vector directed toward the watched person is equal to or less than the threshold, because such a person is likely to be facing the watched person. With this configuration, watchers can be detected with high accuracy.
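  • The facing test of the first modification reduces to simple vector arithmetic: the angle θ between the face-direction vector v1 and the vector v2 toward the watched person's head is θ = arccos(v1·v2 / (|v1||v2|)), compared against the threshold (45 degrees in the example above). A minimal sketch follows, assuming 2D image-plane vectors; the text does not fix the dimensionality, and the function name is illustrative.

      import numpy as np

      def is_facing(face_dir, face_pos, watched_pos, threshold_deg=45.0):
          # v1: face-direction vector (e.g. vector 84 or 89).
          # v2: vector from the face toward the watched person's head
          #     (e.g. vector 85 or 90).
          v1 = np.asarray(face_dir, dtype=float)
          v2 = np.asarray(watched_pos, dtype=float) - np.asarray(face_pos, dtype=float)
          cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          theta = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))  # angle < 180 degrees
          return theta <= threshold_deg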
  • FIG. 9 is a flowchart of the watching determination process (second modification) performed by the watching determination unit 112.
  • In step S90, the watching determination unit 112 determines whether or not two or more heads are detected in the watching determination area A0. If two or more heads are detected there, the process proceeds to step S91; if only one head is detected there, the process proceeds to step S94.
  • In step S91, using the photographed image acquired by the image acquisition unit 110, the watching determination unit 112 detects, in the watching determination area A0, a person performing a watched action and a person performing a watching action. Watched actions are, for example, getting up, going to bed, eating, and changing clothes; watching actions are, for example, assisting with getting up or going to bed, assisting with meals, and assisting with changing clothes. The persons performing these actions are detected, for example, by a method using a classifier based on HoG features or a method using Faster R-CNN.
  • In step S92, the watching determination unit 112 determines whether both a person performing a watched action and a person performing a watching action were detected in step S91. If both were detected, the process proceeds to step S93; otherwise, it proceeds to step S94.
  • In step S93, the watching determination unit 112 determines that the target person 21 is being watched over by another person (determination of watching); in step S94, it determines that the target person 21 is not being watched over (determination of no watching).
  • As described above, in the second modification it is determined that the target person is being watched over by another person only when a person performing a watched action and a person performing a watching action both exist in the watching determination area.
  • A person performing a watched action is likely to be a target person being watched over, and a person performing a watching action is likely to be another person watching over the target person; conversely, even if another person is near the target person, that person can hardly be said to be watching over the target person without performing a watching action. This configuration can therefore determine with high accuracy whether the target person is being watched over by another person.
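  • A minimal sketch of the step S91/S92 logic, assuming a hypothetical action classifier classify_action(frame, region) (for example a Faster R-CNN-style model as suggested above); the label strings are illustrative encodings of the actions listed in the text.

      WATCHED_ACTIONS = {"getting_up", "going_to_bed", "eating", "changing_clothes"}
      WATCHING_ACTIONS = {"assist_getting_up", "assist_meal", "assist_changing"}

      def watched_by_actions(frame, person_regions, classify_action):
          # Label each detected person region, then check that both a
          # watched action and a watching action are present (step S92).
          labels = [classify_action(frame, r) for r in person_regions]
          has_watched = any(a in WATCHED_ACTIONS for a in labels)
          has_watcher = any(a in WATCHING_ACTIONS for a in labels)
          return has_watched and has_watcher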
  • 1: Watching support system, 10: Imaging device, 11: Information processing device, 110: Image acquisition unit, 111: Detection unit, 112: Watching determination unit, 113: Action determination unit, 114: Output unit, 115: Area setting unit, 116: Storage unit, 100: Infrared LED illumination, 101: Near-infrared camera, 20: Bed, 21: Target person, 22: Head, 30: Bed area, A0: Watching determination area, A1: Sleeping area (action determination area), A2: Rising area (action determination area), A3: Bed-leaving area (action determination area), 60: Captured image, 61: Head, 62: Captured image, 63: Head, 80: Captured image, 81: Head, 82: Head, 83: Face, 84: Vector, 85: Vector, 86: Captured image, 87: Head, 88: Face, 89: Vector, 90: Vector, θ1: Angle, θ2: Angle

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)
  • Image Analysis (AREA)

Abstract

The present invention concerns a watching support system that assists in watching over a target person on a bed, comprising: an image acquisition unit that acquires an image captured by an imaging device; a watching determination unit that determines whether or not the target is being watched over by another person, based on an image of a determination region that is a sub-region of the captured image and includes the bed; a state determination unit that determines the state of the target based on the captured image; and an output unit that issues a notification according to the determination result of the state determination unit. The output unit omits the aforementioned notification if the watching determination unit has established that the target is being watched over by another person.
PCT/JP2018/025596 2017-07-14 2018-07-05 Watching support system and control method thereof WO2019013105A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-138539 2017-07-14
JP2017138539A JP6870514B2 (ja) 2017-07-14 2021-05-12 Watching support system and control method thereof

Publications (1)

Publication Number Publication Date
WO2019013105A1 (fr)

Family

ID=65002445

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/025596 WO2019013105A1 (fr) 2017-07-14 2018-07-05 Watching support system and control method thereof

Country Status (2)

Country Link
JP (1) JP6870514B2 (fr)
WO (1) WO2019013105A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112766185A (zh) * 2021-01-22 2021-05-07 燕山大学 Head posture monitoring method, device and system based on deep learning

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022215158A1 (fr) * 2021-04-06 2022-10-13 三菱電機株式会社 Notification control device and notification control method


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015220596A (ja) * 2014-05-16 2015-12-07 株式会社ニコン Electronic device
JP2016177680A (ja) * 2015-03-20 2016-10-06 株式会社リコー Information processing system, information processing terminal, information processing method, and program
WO2016199749A1 (fr) * 2015-06-10 2016-12-15 コニカミノルタ株式会社 Image processing system, image processing device, image processing method, and image processing program
JP6503262B2 (ja) * 2015-08-19 2019-04-17 アイホン株式会社 Motion recognition device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013149156A (ja) * 2012-01-20 2013-08-01 Fujitsu Ltd State detection device and state detection method
JP2015139550A (ja) * 2014-01-29 2015-08-03 シャープ株式会社 Bed-leaving determination device and bed-leaving determination method
WO2015133195A1 (fr) * 2014-03-06 2015-09-11 Nkワークス株式会社 Information processing device, information processing method, and program
WO2016071314A1 (fr) * 2014-11-03 2016-05-12 Koninklijke Philips N.V. Device, system and method for automated detection of the orientation and/or location of a person

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112766185A (zh) * 2021-01-22 2021-05-07 燕山大学 Head posture monitoring method, device and system based on deep learning
CN112766185B (zh) 2021-01-22 2022-06-14 燕山大学 Head posture monitoring method, device and system based on deep learning

Also Published As

Publication number Publication date
JP6870514B2 (ja) 2021-05-12
JP2019021002A (ja) 2019-02-07

Similar Documents

Publication Publication Date Title
JP6717235B2 (ja) Watching support system and control method thereof
US9600993B2 (en) Method and system for behavior detection
JP6137425B2 (ja) Image processing system, image processing apparatus, image processing method, and image processing program
JP6167563B2 (ja) Information processing apparatus, information processing method, and program
WO2015133195A1 (fr) Information processing device, information processing method, and program
US9295390B2 (en) Facial recognition based monitoring systems and methods
JP6822328B2 (ja) Watching support system and control method thereof
WO2015118953A1 (fr) Information processing device, information processing method, and program
TW201943260A (zh) Object monitoring method and computing device thereof
JP6119938B2 (ja) Image processing system, image processing apparatus, image processing method, and image processing program
WO2019013105A1 (fr) Watching support system and control method thereof
JP6822326B2 (ja) Watching support system and control method thereof
US10762761B2 (en) Monitoring assistance system, control method thereof, and program
JPWO2018047795A1 (ja) Watching system, watching device, watching method, and watching program
JP2018533240A (ja) Occupancy detection
WO2019009377A1 (fr) Watching support system and control method thereof
JP2017041079A (ja) Motion recognition device
JP2023548886A (ja) Apparatus and method for controlling a camera
TWI697869B (zh) Posture determination method, electronic system, and non-transitory computer-readable recording medium
JP6729512B2 (ja) Watching support system and control method thereof
US20220054046A1 (en) Assessing patient out-of-bed and out-of-chair activities using embedded infrared thermal cameras
JP6635074B2 (ja) Watching support system and control method thereof
JP2023051147A (ja) Nurse call system and state determination system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18831027

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18831027

Country of ref document: EP

Kind code of ref document: A1