WO2022208147A1 - System for controlling a stream of images relative to a monitorable subject - Google Patents

System for controlling a stream of images relative to a monitorable subject Download PDF

Info

Publication number
WO2022208147A1
WO2022208147A1 (PCT application PCT/IB2021/055368)
Authority
WO
WIPO (PCT)
Prior art keywords
subject
stream
closed line
images
contour
Prior art date
Application number
PCT/IB2021/055368
Other languages
French (fr)
Inventor
Enzo FELICI
Pierluigi MORELLI
Original Assignee
Future Care S.R.L.
Priority date
Filing date
Publication date
Application filed by Future Care S.R.L. filed Critical Future Care S.R.L.
Publication of WO2022208147A1 publication Critical patent/WO2022208147A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0446Sensor means for detecting worn on the body to detect changes of posture, e.g. a fall, inclination, acceleration, gait
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0469Presence detectors to detect unsafe condition, e.g. infrared sensor, microphone
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438Sensor means for detecting
    • G08B21/0476Cameras to detect unsafe condition, e.g. video cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Emergency Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Emergency Alarm Devices (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Burglar Alarm Systems (AREA)

Abstract

The invention describes a system for controlling a stream of images of a subject (S) to be monitored, comprising: at least one camera (10) for picking up and processing a stream of images (FI) of a subject (S) inside a room (A), comprising an objective (11) and a motion sensor (12); an electronic processing unit (14) configured to receive a signal of a detected movement, command the objective (11) to capture the stream of images (FI) on the basis of the signal of detected movement (S1); process the stream of images (FI) by carrying out the steps of identifying the subject (S), generating a closed line (L) for containing the subject (S), comparing the contour of the closed line (L) with a contour of a closed line of reference (L1_ref); if the contour of the closed line (L) matches the contour of the closed line of reference (L1_ref), then generating a warning signal (SPAL) representative of a potential situation of danger for the subject (S).

Description

SYSTEM FOR CONTROLLING A STREAM OF IMAGES RELATIVE TO A MONITORABLE SUBJECT
FIELD OF APPLICATION
The present invention relates to a system for controlling the stream of images relative to a subject to be monitored.
In particular, the present invention relates to a system for controlling the stream of images relative to a subject inside a closed room.
PRIOR ART
Person detection systems that detect the exoskeleton of a person in order to determine the person's position are known.
The disadvantage of these systems is their high computational complexity.
An object of the present invention is to overcome the drawbacks of the prior art.
A particular object of the present invention is to carry out a detection of a subject with an uncomplicated processing.
A further object of the present invention is to ensure a simple use of a subject detection system.
SUMMARY OF THE INVENTION
In a first aspect, the invention describes a system for controlling a stream of images of a subject to be monitored, according to what is described in claim 1.
Advantageous aspects are described in dependent claims 2 to 14.
In a second aspect, the invention describes a method for controlling a stream of images relative to a subject to be monitored, according to what is described in claim 15.
The invention allows a subject to be monitored remotely, without manual intervention and with high efficiency.
The invention achieves the following technical effects:
- activation of the detection independent of human intervention;
- non-complex construction;
- low computational complexity;
- simple operation and use.
The technical effects/advantages mentioned, and other technical effects/advantages of the invention, will emerge in further detail from the description, provided herein below, of an example embodiment given by way of indicative and non-limiting example with reference to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a schematic view of a portion of the control system of a stream of images relative to a subject, according to the invention.
Figure 1A is a detailed view of an element of figure 1.
Figure 2A is a schematic view of a first embodiment of a control system of a stream of images relative to a subject, according to the invention.
Figure 2B is a schematic view of a second embodiment of a control system of a stream of images relative to a subject, according to the invention.
Figures 3A, 3B and 3C represent possible graphic outputs generated by the camera of the previous figures, according to the invention.
DETAILED DESCRIPTION
In a first aspect, the invention describes a system for controlling a stream of images of a subject to be monitored.
The system comprises a camera 10 for picking up and processing a stream of images Fl of a subject S inside a room A, as shown in figure 1.
The camera 10 comprises an objective 11 configured to capture a stream of images Fl of the room A.
In particular, the stream of images Fl detected will contain the images of the subject S moving inside the room A.
The camera 10 further comprises a motion sensor 12 configured to detect a movement of the subject S inside the room A. Preferably, the camera 10 further comprises an infrared sensor 13 for detecting a presence of a subject S also in night-time hours, and configured to detect the movement of the subject S.
The system of the invention comprises an electronic processing unit 14 configured to process the detected stream of images Fl.
Preferably, the electronic processing unit 14 is a Raspberry Pi single-board computer.
In a first embodiment of the system according to the invention, as shown in figure 2A, the electronic processing unit 14, configured to process the detected stream of images Fl, is connected to the camera 10 but located outside it.
In a second embodiment of the system according to the invention, as shown in figure 2B, the electronic processing unit 14, configured to process the detected stream of images Fl, is integrated into the camera 10.
For all the embodiments, in the course of the present description and in the following claims, the electronic processing unit 14 is logically divided into distinct functional modules (memory modules or operating modules) which perform the described functions.
Such an electronic processing unit 14 can comprise a single electronic device, appropriately programmed to perform the functionalities described, and the different modules can correspond to hardware entities and/or software routines which are part of the programmed device.
Alternatively, or in addition, these functions can be performed by a plurality of electronic devices over which the aforesaid functional modules can be distributed.
In general, the electronic processing unit 14 can also make use of one or more processors for executing the instructions contained in the memory modules; the aforesaid functional modules may likewise be distributed over different local or remote computers, depending on the architecture of the network in which they reside.
With particular reference to figure 1A, in all the embodiments the electronic processing unit 14, in particular a receiving module 141, is configured to receive a signal of a detected movement S1 from the motion sensor 12.
The electronic processing unit 14, in particular a control module 142, is further configured to S2 command the objective 11 to capture the stream of images Fl after having framed the subject S on the basis of the signal of the detected movement S1.
The electronic processing unit 14, in particular a receiving module 143, is configured to S3 receive the captured stream of images Fl.
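As a purely illustrative sketch of this S1–S3 chain (the patent does not prescribe any particular hardware interface), on a Raspberry Pi-class unit the motion-triggered capture could look as follows; the PIR sensor on GPIO pin 4, the use of gpiozero and OpenCV, and the camera index 0 are all assumptions made for the example.

```python
import cv2                          # assumed: OpenCV drives the objective 11
from gpiozero import MotionSensor   # assumed: a PIR motion sensor 12 on a GPIO pin

pir = MotionSensor(4)               # hypothetical wiring: PIR on GPIO 4
capture = None                      # the objective stays idle until movement is detected


def on_motion():
    """S1/S2: on a detected-movement signal, command the objective to start capturing."""
    global capture
    if capture is None:
        capture = cv2.VideoCapture(0)   # hypothetical camera index

pir.when_motion = on_motion         # plays the role of receiving module 141 / control module 142


def get_frame():
    """S3: receive one image of the captured stream Fl (None while not capturing)."""
    if capture is None:
        return None
    ok, frame = capture.read()
    return frame if ok else None
```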
The electronic processing unit 14, in particular a first processing module 144, is configured to S4 process the stream of images Fl by carrying out the step of S41 identifying the subject S, generating a closed line L for containing the subject S overlaid on the stream of images Fl.
In a preferred embodiment of the invention, the closed line L comprises an ellipse which encloses the subject S framed by the objective 11.
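By way of a non-authoritative sketch of step S41 (the patent does not prescribe a specific segmentation algorithm), the ellipse can be obtained by fitting the largest moving blob of a background-subtraction mask; the background-subtraction method, area threshold and filter size below are assumptions of this example.

```python
import cv2

back_sub = cv2.createBackgroundSubtractorMOG2(detectShadows=False)  # assumed segmentation method


def fit_subject_ellipse(frame):
    """S41: return the closed line L as an ellipse ((cx, cy), (axis1, axis2), angle), or None."""
    if frame is None:
        return None
    mask = back_sub.apply(frame)                          # foreground = moving subject S
    mask = cv2.medianBlur(mask, 5)                        # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 2000 and len(c) >= 5]
    if not blobs:
        return None
    subject = max(blobs, key=cv2.contourArea)             # assume the subject is the largest blob
    return cv2.fitEllipse(subject)                        # the closed line L overlaid on Fl
```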
The electronic processing unit 14, in particular a first comparison module 145, is configured to S42 compare the contour of the closed line L with a contour of a closed line of reference L1_ref representative of a "falling position" of the subject S in the room A.
Figure 3B shows a view of the room A captured by the camera in which the closed line L represents a positive detection of the subject's fall; in fact, the ellipse is arranged parallel to the floor of the room A and at the same height as the floor.
The electronic processing unit 14, in particular a second processing module 146, is configured to S43 generate a warning signal SPAL representative of a potential situation of danger for the subject S, if the contour of the closed line L matches the contour of the closed line of reference L1_ref.
According to the invention, the contour of said closed line L matches the contour of a closed line of reference L1_ref when the closed line L has the same contour as the closed line of reference L1_ref and the same inclination with respect to a common reference, for example the base/floor of the room A.
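One possible reading of this matching criterion, again only as a sketch: compare the ellipse's axis ratio (its "contour") and its inclination against the reference, with the image rows taken as a stand-in for the floor line. The reference values L1_REF and the tolerances below are illustrative assumptions, not values taken from the patent.

```python
# Illustrative reference L1_ref for the "falling position": an elongated ellipse lying
# roughly parallel to the floor line. The numeric values are assumptions for this example.
L1_REF = {"axis_ratio": 0.35, "angle_deg": 90.0}


def matches_reference(ellipse, ref=L1_REF, ratio_tol=0.15, angle_tol_deg=15.0):
    """S42/S43: True when the closed line L has (roughly) the same contour and the
    same inclination as the reference closed line L1_ref."""
    if ellipse is None:
        return False
    (_, _), (a, b), angle = ellipse
    if max(a, b) == 0:
        return False
    same_shape = abs(min(a, b) / max(a, b) - ref["axis_ratio"]) <= ratio_tol
    diff = abs(angle - ref["angle_deg"]) % 180.0          # angular distance, wrap-safe
    same_tilt = min(diff, 180.0 - diff) <= angle_tol_deg
    return same_shape and same_tilt
```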
The second processing module 146 is further configured to perform a step of repeating the step S42 of comparing the contour of the closed line with the closed line of reference L1_ref if S44 the contour of the closed line L does not match the contour of the closed line of reference L1_ref.
The second processing module 146 is also configured to perform, at a predefined time interval ΔT, the steps of S41 identifying the subject S, generating a closed line L for containing the subject S overlaid on the stream of images Fl, and of S42 comparing the contour of the closed line L with a contour of a closed line of reference L1_ref representative of the "falling position" of the subject S in the room A.
In a preferred embodiment of the invention, the predefined time interval ΔT is between 1 and 4 seconds, preferably 2 seconds.
In other words, at a predefined time interval, the electronic processing unit 14 checks that the subject is in a correct position, not representative of a falling position.
Figure 3A shows a view of the room A captured by the camera in which the closed line L represents a negative detection of the subject's fall; in fact, the ellipse is arranged not parallel to the floor of the room A.
According to the invention, only upon repeated detection of a representative falling position over several subsequent time intervals ΔT is the falling position considered real and an actual alarm signal SAL generated.
In other words, the second processing module 146 is further configured to:
S44 increase an alarm value ContAL every time the warning signal SPAL, representative of a potential situation of danger for the subject S, is generated;
S45 compare the alarm value ContAL with a pre-established threshold alarm value Cont_TH; S46 generate an actual alarm signal SAL, representative of a real situation of danger for the subject S, when the alarm value ContAL exceeds the pre-established threshold alarm value Cont_TH.
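Purely as an illustrative sketch, reusing the helper functions sketched above, the ΔT loop and the ContAL counter could be combined as follows; ΔT = 2 s follows the preferred value, while Cont_TH = 3 and the reset of ContAL on a non-match are assumptions of this example, not taken from the patent.

```python
import time

DELTA_T = 2.0    # predefined time interval ΔT (preferably 2 s)
CONT_TH = 3      # pre-established threshold alarm value Cont_TH — an assumed value


def monitoring_loop():
    """S41–S46: repeated warnings S_PAL over successive intervals ΔT raise the actual alarm S_AL."""
    cont_al = 0                                         # alarm value ContAL
    while True:
        ellipse = fit_subject_ellipse(get_frame())      # S41 (sketched above)
        if matches_reference(ellipse):                  # S42
            cont_al += 1                                # S43/S44: warning S_PAL, counter increased
            if cont_al > CONT_TH:                       # S45
                return "S_AL"                           # S46: actual alarm, real danger
        else:
            cont_al = 0                                 # assumption: a non-match resets the counter
        time.sleep(DELTA_T)                             # repeat at the predefined interval ΔT
```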
According to the invention, it is also important to establish which detections of the closed line L may represent false positives not representative of a falling position of the subject S.
Figure 3C shows a view of the room A captured by the camera in which the closed line L represents a false positive detection of the subject's fall; in fact, the ellipse is arranged parallel to the floor of the room A but not at the same height as the floor; in particular, such a position can be representative of the subject lying in bed.
The electronic processing unit 14, in particular a third processing module 147, is further configured to: prior to step S4 of processing the stream of images Fl, carry out a step S40A of identifying a fixed element 20, 30, 40 in the stream of images Fl, for example detecting a chair 20 or a sofa 30 or a table 40, and to carry out a step S40B of marking the fixed element 20, 30, 40 as a potential generator of a false positive.
The third processing module 147 is further configured to, during the step S4 of processing the stream of images Fl and if the contour of the closed line L matches S43 the contour of a closed line of reference L1_ref, calculate a safety distance ΔD between the contour of the closed line L and the fixed element 20, 30, 40 identified.
If the safety distance ΔD is greater than a threshold safety distance ΔD_TH, then the third processing module 147 generates the warning signal SPAL.
In other words, when detecting a warning signal SPAL, the invention verifies that it is not a false positive - figure 3C - but a real positive - figure 3B - before confirming the warning signal SPAL.
If, during the step S4 of processing the stream of images Fl, the safety distance ΔD is less than the threshold safety distance ΔD_TH, the electronic processing unit 14 commands a return to the step S41 of identifying the subject S.
In other words, if the detection is a false positive, the electronic processing unit 14 continues to control the subject S, as if everything were regular.
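One way to realise the ΔD check, sketched under the assumption that the fixed elements marked in steps S40A/S40B are available as image-plane bounding boxes and that the distance is taken from the ellipse centre (the patent speaks of the contour of the closed line); the pixel threshold is likewise an assumed value.

```python
import math

DELTA_D_TH = 50.0   # threshold safety distance ΔD_TH in pixels — an assumed value


def is_real_positive(ellipse, fixed_elements):
    """ΔD check: a matching ellipse too close to a marked fixed element
    (chair 20, sofa 30, table 40, bed) is treated as a false positive."""
    (cx, cy), _, _ = ellipse
    for (x, y, w, h) in fixed_elements:                 # bounding boxes marked in S40A/S40B
        # distance from the ellipse centre to the box (0 if the centre lies inside it)
        dx = max(x - cx, 0.0, cx - (x + w))
        dy = max(y - cy, 0.0, cy - (y + h))
        if math.hypot(dx, dy) <= DELTA_D_TH:
            return False                                # false positive: go back to S41
    return True                                         # ΔD > ΔD_TH for every element: confirm S_PAL
```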
According to the invention, the electronic processing unit 14 is further configured to deactivate the camera 10 if the motion sensor 12 and/or the infrared sensor 13 detect a movement of a plurality of subjects S inside the room A.
In other words, the camera is deactivated as soon as there are multiple subjects in the room and therefore there is no need to monitor the subject S. The technical effect achieved is energy and bandwidth savings.
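A minimal sketch of this deactivation rule, reusing the back_sub subtractor from the S41 sketch above; the person-sized area threshold is an arbitrary assumption.

```python
import cv2


def count_subjects(mask, min_area=2000):
    """Count person-sized moving blobs in the foreground mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return sum(1 for c in contours if cv2.contourArea(c) > min_area)


def maybe_deactivate(frame, capture):
    """Release the camera as soon as more than one subject is present in room A."""
    if frame is not None and count_subjects(back_sub.apply(frame)) > 1:
        capture.release()      # saves energy and bandwidth while other people are present
        return True
    return False
```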
As already mentioned, with particular reference to figure 2A, a first embodiment of the control system of a stream of images of a subject S is shown.
In addition to the camera 10, the system, as previously described, comprises a processing device 50 arranged remotely from the camera 10 and in data connection with the electronic processing unit 14.
According to the invention, the electronic processing unit 14 is configured to send the actual alarm signal SAL, representative of a real situation of danger for the subject S, to the processing device 50.
With particular reference to figure 2B, a second embodiment of the control system of a stream of images of a subject S is shown in which the electronic processing unit 14 is integrated into the camera 10.
In both embodiments of the control system of a stream of images of a subject S, the processing device 50 comprises a smartphone, a tablet PC, a PDA or the like.
Preferably, the processing device 50 is a device provided to the caregiver of the subject S.
In both embodiments, the electronic processing unit 14 is configured to send the actual alarm signal SAL to the processing device 50, and transmit to the processing device 50 in real time also the stream of images Fl of the subject S inside the room A.
According to the invention, the stream of images Fl is sent to the caregiver so that the latter can ascertain, only when the actual alarm signal SAL arrives, whether it is a false positive or a real fall.
The technical effect achieved is further energy and bandwidth savings, as the images are sent only if an actual alarm signal is detected.
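The patent does not specify a transport; purely as an assumed sketch, the unit 14 could push the actual alarm SAL first and, only then, JPEG-encoded frames over HTTP to the caregiver's device 50 (the endpoint URL below is a placeholder, not part of the invention).

```python
import cv2
import requests

DEVICE_50_URL = "https://example.invalid/caregiver"   # placeholder endpoint — an assumption


def send_actual_alarm_and_stream(frames):
    """Send S_AL first, then the stream Fl, so images travel only after a real alarm."""
    requests.post(f"{DEVICE_50_URL}/alarm", json={"signal": "S_AL"}, timeout=5)
    for frame in frames:
        ok, jpeg = cv2.imencode(".jpg", frame)          # compress each image of the stream Fl
        if ok:
            requests.post(f"{DEVICE_50_URL}/stream", data=jpeg.tobytes(), timeout=5)
```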
In a second aspect, the invention describes a computer-implemented method for controlling a stream of images of a subject S to be monitored, comprising the steps of: providing at least one camera 10 for picking up and processing a stream of images Fl of a subject S inside a room A, wherein the camera 10 comprises: an objective 11 configured to capture a stream of images Fl of the room A; a motion sensor 12 configured to detect a movement of the subject S inside the room A; wherein the method further comprises the steps of: receiving a signal of a detected movement S1 from the motion sensor 12; commanding S2 said objective 11 to capture the stream of images Fl after having framed the subject S on the basis of the signal of the detected movement S1;
S3 receiving said captured stream of images Fl;
S4 processing said stream of images Fl by carrying out the steps of:
S41 identifying the subject S, generating a closed line L for containing the subject S overlaid on the stream of images Fl,
S42 comparing the contour of the closed line L with a contour of a closed line of reference L1_ref representative of a "falling position" of the subject S in the room A;
S43 if the contour of the closed line L matches the contour of a closed line of reference L1_ref, then generating a warning signal SPAL representative of a potential situation of danger for the subject S.
A system and a method for controlling a stream of images of a subject inside a room have been described.
The invention allows to remotely detect a monitorable subject without manual intervention and with high efficiency.
The invention achieves the following technical effects:
- activation of the detection independent of human intervention;
- non-complex construction;
- low computational complexity;
- simple operation and use.

Claims

1. A system for controlling a stream of images of a subject (S) to be monitored, comprising: at least one camera (10) for picking up and processing a stream of images (Fl) of a subject (S) inside a room (A), wherein said camera (10) comprises: an objective (11) configured to capture a stream of images (Fl) of said room (A); a motion sensor (12) configured to detect a movement of said subject (S) inside said room (A); wherein said system further comprises: an electronic processing unit (14) configured to: receive a signal of a detected movement (S1) from said motion sensor (12); command (S2) said objective (11) to capture said stream of images (Fl) after having framed said subject (S) on the basis of the signal of the detected movement (S1);
(S3) receive said captured stream of images (Fl);
(S4) process said stream of images (Fl) by carrying out the steps of:
(S41) identifying said subject (S), generating a closed line (L) for containing said subject (S) overlaid on said stream of images (Fl),
(S42) comparing the contour of said closed line (L) with a contour of a closed line of reference (L1_ref) representative of a “falling position” of said subject (S) in said room (A);
(S43) if the contour of said closed line (L) matches said contour of a closed line of reference (L1_ref), generating a warning signal (SPAL) representative of a potential situation of danger for said subject (S).
2. The system according to claim 1, wherein said electronic processing unit (14) is configured to perform a step of:
(S44) if the contour of said closed line (L) does not match said contour of a closed line of reference (L1_ref), repeating the step (S42) of comparing the contour of said closed line with said closed line of reference (L1_ref).
3. The system according to claim 1 or 2, wherein said electronic processing unit (14) is configured to perform, at a predefined time interval (ΔT), said steps:
(S41) identifying said subject (S), generating a closed line (L) containing said subject (S) overlaid on said stream of images (Fl),
(S42) comparing the contour of said closed line (L) with a contour of a closed line of reference (L1_ref) representative of a “falling position” of said subject (S) in said room (A).
4. The system according to claim 3, wherein said electronic processing unit (14) is configured to:
(S44) increase an alarm value (ContAL) every time said warning signal (SPAL) is generated;
(S45) compare said alarm value (ContAL) with a pre-established threshold alarm value (Cont_TH);
(S46) generate an actual alarm signal (SAL) representative of a real situation of danger for said subject (S) when said alarm value (ContAL) exceeds said pre-established threshold alarm value (Cont_TH).
5. The system according to any one of the preceding claims, wherein said closed line (L) comprises an ellipse that encloses said subject (S) framed by said objective (11).
6. The system according to any one of the preceding claims, wherein said contour of said closed line (L) matches said contour of a closed line of reference (L1_ref) when said closed line (L) has the same shape as said closed line of reference (L1_ref) and the same inclination relative to a common reference.
7. The system according to any one of the preceding claims, wherein said electronic processing unit (14) is further configured to: prior to said step (S4) of processing said stream of images (Fl), (S40A) identify a fixed element (20, 30, 40) in said stream of images (Fl);
(S40B) mark said fixed element (20, 30, 40) as a potential generator of a false positive; during said step (S4) of processing said stream of images (Fl), if said contour of said closed line (L) matches (S43) said contour of a closed line of reference (L1_ref), calculate a safety distance (ΔD) between said contour of said closed line (L) matching said contour of the closed line of reference (L1_ref) and said fixed element (20, 30, 40) identified; if said safety distance (ΔD) is greater than a threshold safety distance (ΔD_TH), generate said warning signal (SPAL).
8. The system according to claim 7, wherein said electronic processing unit (14) is further configured, during said step (S4) of processing said stream of images (Fl), to go back to the step (S41) of identifying said subject (S) if said safety distance (ΔD) is less than the threshold safety distance (ΔD_TH).
9. The system according to any one of the preceding claims, wherein said electronic processing unit (14) is further configured to deactivate said camera (10) if said motion sensor (12) detects a movement of a plurality of subjects (S) inside said room (A).
10. The system according to any one of the preceding claims, comprising an infrared sensor (13) for detecting a presence of a subject (S) also in night-time hours, configured to detect the movement of said subject (S).
11. The system according to any one of the preceding claims, wherein said electronic processing unit (14) is integrated into said camera (10).
12. The system according to any one of the preceding claims, further comprising: - a processing device (50) located remotely from said camera (10) and in data connection with said electronic processing unit (14), wherein said electronic processing unit (14) is configured to send said actual alarm signal (SAL) representative of a real situation of danger for said subject (S) to said processing device (50).
13. The control system according to claim 12, wherein said electronic processing unit (14) is configured to send said actual alarm signal (SAL) to said processing device (50), and to transmit said image stream (Fl) of said subject (S) inside said room (A) to said processing device (50) in real time.
14. The control system according to claim 12 or 13, wherein said processing device (50) comprises a smartphone, a tablet PC, a PDA or the like.
15. A computer-implemented method for controlling a stream of images of a subject (S) to be monitored, comprising the steps of: providing at least one camera (10) for picking up and processing a stream of images (Fl) of a subject (S) inside a room (A), wherein said camera (10) comprises: an objective (11) configured to capture a stream of images (Fl) of said room (A); a motion sensor (12) configured to detect a movement of said subject (S) inside said room (A); wherein said method further comprises the steps of: receiving a signal of a detected movement (S1) from said motion sensor (12); commanding (S2) said objective (11) to capture said stream of images (Fl) after having framed said subject (S) on the basis of the signal of the detected movement (S1);
(S3) receiving said captured stream of images (Fl);
(S4) processing said stream of images (Fl) by carrying out the steps of:
(S41) identifying said subject (S), generating a closed line (L) for containing said subject (S) overlaid on said stream of images (Fl),
(S42) comparing the contour of said closed line (L) with a contour of a closed line of reference (L1_ref) representative of a “falling position” of said subject (S) in said room (A);
(S43) if the contour of said closed line (L) matches said contour of a closed line of reference (L1_ref), generating a warning signal (SPAL) representative of a potential situation of danger for said subject (S).
PCT/IB2021/055368 2021-03-30 2021-06-17 System for controlling a stream of images relative to a monitorable subject WO2022208147A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT102021000007784A IT202100007784A1 (en) 2021-03-30 2021-03-30 CONTROL SYSTEM OF A FLOW OF IMAGES RELATING TO A MONITORING SUBJECT
IT102021000007784 2021-03-30

Publications (1)

Publication Number Publication Date
WO2022208147A1

Family

ID=76375552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/055368 WO2022208147A1 (en) 2021-03-30 2021-06-17 System for controlling a stream of images relative to a monitorable subject

Country Status (2)

Country Link
IT (1) IT202100007784A1 (en)
WO (1) WO2022208147A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120116252A1 (en) * 2010-10-13 2012-05-10 The Regents Of The University Of Colorado, A Body Corporate Systems and methods for detecting body orientation or posture
US20190108740A1 (en) * 2017-10-06 2019-04-11 Tellus You Care, Inc. Non-contact activity sensing network for elderly care
WO2019231861A1 (en) * 2018-05-28 2019-12-05 Greenwave Systems PTE Ltd. Area monitoring and communication

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
EZATZADEH SHABNAM ET AL: "ViFa: an analytical framework for vision-based fall detection in a surveillance environment", MULTIMEDIA TOOLS AND APPLICATIONS, KLUWER ACADEMIC PUBLISHERS, BOSTON, US, vol. 78, no. 18, 28 May 2019 (2019-05-28), pages 25515 - 25537, XP036873581, ISSN: 1380-7501, [retrieved on 20190528], DOI: 10.1007/S11042-019-7720-3 *
VIET ANH NGUYEN ET AL: "Single camera based fall detection using motion and human shape features", INFORMATION AND COMMUNICATION TECHNOLOGY, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 8 December 2016 (2016-12-08), pages 339 - 344, XP058308186, ISBN: 978-1-4503-4815-7, DOI: 10.1145/3011077.3011103 *

Also Published As

Publication number Publication date
IT202100007784A1 (en) 2022-09-30

Similar Documents

Publication Publication Date Title
Nasution et al. Intelligent video surveillance for monitoring elderly in home environments
Bevilacqua et al. Fall detection in indoor environment with kinect sensor
US10853698B2 (en) System and method of using multi-frame image features for object detection
US20200394384A1 (en) Real-time Aerial Suspicious Analysis (ASANA) System and Method for Identification of Suspicious individuals in public areas
JP3924171B2 (en) Monitoring device for identifying monitoring objects
KR20190046351A (en) Method and Apparatus for Detecting Intruder
Humenberger et al. Embedded fall detection with a neural network and bio-inspired stereo vision
WO2015125701A1 (en) Monitoring system
WO2016194402A1 (en) Image analysis device, image analysis method, and image analysis program
JP2012212236A (en) Left person detection device
US11594035B2 (en) Monitoring device, and method for monitoring a man overboard situation
JP5370009B2 (en) Monitoring system
KR102580434B1 (en) Dangerous situation detection device and dangerous situation detection method
WO2022208147A1 (en) System for controlling a stream of images relative to a monitorable subject
CN111191499B (en) Fall detection method and device based on minimum center line
JP5701657B2 (en) Anomaly detection device
JP2008224396A (en) Moving body detection device and moving body detection system
CN111144260A (en) Detection method, device and system of crossing gate
JP2014021619A (en) Patient recognition device
Miaou et al. A smart vision-based human fall detection system for telehealth applications
JP2011227614A (en) Image monitoring device
KR102134771B1 (en) Device and method for determining an emergency situation through object detection
JP6046559B2 (en) Specific motion detection device
WO2017029841A1 (en) Image analyzing device, image analyzing method, and image analyzing program
JP6124739B2 (en) Image sensor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21740199

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21740199

Country of ref document: EP

Kind code of ref document: A1