WO2022199986A1 - Device and method for detecting a movement or stopping of a movement of a person or of an object in a room, or an event relating to said person - Google Patents
Device and method for detecting a movement or stopping of a movement of a person or of an object in a room, or an event relating to said person
- Publication number: WO2022199986A1
- Application number: PCT/EP2022/054862
- Authority: WO (WIPO, PCT)
- Prior art keywords: volume, person, processing unit, distance, movement
- Prior art date: 2021-03-26
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, characterised by features of the telemetry system
- A61B5/002—Monitoring the patient using a local or closed circuit, e.g. in a room or building
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1115—Monitoring leaving of a patient support, e.g. a bed or a wheelchair
- A61B5/1116—Determining posture transitions
- A61B5/1117—Fall detection
Definitions
- the invention relates to the field of devices for monitoring people and/or objects, in particular to devices and methods making it possible to determine events concerning a person.
- the detection of certain events occurring to a person placed under surveillance is a major concern, particularly in hospitals and retirement homes.
- This detection is important, for example, when the person is subject to specific conditions or circumstances, such as a disease (in particular dementia, Alzheimer's, etc.), a post-operative situation, or a history of falls.
- For example, it is important to detect that a person has voluntarily left their bed, has fallen out of it, or has not returned to it within a reasonable time.
- This monitoring is particularly sought after in care units, such as hospitals, care clinics or retirement homes, in order to prevent any deterioration in the condition of patients, although it can also be carried out in other settings.
- The pace of work within care units prevents qualified personnel from watching over all bedridden persons in order to detect these events.
- the device disclosed in document EP3687379A1 comprises a detector having a detection field capable of covering at least part of the bed and at least one part of its environment, said detector comprising several distance sensors, each distance sensor being capable of providing distance measurements over time between said sensor and a corresponding obstacle in its line of sight, said device further comprising a processor connected to the detector and configured to process the distance measurements provided by the detector's distance sensors.
- An object of the invention is to provide a device and a method making it possible to determine whether a person in a room has made a movement or whether an event concerning this person has occurred.
- the device and the method according to the invention make it possible to detect with relative certainty that a person having left a first volume of the room and not having fallen has returned there, which makes it possible to avoid the triggering of an alarm if this return takes place within a predefined period. It can for example be a departure from a bed and a return to bed. It can just as well be a person seated in an armchair or occupying a certain part of the room.
- a device for detecting a movement or the stopping of a movement of a person or an object in a room, or an event relating to said person comprising:
- a detector having a detection field capable of covering at least a first volume of said room and at least part of the environment of the first volume, said detector comprising several distance sensors, each distance sensor having a line of sight and being able to provide distance measurements over time between said sensor and a corresponding obstacle, that is to say an obstacle located in the line of sight of said sensor,
- a processing unit connected to the detector and configured to process over time the distance measurements received from the distance sensors of the detector, characterized in that the processing unit is configured to perform the following steps, in order: a. for each of the distance sensors of a first set of distance sensors of the detector, determining a corresponding first reference distance either as a distance measured at a first instant (t1) by the corresponding distance sensor, or as a combination of distances measured at several first instants (t1, t2, t3, t4) by the corresponding distance sensor, b.
- determining a second set of distance sensors consisting of those distance sensors of the first set whose distance measurement, carried out at an instant (t5) later than the first instant (t1) or the first instants (t1, t2, t3, t4), differs by more than a first predetermined value from the first reference distance of the corresponding distance sensor, c. selecting at least a first part of the distance sensors from the second set of distance sensors and associating said later instant (t5) with the at least one first part of distance sensors, d.
- With such a device it is possible to identify a movement in the detection field of the detector, and in particular a movement of a certain magnitude. Such a movement could be leaving a bed, a fall, or the person returning to bed.
- the combination of the distances measured at the first instants is an average of the distances measured at the first instants (t1, t2, t3, t4).
- the reference distance of a sensor is the result of low-pass filtering some of its measurements over time, which makes it possible to obtain a better contrast with the most recent distance measurement.
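As an illustration of how such a per-sensor reference distance could be maintained, the sketch below uses a simple exponential low-pass filter; the function name, the array layout and the value of alpha are assumptions made here for illustration and are not taken from the patent.

```python
import numpy as np

def update_reference(reference: np.ndarray, measurement: np.ndarray,
                     alpha: float = 0.05) -> np.ndarray:
    """Low-pass filter the per-sensor reference distances over time.

    `reference` and `measurement` are arrays with one entry per distance
    sensor of the detector (e.g. a 176 x 144 matrix for a TOF camera).
    A small alpha keeps the reference close to a long-term average, so a
    recent change in distance contrasts strongly against it.
    """
    return (1.0 - alpha) * reference + alpha * measurement
```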
- said first representative position is the position of the geometric center of the obstacles corresponding to said at least one first part of distance sensors selected.
- since the geometric center is comparable to the center of mass of an object or a person, its displacement represents an average of the displacements of the parts of the object or person and is therefore more representative of the displacement of the object or person over time.
- the processing unit is configured for:
- the speeds are calculated for pairs of selected representative positions forming part of the same association of representative positions, formed for example by pairing each representative position of said association with the representative position of said association that directly follows it chronologically, and only with the latter.
- the device has a continuous measurement of the speed of the person or of an object moved by the person.
- the processing unit is configured for:
- the first volume being a volume extending vertically upwards and/or downwards from a first horizontal surface of said room
- the second volume being a finite volume extending outward from the lateral boundaries of the first volume
- the third volume being a volume extending outward from the side boundaries of the second volume
- the processing unit is configured to determine, or to interrogate another processing unit to find out, if said person is in the first volume, has left the first volume or has fallen after leaving the first volume.
- the processing unit is configured to check at a given time whether:
- the processing unit is configured to check at a given instant whether:
- the processing unit is configured to check at a given time if:
- the present invention also relates to a method for detecting a movement or the stopping of a movement of a person or an object in a room, or an event concerning said person.
- Fig.1 schematically shows an overview of a device according to the invention in the context of a bedroom
- Fig.1 bis schematically shows a top view of the room of Fig. 1;
- Fig.2 schematically shows part of a processing operation carried out by the device according to the invention;
- Fig.3 schematically shows part of a processing operation carried out by the device according to a preferred embodiment of the invention.
- Fig.1 schematically shows an overall view of a device (10) according to the invention when it is mounted, for example, in a room in which a bed (40) is installed, on which a person (not shown) whose movements are to be monitored may lie.
- the device (10) according to the invention comprises a detector (20) having a detection field (50) capable of covering at least part of the bed (40) and at least part of the environment of the bed.
- the detection field of the detector is that part of the space in which the detector is capable of fulfilling its function.
- the detector (20) comprises several distance sensors, each distance sensor having a line of sight (51) and being capable of providing distance measurements (52) over time between said sensor and an obstacle corresponding to said sensor, that is to say an obstacle located in the line of sight (51) of said distance sensor.
- An obstacle corresponding to a distance sensor is therefore that part of an object or of a person which is at the end of the line of sight of said distance sensor; for ease of understanding, it can be treated as a point if the detection angle of the sensor is small, which is usually the case.
- the detector (20) can for example be a camera operating on the time-of-flight principle (in English: "Time of Flight" or "TOF"), which makes it possible to measure, directly or indirectly and in real time, distances with respect to an observed three-dimensional scene.
- a TOF camera illuminates the scene that is in its detection field and calculates, for each distance sensor (sometimes also called “photosensitive element” or “pixel” in this context) of the camera, the time that the light emitted takes to travel between the distance sensor and its corresponding obstacle.
- this travel time is directly proportional to the distance between a distance sensor and its corresponding obstacle. This travel time measurement is carried out independently for each distance sensor of the camera.
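For reference, a minimal sketch of the relation between travel time and distance; the factor 1/2 assumes the measured time is the round trip of the emitted light, which is the usual time-of-flight convention rather than something stated in this document.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_time_of_flight(round_trip_time_s: float) -> float:
    """Distance between a sensor and its obstacle from the round-trip
    travel time of the emitted light (half the path is outbound)."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0
```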
- a concrete example of such a detector is the "SwissRanger 4000" or "SR4000" camera from MESA Imaging, which includes an array of 176 × 144 distance sensors (photosensitive elements).
- the detector (20) is preferably placed or configured to be placed at a height above the ground which is greater than the maximum height of the upper surface of said bed and it is oriented or designed to be oriented so that its detection field (50) covers at least part of the bed (40), preferably the entire bed, and at least part of its environment.
- the detector (20) can be placed against a wall above the headboard, but other positions are possible, such as against a wall opposite that against which the bed is placed.
- the detector (20) can also be arranged so as to have in its detection field (50), on the one hand, at least part of the bed, preferably all of it, and on the other hand, a door through which one can leave and enter the room in which the detector (20) is located.
- the device (10) further comprises a processing unit (30) connected to the detector (20) and configured to acquire and process the measurements of distance (or travel time, which is equivalent to a constant factor) provided by the detector's distance sensors over time.
- the processing unit is preferably able to store the various distance measurements supplied by the distance sensors over time and to process, at a given moment, distance measurements taken and stored at different moments.
- the processing unit (30) can process these distance measurements periodically, aperiodically or continuously, in the latter case within the limits of the maximum rate at which the detector can provide the distance measurements.
- the processing unit (30) processes the distance measurements received, for example, every 5 seconds, or every 4 seconds, or every 3 seconds, or every 2 seconds, or every second, or every 0.5 seconds.
- the processing unit (30) can process the distance measurements received at times chosen for example according to the history of the distance measurements received.
- the processing unit (30) can be any means making it possible to receive and analyze the distance measurements supplied by the detector (20), and this as a function of time.
- it may be a microprocessor executing a program for analyzing the data corresponding to the distance measurements supplied by the detector.
- the processing unit (30) can be grouped together with the detector (20), for example within the same box.
- the detector (20) can be separated from the processing unit (30) and be provided with data communication means, preferably wireless, such as for example Wi-Fi (IEEE 802.11 standard), allowing data to be transferred to the processing unit (30).
- the processing unit (30) then comprises means for receiving the data transmitted by the detector (20) via the data communication means. Said data comprises in this case at least the distance measurements or the times of flight supplied by the distance sensors of the detector (20).
- the processing unit (30) determines a first reference distance for each of the distance sensors of a first set of distance sensors of the detector (20).
- the first reference distance is either a distance measured at a first instant (t1) by the corresponding distance sensor, or a combination of distances measured at several first instants (t1, t2, t3, t4) by the corresponding distance sensor.
- said first set may include all or part of the distance sensors of the detector.
- the processing unit (30) therefore combines several distance measurements taken at different times by the same sensor to create a first reference distance associated with said sensor.
- a first reference distance can be determined for all the sensors of the detector (20) or only for a part thereof.
- the processing unit (30) calculates a first reference distance for all the sensors of the detector.
- the combination may be different for each sensor.
- the processing unit (30) performs the same combination of distance measurements for all the sensors.
- the processing unit (30) combines distance measurements received from the sensors during a period between 0 minutes and 30 minutes, preferably for a period of 5 minutes.
- Each first reference distance can be updated by the processing unit (30) independently for each sensor, in particular at different times depending on the sensor.
- the first reference distances can be determined by means of distance measurements obtained at instants which are different for each sensor.
- the processing unit (30) uses, to determine a first reference distance, distance measurements received at the same instants for all the distance sensors of the detector.
- the processing unit (30) updates the first reference distances periodically, using the most recent distance measurement(s) received from the respective sensors.
- the processing unit (30) updates the first reference distance of a sensor each time a new distance measurement is received from said sensor.
- the processing unit (30) determines a second set (60) of distance sensors consisting of those distance sensors of the first set whose distance measurement, carried out at a time (t5) later than the first time (t1) or the first times (t1, t2, t3, t4), differs by more than a first predetermined value from the first reference distance of the corresponding distance sensor.
- the first predetermined value is between 0 cm and 30 cm, more preferably between 0 and 20 cm, even more preferably between 0 and 10 cm. More preferably the first predetermined value is 0 cm.
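A minimal sketch of how the second set of sensors could be selected, assuming the measurements are held in arrays indexed like the sensor matrix; the function name and the 10 cm default (one of the bounds cited above) are illustrative only.

```python
import numpy as np

def second_set_mask(reference: np.ndarray, measurement_t5: np.ndarray,
                    first_threshold_m: float = 0.10) -> np.ndarray:
    """Boolean mask of the sensors whose measurement at the later instant
    (t5) differs from their reference distance by more than the first
    predetermined value."""
    return np.abs(measurement_t5 - reference) > first_threshold_m
```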
- the processing unit (30) selects at least a part (61) of the sensors of the second set (60) and associates with the at least one part (61) the instant (t5) (that is to say the instant at which the sensors of the first set carried out distance measurements that each decreased or increased by more than the first predetermined value with respect to their reference distances).
- Fig. 2 schematically represents various distance sensors of the detector (20) arranged in a matrix whose indices correspond to the angular coordinates (in the case of a spherical coordinate system) of the beams coming from these sensors, that is to say of their lines of sight (assumed rectilinear, as shown in FIG. 1), the origin of the spherical coordinate system being able to be taken for example on the detector (20).
- Fig. 2 shows an example of a second set (60) of distance sensors as defined above, comprising a first part (61) of distance sensors and a second part (62) of distance sensors at the instant (t5).
- Fig. 2 also illustrates a third part of distance sensors (71), which are those distance sensors whose distance measurement performed at a time (t6) differs by more than the first predetermined value from the reference distance of said distance sensor.
- the reference distances that led to the determination of one part of the distance sensors need not necessarily be the same as the reference distances that led to the determination of another part of the distance sensors.
- the processing unit (30) groups together distance sensors with neighboring coordinates (neighboring indices) of a set of distance sensors (60) within the same part of distance sensors (61).
- the union of all the parts of distance sensors (61, 62) of the same set of distance sensors (60) associated with the same instant (t5) includes all the distance sensors of said set of distance sensors (60).
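One way to perform this grouping of sensors with neighbouring indices is connected-component labelling on the sensor matrix; the sketch below is an assumption about the implementation, not the patent's own algorithm.

```python
from scipy import ndimage

def split_into_parts(mask):
    """Split the boolean sensor mask of a set (60) into parts (61, 62, ...)
    of sensors with neighbouring indices, using connected-component
    labelling.  Returns one boolean mask per part."""
    labels, n_parts = ndimage.label(mask)
    return [labels == k for k in range(1, n_parts + 1)]
```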
- the processing unit (30) determines, for at least the first part (61) of distance sensors selected, a first position (65) representative of the positions in space of the obstacles corresponding to said first part of sensors (61) selected, and associates with it the instant (t5) associated with said first part of distance sensors (61).
- a first representative position (65) can for example be that of an obstacle corresponding to one of the sensors of the first part of distance sensors (61).
- the first representative position (65) can be a combination of the positions in space of the obstacles corresponding to some or all of the distance sensors of the first part of distance sensors (61).
- said first representative position (65) is the position of the geometric center of the obstacles corresponding to said first part (61) of distance sensors selected. This geometric center can be calculated by the processing unit from the spatial coordinates of said obstacles and therefore represents a point in space.
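A minimal sketch of that computation, assuming each selected sensor's line-of-sight angles and measured distance are first converted to Cartesian coordinates; the function names and the spherical convention are illustrative assumptions.

```python
import numpy as np

def to_cartesian(distance_m: float, azimuth_rad: float, elevation_rad: float) -> np.ndarray:
    """Convert one sensor's distance and line-of-sight angles (spherical
    coordinates centred on the detector) into a Cartesian point."""
    return np.array([
        distance_m * np.cos(elevation_rad) * np.cos(azimuth_rad),
        distance_m * np.cos(elevation_rad) * np.sin(azimuth_rad),
        distance_m * np.sin(elevation_rad),
    ])

def representative_position(points_xyz: np.ndarray) -> np.ndarray:
    """Geometric centre of the obstacles seen by the selected part of
    sensors; `points_xyz` is an (N, 3) array of Cartesian coordinates."""
    return points_xyz.mean(axis=0)
```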
- the processing unit (30) is configured to associate with one another representative positions (65, 75, 85, 95) associated with several different instants (t5, t6, t7, t8).
- said representative positions (65, 75, 85, 95) are determined for parts of distance sensors (61, 71, 81, 91) which may be identical or different.
- Fig. 2 illustrates an association (100) of two representative positions (65, 75).
- Fig. 2 represents the two representative positions (65, 75) according to their angular coordinates only, although a representative position is a point in three-dimensional space.
- the processing unit (30) can be configured to associate within the same association (100) any number of representative positions (65, 75, 85, 95).
- the durations separating the instants (t5, t6, t7, t8) associated with representative positions (65, 75, 85, 95) can be arbitrary, and in particular different.
- the processing unit (30) associates representative positions (65, 75, 85, 95) associated with instants (t5, t6, t7, t8) separated by a duration which can vary between 0.1 seconds and 300 seconds; preferably it is configured for a duration of 5 seconds.
- the processing unit (30) associates representative positions (65, 75, 85, 95) associated with the most recent instants.
- the processing unit (30) associates representative positions (65, 75, 85, 95) each time new distance measurements are received from sensors.
- representative positions are associated which correspond to parts of distance sensors which have distance sensors in common.
- Fig. 3 illustrates such an association (100) of four representative positions (65, 75, 85, 95) which correspond to four parts of distance sensors (the four areas surrounded by a dotted line: 61, 71, 81, 91) associated with four different instants (t5, t6, t7, t8) and having a certain number of sensors in common (in the overlapping zones of the dotted lines).
- a part of distance sensors (61, 71, 81, 91) of the association (100) of representative positions (65, 75, 85, 95) has at least one distance sensor in common with another part of distance sensors of the same association (100).
- the part (61) has three distance sensors in common with the part (71).
- the processing unit (30) is configured to then determine, within an association (100) of representative positions (65, 75, 85, 95), at least one pair of representative positions (65, 75) and to calculate, for said at least one pair of representative positions (65, 75), a speed as being the distance between said representative positions (65, 75) of said pair divided by the duration separating the instants (t5, t6) associated with said representative positions (65, 75).
- this is an association of two representative positions corresponding to distance measurements made by sensors at times t5 and t6.
- the processing unit (30) is configured to then decide, when said speed is greater than a second predetermined value, that said person or an object has made a movement in the room between the instants associated with the associated representative positions (65, 75) for which said speed was calculated.
- the second predetermined value may be different for each association (100) of representative positions (65, 75, 85, 95). In general, the second predetermined value can be different for each speed calculated for a pair of representative positions (65, 75).
- the second predetermined value is between 0 m/s and 2 m/s, more preferably between 0 m/s and 1 m/s. Even more preferably, the second predetermined value is 0.1 m/s.
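A minimal sketch of the speed test for one pair of representative positions; the 0.1 m/s default reflects the preferred value cited above, while the function name and signature are illustrative.

```python
import numpy as np

def movement_detected(pos_a: np.ndarray, t_a: float,
                      pos_b: np.ndarray, t_b: float,
                      second_threshold_m_per_s: float = 0.1) -> bool:
    """Speed of the representative position between two associated
    instants; a speed above the second predetermined value is taken as a
    movement of the person or object between those instants."""
    speed = np.linalg.norm(pos_b - pos_a) / (t_b - t_a)
    return speed > second_threshold_m_per_s
```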
- the combination of the sensor distance measurements obtained at different instants is an average. Thus, in accordance with this preferred embodiment, a combined distance measurement is obtained for a distance sensor by taking the average of distance measurements made by said sensor over several instants.
- the processing unit (30) is configured to:
- the pairs of representative positions can be formed in any way, just as their number can vary.
- a representative position (65, 75, 85, 95) can be associated with several representative positions (65, 75, 85, 95) of the same association (100) of representative positions.
- speeds are determined by the processing unit (30) for pairs of representative positions formed by pairing each representative position of an association (100) of representative positions (65, 75, 85, 95) with the representative position that directly follows it chronologically, and only with the latter.
- the chronology of the representative positions (65, 75, 85, 95) is given by the successive instants (t5, t6, t7, t8) associated with them.
- Fig. 3 illustrates four selected parts of distance sensors (61, 71, 81, 91) corresponding to measurements made by the sensors at four successive instants (t5, t6, t7, t8), whose representative positions (65, 75, 85, 95) have been paired in accordance with this preferred embodiment of the invention, thus providing, in the case of Fig. 3, three pairs of representative positions: (65, 75), (75, 85) and (85, 95).
- the successive instants (t5, t6, t7, t8) are separated by the same period.
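A minimal sketch of this chronological pairing, under the assumption that each representative position is stored together with its associated instant; the data layout is illustrative.

```python
def consecutive_pairs(positions_with_instants):
    """Pair each representative position of an association with the one
    that directly follows it chronologically, and only with that one.
    `positions_with_instants` is a list of (position, instant) tuples."""
    ordered = sorted(positions_with_instants, key=lambda item: item[1])
    return list(zip(ordered, ordered[1:]))
```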
- the processing unit (30) is configured to: • determine whether the first representative position (65) is located in a first volume (1), in a second volume (2) or in a third volume (3) of space, the first volume being a volume extending vertically upwards and/or downwards from a first horizontal surface of said room (e.g. the top surface of the bed in the present example), the second volume being a finite volume extending outwards from the lateral limits of the first volume, and the third volume being a volume extending outwards from the lateral limits of the second volume,
- Fig.1 shows an example of the first volume (1), the second volume (2) and the third volume (3), in dotted lines. Although it does not appear clearly in the figure, it must be understood that these three volumes are preferably exclusive, that is to say that they have no points in common, apart possibly from their common border which, in the case of Fig.1, is the outer envelope of the first volume (1) and the outer envelope of the second volume (2).
- Fig.1 bis schematically shows a top view of the room of Fig. 1, which shows the orthogonal projections of the first volume and the second volume on the ground.
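A minimal sketch of how a representative position could be assigned to one of the three volumes, assuming for illustration that the first volume is an axis-aligned box above the bed surface and that the second volume is a band of fixed lateral width around it; the patent itself only requires the volumes to be exclusive.

```python
def classify_volume(position, bed_box, margin_m):
    """Return 1, 2 or 3 depending on whether the representative position
    lies in the first volume (above/below the bed surface), in the second
    volume (a finite band around it), or in the third volume (beyond).
    `bed_box` is (xmin, xmax, ymin, ymax) in metres; `margin_m` is the
    lateral width of the second volume."""
    x, y, _z = position
    xmin, xmax, ymin, ymax = bed_box
    if xmin <= x <= xmax and ymin <= y <= ymax:
        return 1
    if (xmin - margin_m) <= x <= (xmax + margin_m) and (ymin - margin_m) <= y <= (ymax + margin_m):
        return 2
    return 3
```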
- the processing unit (30) is configured to determine, or to interrogate another processing unit to find out, whether said person is in bed, has left the bed or has fallen after having left the bed. If the processing unit (30) is configured to interrogate another processing unit, the latter may be a processing unit of the same device or a processing unit of a third-party device.
- the logical interface between the two processing units can be standardized or proprietary; it can take the form of an API, a web service or a REST service, or be arbitrary in general.
- the interrogation of a third-party processing unit can be carried out through a network, or an interconnection of networks, including through a public network such as the Internet.
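As an example of such an interrogation, the sketch below queries a third-party processing unit over a REST interface; the endpoint path and the response fields are hypothetical, since the document only states that the interface may be an API, a web service or a REST service.

```python
import requests

def person_status_from_third_party(base_url: str) -> dict:
    """Ask a third-party processing unit whether the person is in the
    first volume, has left it, or has fallen.  The endpoint and the
    response fields (e.g. in_first_volume, has_fallen) are illustrative."""
    response = requests.get(f"{base_url}/person/status", timeout=5)
    response.raise_for_status()
    return response.json()
```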
- the processing unit (30) is configured to cross-reference different pieces of information concerning said person, in particular information relating to events which have occurred to said person (such as, for example, that they have just made a movement, that they have just ceased all movement, or that they have gotten out of bed), whether this information is the result of processing by the processing unit (30) or is obtained by querying a third-party device.
- the processing unit (30) is also configured to deduce, from this cross-referencing of information, an event relating to said person.
- the processing unit (30) is configured to check at a given time if:
- the processing unit (30) cross-references this information each time a movement in the room has been detected.
- the processing unit (30) is configured to check at a given time if:
- the processing unit (30) cross-references this information each time said person or an object has ceased all movement.
- the processing unit (30) is configured to check at a given time if:
- the processing unit (30) cross-references this information each time said person or an object has made a movement.
- the present invention also relates to a method for detecting a movement of a person in a room comprising a bed, or an event concerning said person, the method comprising the following steps: a. Placing a detector (20) so that a detection field (50) of the detector covers at least a first volume (1) of said room and at least part of the environment of said first volume (1), said detector (20) comprising several distance sensors, each distance sensor being capable of providing distance measurements (52) over time between said sensor and a corresponding obstacle in the line of sight (51) of said sensor, said detector (20) being connected to a processing unit (30) configured to process over time the distance measurements received from the detector, b.
- the first volume (1) being a volume extending vertically upwards and/or downwards from a first horizontal surface of said room
- the second volume (2) being a volume extending outwards from the lateral limits of the first volume (1)
- the third volume (3) being a volume extending outwards from the lateral limits of the second volume (2), c.
- a corresponding first reference distance either as being a distance measured at a first instant (t1) by the corresponding distance sensor, or as being a combination of distances measured at several first instants (t1, t2, t3, t4) by the corresponding distance sensor, d.
- a second set of distance sensors (60) which comprises those distance sensors of the first set whose distance measurement, carried out at a time (t5) subsequent to the first time (t1) or to the first times (t1, t2, t3, t4), differs by more than a first predetermined value from the first reference distance of the corresponding distance sensor, e. selecting, by means of the processing unit (30), at least a first part of the distance sensors (61) of the second set of distance sensors (60) and associating said later instant (t5) with the at least one first part of distance sensors (61), f.
- the different steps of the method are executed at regular time intervals, preferably each time new distance measurements are received from sensors.
- the method comprises the following steps: a. if said person or an object has made a movement in the room, determining by means of the processing unit (30), or interrogating another processing unit to find out, whether, at the instant of said movement, said person had left the first volume (1) of said room and had not fallen, b. if so, determining, by means of the processing unit (30), whether the last detected movement of said person or of an object was in the first volume (1), c. if so, deciding, by means of the processing unit (30), that said person has returned to the first volume (1).
- the method comprises the following steps: a. if said person or an object has ceased all movement, determining by means of the processing unit (30), or interrogating another processing unit to find out, whether, at the moment of the cessation of all movement, said person had left the first volume (1) and had not fallen, b. if so, determining, by means of the processing unit (30), whether the last detected movement of said person or of the object was in the first volume (1) or in the second volume (2), c. if so, deciding, by means of the processing unit (30), that said person has returned to the first volume (1).
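A minimal sketch of the decision cascade described by these two variants of the method, with illustrative argument names; volume numbers follow the convention 1 = first volume, 2 = second volume.

```python
def returned_to_first_volume(trigger_observed: bool,
                             had_left_first_volume: bool,
                             had_fallen: bool,
                             last_movement_volume: int) -> bool:
    """Decide that the person has returned to the first volume when the
    triggering condition holds (a movement, or the cessation of all
    movement), the person had previously left the first volume without
    falling, and the last detected movement was in the first volume
    (or, for the cessation variant, in the first or second volume)."""
    return (trigger_observed
            and had_left_first_volume
            and not had_fallen
            and last_movement_volume in (1, 2))
```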
- the present invention has been described in connection with specific embodiments, which are purely illustrative and should not be construed as limiting. In general, it will appear obvious to those skilled in the art that the present invention is not limited to the examples illustrated and/or described above. The presence of reference numbers in the drawings cannot be considered as limiting, including when these numbers are indicated in the claims.
- the invention can also be described as follows: a device (10) and a method for detecting whether a person or an object located in a room has performed a movement or has interrupted a movement, and being able to deduce therefrom whether an event has occurred to said person, such as, for example, that they have left an area of the room or that, after leaving it, they have returned to it.
- the device performs an analysis over time of the distances between the sensors of a detector (20) and the obstacles present in the detection field (50) of the detector.
- the detection of a movement or the interruption of a movement is based on the analysis of the variations over time of said distances.
- the deduction of an event is based on the location of moving obstacles within a space (60) of the room as well as on the history of events that have occurred to said person within the room.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3213198A CA3213198A1 (fr) | 2021-03-26 | 2022-02-25 | Dispositif et methode pour detecter un mouvement ou l'arret d'un mouvement d'une personne ou d'un objet dans une piece, ou un evenement relatif a cette personne |
US18/283,935 US20240164662A1 (en) | 2021-03-26 | 2022-02-25 | Device and method for detecting a movement or stopping of a movement of a person or of an object in a room, or an event relating to said person |
EP22708138.7A EP4312754A1 (fr) | 2021-03-26 | 2022-02-25 | Dispositif et methode pour detecter un mouvement ou l'arret d'un mouvement d'une personne ou d'un objet dans une piece, ou un evenement relatif a cette personne |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21165410 | 2021-03-26 | ||
EPEP21165410 | 2021-03-26 | ||
EP21183963 | 2021-07-06 | ||
EPEP21183963 | 2021-07-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022199986A1 true WO2022199986A1 (fr) | 2022-09-29 |
Family
ID=80786248
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2022/054862 WO2022199986A1 (fr) | 2021-03-26 | 2022-02-25 | Dispositif et methode pour detecter un mouvement ou l'arret d'un mouvement d'une personne ou d'un objet dans une piece, ou un evenement relatif a cette personne |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240164662A1 (fr) |
EP (1) | EP4312754A1 (fr) |
CA (1) | CA3213198A1 (fr) |
WO (1) | WO2022199986A1 (fr) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003290154A (ja) * | 2002-03-29 | 2003-10-14 | Sumitomo Osaka Cement Co Ltd | 監視装置 |
EP2589330A1 (fr) * | 2010-06-30 | 2013-05-08 | Panasonic Corporation | Dispositif de surveillance et programme |
US20170014051A1 (en) * | 2014-03-20 | 2017-01-19 | Noritsu Precision Co., Ltd. | Information processing device, information processing method, and program |
US20180049677A1 (en) * | 2016-08-16 | 2018-02-22 | Sociedade Beneficente Israelita Brasileira Hospital Albert Einstein | System and method of monitoring patients in hospital beds |
WO2019063808A1 (fr) * | 2017-09-29 | 2019-04-04 | KapCare SA | Dispositif et methode pour detecter qu'une personne alitee quitte son lit ou a chute |
EP3687379A1 (fr) | 2017-09-29 | 2020-08-05 | Kapcare SA | Dispositif et methode pour detecter qu'une personne alitee quitte son lit ou a chute |
Also Published As
Publication number | Publication date |
---|---|
CA3213198A1 (fr) | 2022-09-29 |
EP4312754A1 (fr) | 2024-02-07 |
US20240164662A1 (en) | 2024-05-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22708138; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 3213198; Country of ref document: CA |
| WWE | Wipo information: entry into national phase | Ref document number: 18283935; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2022708138; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2022708138; Country of ref document: EP; Effective date: 20231026 |