EP3767602A1 - Activity recognition sensor system - Google Patents
Activity recognition sensor system
- Publication number
- EP3767602A1 (application EP20185382.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor system
- sensor
- data
- activity
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/0423—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0438—Sensor means for detecting
- G08B21/0469—Presence detectors to detect unsafe condition, e.g. infrared sensor, microphone
Definitions
- the present invention relates to a sensor system. More particularly, the present invention relates to a sensor system for activity recognition.
- sensor devices such as temperature sensors, relative humidity sensors, CO 2 sensors, movement sensors and the like are integrated in homes or buildings to improve the comfort of a user. This goes from simple, single sensor devices to more sophisticated sensor systems that make the home or building "smarter".
- US 2019/0103005 relates to a multi-resolution audio activity tracker based on acoustic scene recognition.
- the method is based on collecting data from different acoustic sensors to learn about the habits of an elderly individual and notify dedicated medical staff or a close relative about detected behavior anomalies or a shift in habits of the elderly individual.
- the system described in US 2019/0103005 comprises e.g. three microphones that are connected to a centralized device, such as e.g. an RGW (residential gateway), wirelessly or by PLC (Power Line Communication) technology. Audio features are extracted from the acoustic signals received from the microphones to determine the location and activity of the elderly person.
- Feature extraction is done in the remote gateway or remote centralized device.
- a disadvantage hereof is that special measures have to be taken for user privacy to be guaranteed. Further, information is continuously sent back and forth from the sensor system to the cloud; any loss of connection with the cloud can interrupt the working of the system and/or can stop the system from working properly.
- US 2018/0306609 describes a sensing system comprising a sensor assembly that is communicably connected to a computer system, such as a server or a cloud computing system.
- a block diagram of the sensing system of US 2018/0306609 is shown in Fig. 1 .
- the sensing system 100 comprises a sensor assembly 102 having one or more sensors 110 that sense a variety of different physical phenomena.
- the sensor assembly 102 featurizes, via a featurization module 112, the raw sensor data and transmits the featurized data to a computer system 104.
- the computer system 104 comprises a machine learning module 116.
- the computer system 104 trains a classifier to serve as a virtual sensor 118 for an event that is correlated to the data from one or more sensor streams within the featurized sensor data.
- the virtual sensor 118 can then subscribe to the relevant sensor feeds from the sensor assembly 102 and monitor for subsequent occurrences of the event. Higher order virtual sensors can receive the outputs from lower order virtual sensors to infer nonbinary details about the environment in which the sensor assemblies 102 are located.
- the sensing system 100 is configured to train and implement one or more virtual sensors 118, which are machine learning based classification systems or algorithms trained to detect particular events to which the virtual sensors 118 are assigned as correlated to the data sensed by the sensors 110 of the sensor assembly 102 and/or other virtual sensors 118.
- the present invention provides a system for activity recognition.
- the sensor system comprises at least two sensors for capturing environmental data, a data processing unit for each of the at least two sensors for processing the captured data, a feature extraction unit for each of the at least two sensors for compacting the (raw) processed data by filtering information irrelevant for the activity out of the processed data, thereby obtaining activity relevant data, and a primary activity recognition unit for, from the extracted relevant data, recognizing a primary activity.
- the feature extraction unit and the primary activity recognition unit are part of the sensor system, or in other words, are located in the sensor system.
- the at least two sensors are located in a same room or area, i.e. in the room or area where the activity has to be recognized.
- the feature extraction unit and the primary activity recognition unit are part of the sensor system and are located close to the at least two sensors in a same unit. In other words, all data processing is done locally, i.e. internally in the sensor system.
- by primary activity is meant a basic activity such as, for example but not limited to, a door that is opened or closed, water running out of a water tap, a light that is on, high relative humidity in a room, increasing temperature in a room, ....
- the at least two sensors may, for example but not limited to, be at least one of a temperature sensor, a CO 2 sensor, a radar sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor or the like.
- the at least two sensors may be at least two sensors of the same type.
- the at least two sensors may be at least two sensors of a different type.
- An advantage of the latter is that the results will be more robust and reliable.
- the at least two sensors may, as described above, be of a same type.
- An advantage of a sensor system is that processing of the data collected by the at least two sensors in the sensor module is done close to the sensors at the extreme edge level, or in other words is done locally. No data coming from the at least two sensors is transferred to the cloud, which increases safety and offers a better protection of the data of a user. Moreover, people tend to feel more at ease when realising their data is not transferred over the internet.
- a further advantage is also that it is a simpler system, as no large amounts of data have to be sent to the cloud or any other remote system to make the sensor system according to embodiments of the invention work, as processing of the sensor results is done locally in the system. Only in particular embodiments of the invention, where secondary activity recognition is also done, are the recognized primary activities sent to the cloud for coupling with other primary activities to determine a secondary activity (see further).
- a still further advantage of a system according to embodiments of the invention is that, as the data are processed locally in the system, i.e. the model for recognizing at least a primary activity is stored locally in the system, and not in the cloud, it may learn and be adapted locally as well, so that it can be specifically adapted and optimised for the specific location of the system.
- the energy necessary to process the data is rather limited, which allows battery based sensors to be used in the system. This is a big advantage over existing systems, which send all data captured by the sensors to the cloud for processing; this requires a lot of energy, which makes the use of battery based sensors very difficult.
- each of the sensors thus has its own data processing unit and feature extraction unit.
- the sensor system may furthermore comprise a secondary activity recognition unit for, from a combination of each of the primary activities recognized by the primary activity recognition unit, determining a higher level secondary activity.
- by higher level secondary activity is meant an activity that can be derived from a combination of at least two primary activities.
- the secondary activity recognition unit may be part of the sensor system, or in other words may be located in the sensor system. According to other embodiments of the invention, the secondary activity recognition unit may be provided on a location remote from the sensor system, such as e.g. in a gateway or in the cloud.
- the data processing unit may comprise means for capturing data received from the at least two sensors, and an A/D converter for converting the captured data.
- the feature extraction unit may comprise means for framing the A/D processed data into overlapping frames, a detector unit for evaluating each of the overlapping frames, and an extracting unit for extracting activity relevant frames from the overlapping frames.
- the sensor system may furthermore comprise at least one further sensor, and a further data processing unit and a further feature extraction unit for each of the at least one further sensor.
- the sensor system may further comprise a memory for storing parameters of the relevant data and correlated primary and/or secondary activities.
- the sensor system may furthermore comprise a training unit for, from subsequent relevant data and correlated primary and/or secondary activities, updating the stored parameters for improved performance.
- the sensor system may furthermore comprise a communication unit for sending signals representative of the primary and/or secondary activity to an electric or electronic device.
- This may, for example, be sending a notification to e.g. a smartphone, a tablet or any other suitable device for notifying a user e.g. of someone entering the home, a temperature that is increasing in a home, water that is running, or the like.
- sending a signal may be sending a signal to a remote electric or electronic device for starting an action. For example but not limited to, when the sensor system detects that a door has been opened, it can be decided that someone is entering the home and a signal can be sent to a thermostat to start heating the home.
- each of the at least two sensors of the sensor system may be located inside the sensor system.
- at least one of the at least two sensors may be located outside the sensor system.
- sensors already present in a home or building can also be used to send sensor data to the sensor system, in order to help recognize the activity and make the system more robust.
- the sensor system may be a standalone system.
- the sensor system may be part of an automation system.
- part A being connected to part B is not limited to part A being in direct contact to part B, but also includes indirect contact between part A and part B, in other words also includes the case where intermediate parts are present in between part A and part B.
- Not all embodiments of the invention comprise all features of the invention. In the following description and claims, any of the claimed embodiments can be used in any combination.
- the present invention provides a sensor system for activity recognition.
- the sensor system comprises at least two sensors for capturing environmental data, a data processing unit for each of the at least two sensors for processing the captured data, a feature extraction unit for each of the at least two sensors for compacting the processed data by filtering information irrelevant for the activity out of the processed data, thereby obtaining activity relevant data, and a primary activity recognition unit for, from the extracted relevant data, recognizing a primary activity.
- the feature extraction unit and the primary activity recognition unit are part of the sensor system, or in other words are located in the sensor system. Hence, the feature extraction unit and the primary activity recognition unit are part of the sensor system and are located close to the sensor in a same unit. In other words, all data processing is done internally in the sensor system.
- by activity recognition within the scope of the invention is meant using sensor data and data mining and machine learning techniques to model a wide range of human activities.
- by primary activity is meant a basic activity such as, for example but not limited to, a door that is opened or closed, water running out of a water tap, a light that is on, high relative humidity in a room, increasing temperature in a room, presence of a person in the room, ....
- by a sensor for capturing environmental data is meant any connected object that is capable of providing various types of information with respect to the environment, such as e.g. location, position, an individual's movements, sounds, humidity, temperature, ....
- An advantage of a sensor system is that processing of the data collected by the at least two sensors in the sensor module is done close to the sensors at the extreme edge level, or in other words is done locally. No data coming from the at least two sensors is transferred to the cloud, which increases safety and offers a better protection of the data of a user. Hence, user privacy is improved with respect to prior art sensor systems. Moreover, people tend to feel more at ease when realising their data is not transferred over the internet.
- a further advantage is also that it is a simpler system, as no large amounts of data have to be sent to the cloud or any other remote system to make the sensor system according to embodiments of the invention work, as processing of the sensor data is done locally in the system.
- a still further advantage of a system according to embodiments of the invention is that, as the data are processed locally in the system, i.e. the model for recognizing at least a primary activity is stored locally in the system, and not in the cloud, it may learn and be adapted locally as well, so that it can be specifically adapted and optimised for the specific location of the system.
- Fig. 2 illustrates a hierarchical approach according to one embodiment of the invention. It is the intention according to embodiments of the invention to collect environmental information by means of at least two sensors.
- the at least two sensors may, for example but not limited to, be at least one of a temperature sensor, a CO 2 sensor, a radar sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor or the like.
- the at least two sensors may be at least two sensors of the same type.
- the at least two sensors may be at least two sensors of a different type.
- the at least two sensors may, as described above, be of a same type.
- the at least two sensors are located in a same room or area, i.e. in the room or area where the activity has to be recognized.
- primary activities are detected or recognized on the extreme edge level, i.e. locally in the sensor system.
- by primary activities is meant basic, simple activities, such as e.g. footsteps, speech, a running faucet, ventilation active, kitchen hood active, gas stove active, temperature increasing, CO 2 amount increasing, flushing a toilet, opening or closing of a door or window, locking or unlocking the door with a key and the like.
- Fig. 3 illustrates a sensor system 1 according to a first embodiment.
- the sensor system 1 comprises at least two sensors S 1 , S 2 , ..., S n for capturing environmental data.
- the sensor system 1 may comprise any number of sensors as is required or wanted by a user.
- the at least two sensors S 1 , S 2 , ..., S n may, for example, each be one of a temperature sensor, a CO 2 sensor, a radar sensor, a relative humidity sensor, an acoustic sensor, a VOC sensor or the like.
- the at least two sensors S 1 , S 2 , ..., S n may be sensors of a different type.
- the at least two sensors S 1 , S 2 , ..., S n may also be sensors of a same type.
- the sensor system 1 further comprises a data processing unit 3 for each of the at least two sensors S 1 , S 2 , ..., S n for processing the environmental data captured by the respective sensor.
- the data processing unit 3 is part of the sensor system 1, or in other words is located in the sensor system 1.
- the data processing unit 3 may comprise means 4 for capturing data received from the sensor S 1 and an A/D converter 5 for converting the captured data.
- the sensor system 1 further comprises a feature extraction unit 6 for each of the at least two sensors S 1 , S 2 , ..., S n for filtering information irrelevant for the activity out of the A/D processed data, after which only activity relevant data remains in the data that is further processed within the sensor system 1.
- the feature extraction unit 6 is part of the sensor system 1, or in other words, is located in the sensor system.
- the feature extraction unit 6 may comprise means 7 for framing the A/D processed data into overlapping frames, a detector unit 8 for evaluating each of the frames and an extracting unit 9 for extracting activity relevant frames from the frames.
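The framing performed by means 7 can be sketched as a sliding window over the A/D samples; the frame length and hop size below are illustrative assumptions, not values taken from the text:

```python
def frame_signal(samples, frame_len=256, hop=128):
    """Split a sequence of A/D converted samples into overlapping frames.

    With hop < frame_len, consecutive frames overlap by frame_len - hop
    samples; both default values are illustrative assumptions.
    """
    return [samples[start:start + frame_len]
            for start in range(0, len(samples) - frame_len + 1, hop)]
```

With e.g. frame_len=4 and hop=2, each frame shares half its samples with the next, so a short event is never lost on a frame boundary.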
- Irrelevant data is data that indicates something different from what the sensor wants to detect. For example, in case of an acoustic sensor, irrelevant data can be data indicating silence in the environment; in case of a movement sensor, it can be data indicating that in between movements there is a moment of no movement, or in general an interruption in the event that the sensor wants to detect or measure. When such interruptions occur, the data will only contain sensor noise. This sensor noise is not relevant for the recognition of the primary activities and can thus be seen as irrelevant data. Consequently, relevant data is all data that relates to the event that a particular sensor wants to detect or measure, such as sound, humidity, movement, ..., and that thus is relevant for the recognition of the primary activities. As an example, a simple way of distinguishing between relevant and irrelevant data coming from an acoustic sensor can be, as is described in more detail below, on the basis of an energy threshold.
- the sensor system 1 further comprises a primary activity recognition unit 10 for, from the extracted activity relevant data, recognizing a primary activity.
- by primary activity is meant a basic activity such as, for example but not limited to, a door that is opened or closed, water running out of a water tap, a light that is on, high relative humidity in a room, increasing temperature in a room, ....
- the primary activity recognition unit 10 is part of the sensor system 1, or in other words is located in the sensor system.
- the primary activity recognition unit 10 is part of the sensor system 1 and is located close to the sensor S 1 in a same unit. In other words, all data processing for determining or recognizing the primary activity is done internally in the sensor system.
- the primary activity recognition unit 10 takes the features that were extracted from the data and compares them with data models of possible activities stored in the primary activity recognition unit 10. Based on the similarity between the two, the system decides whether a certain activity is taking place or not.
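This comparison against stored data models can be sketched as a nearest-prototype match; the cosine similarity measure, the 0.9 threshold and the model names are assumptions for illustration, not taken from the text:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def recognize_primary_activity(features, models, min_similarity=0.9):
    """Compare a feature vector with stored activity models and return the
    best-matching activity, or None when nothing is similar enough.

    `models` maps activity names to prototype feature vectors; both the
    names and the threshold are illustrative assumptions.
    """
    best_name, best_score = None, min_similarity
    for name, prototype in models.items():
        score = cosine_similarity(features, prototype)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

Returning None when no model is similar enough matches the decision "if a certain activity is taking place or not".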
- data coming from at least two sensors is used to be able to make a better decision.
- the data models are created and continuously updated with such features to have a better and more reliable activity detection in the future.
- the A/D processed data is thus cut into little pieces, the frames, in the feature extraction unit 6.
- when recognizing an activity, e.g. sound, only the frames with sufficient signal energy or signal pressure are used for such activity recognition.
- a set of frames where the signal energy is too low, which can indicate that there are no relevant sounds to be detected, will not be sent to the primary activity recognition unit 10.
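The energy-based selection of relevant frames can be sketched as follows; the mean-square energy measure and the threshold value are illustrative assumptions:

```python
def frame_energy(frame):
    """Mean-square signal energy of one frame."""
    return sum(x * x for x in frame) / len(frame)

def extract_relevant_frames(frames, threshold=0.01):
    """Forward only frames whose energy exceeds the threshold.

    Low-energy frames contain only sensor noise (e.g. silence for an
    acoustic sensor) and are dropped before they reach the primary
    activity recognition unit; the threshold is application-specific.
    """
    return [f for f in frames if frame_energy(f) > threshold]
```

Dropping the low-energy frames here is what keeps the number of processing steps, and hence the heat generated, in the recognition unit low.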
- the number of processing steps in the primary activity recognition unit 10 can be kept to a minimum. This is a big advantage, as the primary activity recognition unit 10 is part of the sensor system 1. In that way, heat generation due to the processing can be kept low.
- the primary activity recognition unit 10 can be seen as an artificial intelligence unit. Machine learning is applied to the data frames to recognize or detect the relevant activity.
- the sensor system 1 may furthermore comprise a memory for storing parameters of the relevant data and correlated primary activity, as was described above.
- the sensor system 1 may furthermore also comprise a training unit 12 for, from subsequent relevant data and correlated primary activities, updating the stored parameters for improved performance. Activities are then more efficiently and correctly recognized by comparing the A/D processed features with a database of features related to activities that is stored in the sensor system 1.
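One possible form of such a local parameter update is an exponential moving average that nudges a stored model toward newly confirmed feature vectors; the update rule and learning rate are assumptions, not taken from the text:

```python
def update_model(prototype, features, learning_rate=0.05):
    """Return the stored prototype moved slightly toward a new, correctly
    recognized feature vector (exponential moving average), so the model
    adapts to the specific location of the sensor system over time."""
    return [(1 - learning_rate) * p + learning_rate * f
            for p, f in zip(prototype, features)]
```

Because the update runs on the stored parameters themselves, the adaptation stays entirely inside the sensor system, consistent with the local-processing design.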
- the sensor system 1 may furthermore comprise a communication unit 13 for sending signals representative of the primary activity to a remote electric or electronic device.
- the communication unit 13 may be adapted for sending a notification to the remote device so as to notify a user of the recognized primary activity.
- a user may receive a notification on his/her smartphone or tablet from the sensor system 1 that, for example, it was detected that a water faucet is running.
- the communication unit 13 may be adapted for sending a signal to the remote electric or electronic device so as to start an action. For example, when it is detected that it is getting dark, a signal may be sent to a lighting device for being turned on and/or a signal may be sent to blinds for going down.
- a primary activity is recognized or detected for each of the at least two sensors S 1 , S 2 , ..., S n .
- sensor S 1 may be a temperature sensor, so the primary activity detected by the sensor system 1 may, for example, be that the temperature is increasing.
- the second sensor S 2 may, for example, be an environmental sensor and the primary activity detected by the sensor system 1 may be that gas concentration is increasing, ...
- each of the sensors S 1 , S 2 , ... , S n leads to another primary activity.
- a sensor system 1 according to embodiments of the invention may be adapted for using sensor fusion.
- the primary activity recognition unit 10 may, according to this embodiment, be adapted to, from the extracted features of data of different sensors, detect a primary activity. According to a further embodiment, a further step is taken in the hierarchical approach, as illustrated in Fig. 5 . According to this embodiment, the sensor system 1 may furthermore comprise a secondary activity recognition unit 14, as schematically illustrated in Fig. 6 .
- the secondary activity recognition unit 14 is adapted for, from a combination of each of the primary activities recognized by the primary activity recognition unit 10, determining a higher level secondary activity.
- by higher level secondary activity is meant an activity that can be derived from a combination of at least two primary activities.
- a higher level secondary activity is more complex than a primary, basic activity.
- a higher level secondary activity may, for example, be gas stove activity derived from primary activities such as increased temperature, sound (of gas), and gas concentration increase.
- Another example is presence prediction as a higher level secondary activity determined from primary activities such as CO 2 increase, increase of relative humidity and increase of temperature.
- a further example may be abnormal water consumption as a higher level secondary activity determined from primary activities such as water leakage.
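The combination of primary activities into secondary activities can be sketched as a small rule table; the activity names mirror the examples above, while the rule format itself is an illustrative assumption:

```python
# Each rule: a set of required primary activities -> a secondary activity.
# The names are taken from the examples in the text; the rule mechanism
# is an assumed sketch, not the patented implementation.
SECONDARY_RULES = [
    ({"temperature_increase", "gas_sound", "gas_concentration_increase"},
     "gas_stove_active"),
    ({"co2_increase", "humidity_increase", "temperature_increase"},
     "presence"),
]

def recognize_secondary_activities(primary_activities):
    """Return every secondary activity whose required primary activities
    have all been recognized."""
    detected = set(primary_activities)
    return [name for required, name in SECONDARY_RULES
            if required <= detected]
```

Because only the primary activity labels enter this step, it can run either locally or, as described below, in a gateway or the cloud without exposing raw sensor data.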
- the secondary activity recognition unit 14 may be part of the sensor system 1, or in other words, may be located in the sensor system 1 at the location of the sensor system 1 (see Fig.
- the secondary activity recognition unit 14 may be provided on a remote location, or in other words, is not part of the sensor system. According to these embodiments, the secondary activity recognition unit 14 may be located in a remote gateway or in the cloud (see Fig. 7 ). According to these embodiments, although information has to be sent over the internet, no crucial private information is sent over the internet to the cloud or to a remote gateway; only primary activities have to be sent.
- a sensor system 1 is still very secure and takes care of a user's privacy because processing of crucial, private data is all done locally in the sensor system 1 and does not have to be transferred over the internet to the cloud.
- the sensor system 1 may also comprise a memory 11.
- the memory 11 may be adapted to store parameters of relevant data and correlated primary and secondary activities.
- the sensor system 1 may also comprise a training unit 12 for, from subsequent relevant data and correlated primary and/or secondary activities, updating the stored parameters for improved performance.
- the sensor system 1 may furthermore comprise a communication unit for sending signals representative of the recognized primary and/or secondary activity to a remote electric or electronic device.
- the at least two sensors S 1 , S 2 , ..., S n may all be located within the sensor system 1. However, according to other embodiments, at least one of the at least two sensors S 1 , S 2 , ..., S n may be located outside the sensor system 1. For example, already existing sensors present in a home or building can be integrated so as to work with the sensor system 1. This means that the sensor system 1 can take into account input received from the "outside" sensor(s) to determine the primary and/or secondary activities. According to embodiments of the invention, the sensor system 1 may be a standalone system, which means that it can perfectly work on its own. According to other embodiments, the sensor system 1 may be part of an automation system.
Landscapes
- Health & Medical Sciences (AREA)
- Emergency Management (AREA)
- General Health & Medical Sciences (AREA)
- Gerontology & Geriatric Medicine (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Engineering & Computer Science (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Emergency Alarm Devices (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BE20195453A BE1027428B1 (nl) | 2019-07-14 | 2019-07-14 | Sensorsysteem voor activiteitherkenning |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3767602A1 true EP3767602A1 (fr) | 2021-01-20 |
Family
ID=68295882
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20185382.7A Pending EP3767602A1 (fr) | 2019-07-14 | 2020-07-10 | Système de capteur de reconnaissance d'activité |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3767602A1 (fr) |
BE (1) | BE1027428B1 (fr) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2264988A1 (fr) * | 2009-06-18 | 2010-12-22 | Deutsche Telekom AG | Procédé de détection de l'activité actuelle d'un utilisateur et contexte environnemental d'un utilisateur d'un téléphone mobile utilisant un capteur d'accélération et un microphone, produit de programme informatique, et téléphone mobile |
US20150100643A1 (en) * | 2013-06-28 | 2015-04-09 | Facebook, Inc. | User Activity Tracking System |
US20160314255A1 (en) * | 2015-04-21 | 2016-10-27 | Diane J. Cook | Environmental sensor-based cognitive assessment |
WO2018151003A1 (fr) * | 2017-02-14 | 2018-08-23 | パナソニックIpマネジメント株式会社 | Dispositif de communication, système de notification d'anomalie et procédé de notification d'anomalie |
US20180306609A1 (en) | 2017-04-24 | 2018-10-25 | Carnegie Mellon University | Virtual sensor system |
US20190059725A1 (en) * | 2016-03-22 | 2019-02-28 | Koninklijke Philips N.V. | Automated procedure-determination and decision-generation |
US20190103005A1 (en) | 2016-03-23 | 2019-04-04 | Thomson Licensing | Multi-resolution audio activity tracker based on acoustic scene recognition |
-
2019
- 2019-07-14 BE BE20195453A patent/BE1027428B1/nl active IP Right Grant
-
2020
- 2020-07-10 EP EP20185382.7A patent/EP3767602A1/fr active Pending
Also Published As
Publication number | Publication date |
---|---|
BE1027428A1 (nl) | 2021-02-05 |
BE1027428B1 (nl) | 2021-02-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018232922A1 (fr) | Procédé et système de sécurité domestique | |
EP3886066A2 (fr) | Centre d'appel de sonnette de porte | |
EP2353153B1 (fr) | Système pour suivre la présence de personnes dans un bâtiment, et procédé et produit de programme informatique | |
US10593174B1 (en) | Automatic setup mode after disconnect from a network | |
US20130100268A1 (en) | Emergency detection and response system and method | |
US20190103005A1 (en) | Multi-resolution audio activity tracker based on acoustic scene recognition | |
US11810437B2 (en) | Integrated security for multiple access control systems | |
WO2015184700A1 (fr) | Dispositif et procédé permettant une surveillance automatique et une réponse autonome | |
US20220215725A1 (en) | Integrated doorbell devices | |
WO2014139415A1 (fr) | Appareil et système anti-intrusion par les portes/fenêtres intelligents et système de commande d'accès intelligent | |
US12073698B1 (en) | Security device with user-configurable motion detection settings | |
CN106842356B (zh) | 一种室内有无人的检测方法及检测系统 | |
US11875571B2 (en) | Smart hearing assistance in monitored property | |
US11544924B1 (en) | Investigation system for finding lost objects | |
US12094249B2 (en) | Accessibility features for property monitoring systems utilizing impairment detection of a person | |
US10834363B1 (en) | Multi-channel sensing system with embedded processing | |
KR101182986B1 (ko) | 영상결합기를 이용한 보안감시시스템과 그 방법 | |
EP3767602A1 (fr) | Système de capteur de reconnaissance d'activité | |
US11550276B1 (en) | Activity classification based on multi-sensor input | |
CA2792621A1 (fr) | Methode et systeme de detection et de reponse aux urgences | |
Floeck et al. | Monitoring patterns of inactivity in the home with domotics networks | |
CN115499627A (zh) | 一种基于大数据的安全监控系统 | |
CN110965894B (zh) | 门窗启闭的控制方法、控制装置和控制系统 | |
CN114648831B (zh) | 一种智能门锁控制系统 | |
CN111311786A (zh) | 智能门锁系统及其智能门锁控制方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210714 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230104 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230526 |