WO2022136819A1 - A Method and Apparatus for Controlling Actions of a Monitoring System - Google Patents

A Method and Apparatus for Controlling Actions of a Monitoring System

Info

Publication number
WO2022136819A1
Authority
WO
WIPO (PCT)
Prior art keywords
scenario
sensor data
data
sensor
probability
Prior art date
Application number
PCT/GB2021/052473
Other languages
French (fr)
Inventor
Jason Souloglou
Original Assignee
Seechange Technologies Limited
Priority date
Filing date
Publication date
Application filed by Seechange Technologies Limited
Publication of WO2022136819A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/809 Fusion of classification results, e.g. where the classifiers operate on the same input data
    • G06V10/811 Fusion of classification results, the classifiers operating on different input data, e.g. multi-modal recognition
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/44 Event detection
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V20/47 Detecting features for summarising video content
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G08 — SIGNALLING
    • G08B — SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/16 Actuation by interference with mechanical vibrations in air or other fluid
    • G08B13/1654 Actuation using passive vibration detection systems
    • G08B13/1672 Actuation using sonic detecting means, e.g. a microphone operating in the audio frequency range
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation using passive radiation detection systems
    • G08B13/194 Actuation using image scanning and comparing systems
    • G08B13/196 Actuation using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B13/19639 Details of the system layout
    • G08B13/19647 Systems specially adapted for intrusion detection in or around a vehicle
    • G08B19/00 Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
    • G08B29/00 Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/18 Prevention or correction of operating errors
    • G08B29/185 Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
    • G08B29/186 Fuzzy logic; neural networks
    • G08B29/188 Data fusion; cooperative systems, e.g. voting among different detectors

Abstract

A computer implemented method of controlling actions of a monitoring system is disclosed. The method comprises: receiving first sensor data from a first sensor, the first sensor data representing a scenario involving one or more components; applying a scenario recognition model to the first sensor data to determine a first value indicative of a first probability that a given first scenario is represented by the first sensor data; and analysing the first sensor data to identify one or more of the components. The method comprises, responsive to a determination that the first value is above a first threshold: accessing data for information on the identified one or more components; and modifying the first value based on the results of the data access to determine a second value indicative of a second probability that the first scenario is represented by the first sensor data. The method comprises, responsive to a determination that the second value is above a second threshold, triggering a second action to be performed by the monitoring system. An apparatus is also disclosed.

Description

A METHOD AND APPARATUS FOR CONTROLLING ACTIONS OF A MONITORING SYSTEM
Technical Field
The present invention relates to a computer implemented method and apparatus for controlling actions of a monitoring system.
Background
Sensors are used to capture data about an environment. For example, a smoke detector may be used to provide data on a level of smoke in an environment. As another example, a camera may be used to capture images of an environment and provide image data.
Monitoring systems may collect and monitor data from one or more sensors. A scenario recognition model may be applied to the sensor data in order to recognise a given scenario represented by the sensor data. For example, smoke alarm data may indicate a high level of smoke in a house, and camera data may be analysed to indicate that there is a person in the house. The scenario recognition model may use these indicated events to determine a probability that a given scenario is occurring, for example the probability that the scenario of a person inside a burning house is occurring. Actions may be taken by the monitoring system based on the outcome of the scenario recognition, for example to automatically notify emergency services.
It is desirable to improve the accuracy of scenario recognition and/or the operational efficiency of monitoring systems.
Summary
According to a first aspect of the present disclosure, there is provided a computer implemented method of controlling actions of a monitoring system. The method comprises: receiving first sensor data from a first sensor, the first sensor data representing a scenario involving one or more components, the first sensor data having been captured as part of a first action by the monitoring system; applying a scenario recognition model to the first sensor data to determine a first value indicative of a first probability that a given first scenario is represented by the first sensor data; analysing the first sensor data to identify one or more of the components; responsive to a determination that the first value is above a first threshold: accessing data for information on the identified one or more components; and modifying the first value based on the results of the data access to determine a second value indicative of a second probability that the first scenario is represented by the first sensor data; and responsive to a determination that the second value is above a second threshold, triggering a second action to be performed by the monitoring system.
According to a second aspect of the present disclosure, there is provided an apparatus for controlling actions of a monitoring system, the apparatus being configured to perform the method according to the first aspect.
According to a third aspect of the present disclosure, there is provided a monitoring system comprising the apparatus according to the second aspect.
According to a fourth aspect of the present disclosure, there is provided a computer program comprising instructions which, when executed by a computer, cause the computer to perform the method according to the first aspect. The computer program may be stored on a computer readable medium.
Further features and advantages of the invention will become apparent from the following description of preferred embodiments of the invention, given by way of example only, which is made with reference to the accompanying drawings.
Brief Description of the Drawings
Figure 1 is a flow diagram that illustrates schematically a method according to an example;
Figure 2 is a schematic diagram that illustrates a monitoring system according to an example;
Figure 3 is a schematic diagram that illustrates functional elements of an apparatus according to an example; and
Figure 4 is a schematic diagram that illustrates an apparatus according to an example.
Detailed Description
Referring to Figure 1, there is illustrated an example method of controlling actions of a monitoring system.
Referring briefly to Figure 2, there is illustrated an example monitoring system 220 in which examples of the present invention may be implemented. In this example, the monitoring system 220 comprises a first sensor 224, a second sensor 226, and a network node 222. The first sensor 224 and the second sensor 226 are communicatively connected to the network node 222, and each provide sensor data to the network node 222. The network node 222 is, in turn, communicatively connected to a computer network 221, such as the internet 221. The network node 222 may be, for example, a server or server device. In some examples, the network node 222 may be or comprise a gateway or gateway device, i.e. a network node that functions as a gateway for communication between the first sensor 224 and/or the second sensor 226, and the computer network 221. As will be described in more detail below, example methods may be performed by a component of the monitoring system 220 itself, for example by network node 222, or by an entity external to the monitoring system 220 (not shown). In either case, in examples, the method may be performed by a processing unit of the network node 222 or entity.
Returning now to Figure 1, the method comprises, in step 102, receiving first sensor data from a first sensor 224. The first sensor data represents a scenario, i.e. a situation, involving one or more components. The first sensor data is data that has been captured as part of a first action by the monitoring system 220.
For example, the first sensor 224 may be a camera and the first sensor data may be a stream of images captured by the camera 224. The scenario (i.e. situation) represented by the images may involve a person and a car as components, for example. The first action by the monitoring system 220 may be to monitor images of an environment in which the camera is located, for example a parking lot.
The method comprises, in step 104, applying a scenario recognition model to the first sensor data to determine a first value indicative of a first probability that a given first scenario is represented by the first sensor data. For example, the given first scenario may be a person breaking into a car. In some examples, the first value may be the first probability that the given first scenario is represented by the first sensor data. Indeed, for ease of explanation, in examples described herein, reference is made to the scenario recognition model being applied to the first sensor data to determine the first probability that a given first scenario is represented by the first sensor data. However, it will be appreciated that in each of the examples described herein, the first value determined and/or output by the scenario recognition model need not necessarily be the first probability itself, for example in the sense of a number between 0 and 1, but may instead be a first value indicative of the first probability, for example a number on any predetermined scale of likelihood.
In some examples, the scenario recognition model may analyse the first sensor data to infer information about the scenario represented by the data, and use the inferred information to determine the probability that a given scenario is represented by the sensor data. In some examples, the first sensor data may itself be or comprise the inferred information, for example as determined by an analysis performed at the first sensor 224, and the scenario recognition model may use the inferred information to determine the probability that a given scenario is represented by the sensor data.
In some examples, the scenario recognition model may comprise a machine learning or artificial intelligence computing algorithm trained to map the inferred information onto a given scenario, and provide as output a probability that the given scenario is represented thereby. It will be appreciated that other algorithms or methods for determining the first probability are possible and may be used. For example, the scenario recognition model may comprise a statistical model based on historical data associated with the first scenario. The historical data may be stored in a cloud computing environment, for example in a static data lake stored in the cloud (not shown). A statistical model may calculate the likelihood of a particular outcome occurring given one or more input conditions, based on an analysis of the likelihood of that particular outcome occurring in the historical data given those one or more conditions. Accordingly, in some examples, the statistical model may take as input one or more conditions derived from the first sensor data, and output, based on the historical data, the first probability that the first scenario is represented by the first sensor data. As another example, the scenario recognition model may comprise a heuristic algorithm. For example, the heuristic algorithm may be based on one or more heuristics or rules relating to the first scenario. The heuristic algorithm may output the likelihood of a particular outcome occurring given one or more input conditions, based on an analysis of whether the one or more input conditions satisfy one or more heuristics associated with the first scenario. Accordingly, in some examples, the heuristic algorithm may take as input one or more conditions derived from the first sensor data, and output, based on one or more heuristics associated with the first scenario, the first probability that the first scenario is represented by the first sensor data.
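As an illustrative sketch only (not the patented implementation), such a heuristic algorithm might sum hand-written evidence weights for observed input conditions; the condition names and weights below are invented:

```python
# Hypothetical heuristic scenario scorer for a "car break-in" scenario:
# each observed condition contributes an invented evidence weight, and
# the combined score is clamped to the range [0, 1].

RULES = {
    "person_near_car": 0.02,      # weak evidence on its own
    "person_touching_car": 0.04,
    "window_broken": 0.60,        # strong evidence
    "night_time": 0.05,
}

def score_scenario(conditions):
    """Return a value indicative of the probability that the scenario
    is represented by the given set of observed conditions."""
    score = sum(RULES.get(c, 0.0) for c in conditions)
    return min(score, 1.0)

print(round(score_scenario({"person_touching_car", "night_time"}), 2))  # 0.09
```

In practice the weights would be tuned against historical data rather than hand-picked, but the structure (conditions in, scalar likelihood out) matches the heuristic model described above.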
For example, movement of an object may be detected based on changes in pixel values of images captured by a camera.
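A minimal sketch of such frame-subtraction motion detection, using NumPy with illustrative thresholds and synthetic greyscale frames:

```python
import numpy as np

# Motion detection by frame subtraction: pixels whose value changes by
# more than a threshold between consecutive frames are counted as
# "moving". The thresholds and frame sizes here are illustrative only.

def motion_detected(prev_frame, frame, pixel_threshold=25, min_changed=50):
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_threshold)
    return changed >= min_changed

prev_frame = np.zeros((120, 160), dtype=np.uint8)  # synthetic greyscale frame
frame = prev_frame.copy()
frame[10:30, 10:30] = 200                          # a bright "object" appears

print(motion_detected(prev_frame, frame))          # True
```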
In some examples, the inferred information may be one or more objects or events represented by the sensor data. For example, the scenario recognition model may analyse the first sensor data to recognise or infer one or more objects or events represented by the sensor data, and use the recognised objects or events to determine the probability that a given scenario is represented by the sensor data. The scenario recognition model may comprise a machine learning or artificial intelligence computing algorithm trained to map recognised objects and/or events onto a given scenario, and provide as output the probability that the given scenario is represented thereby.
In some examples, as mentioned above, the first sensor data may be a stream of images (i.e. digital images) captured by a camera. The scenario recognition model may acquire and analyse the images to infer information about the scenario represented by the image. For example, the scenario recognition model may detect features and objects within the images, for example by detecting lines, edges, ridges, corners, blobs, textures, shapes, gradients, regions, boundaries, surfaces, volumes, colours and shadings. Object recognition may be achieved, for example, by a process of comparing stored representations of objects of interest to features of the image representing the scenario, and applying a matching rule for determining a match. Such object recognition may utilise a data store of pre-specified objects when trying to identify an object represented by an image. For example, the scenario recognition model may group a set of image features as a candidate object in a given scenario and refer to the data store of pre-specified objects in order to identify the detected object. The data store, also known as a model-base, of pre-specified objects, also known as templates, may be associated with the scenario recognition model. The detected features and/or objects of the images may be used to infer information about the scenario represented by the images, such as spatial models of the scenario, lists of objects in the scenario, identifications of unique objects in the scenario, tracking of objects though a space, estimation of the motion of objects in the scenario, detection of events in the scenario, and recognition of gestures. This inferred information may be used by the scenario recognition model to determine the probability that a given scenario is represented by the images.
In some examples, the first sensor may be another type of sensor, for example a sound sensor, brightness sensor, odour sensor, temperature sensor, humidity sensor, proximity sensor, fitness tracker, Passive Infra-Red sensor, and/or a motion detector. For example, the scenario recognition model may analyse the sound data from a sound sensor (e.g. microphone) to infer information about the scenario represented by the data, for example the occurrence of a gun-shot or glass breaking, and use the inferred information to determine the probability that a given scenario is represented by the sensor data.
In any case, the scenario recognition model determines a first value indicative of a first probability that a first scenario is represented by the first sensor data. The method comprises, in step 106, determining whether the first value is above a first threshold. Responsive to a determination that the first value is below the first threshold (i.e. ‘N’ in Figure 1), the method returns to the steps 102 and 104. However, responsive to a determination that the first value is above the first threshold (i.e. ‘Y’ in Figure 1), the method moves to steps 108 to 114, as described in more detail below.
For example, the first value may be the first probability that the first scenario is represented by the first sensor data. The first scenario may be a car being broken into. The first threshold may be 0.05 (i.e. 5%). The first sensor data may be a stream of images from a camera located in a parking lot. The scenario recognition model may analyse the images to detect that a person is in the vicinity of a car. The scenario recognition model may determine from this that the first probability that the scenario of a car being broken into is represented by the image data is 0.02 (i.e. 2%). In this case, since the first probability is below the first threshold, no further action is taken and the scenario recognition model continues to receive images and determine the first probability. However, the scenario recognition model may analyse the images to detect that a person is touching a car. The scenario recognition model may determine from this that the first probability that the scenario of a car being broken into is represented by the image data is 0.06 (i.e. 6%). In this case, since the first probability is above the first threshold, further processing is performed in the form of at least steps 108 to 114, as described in more detail below.
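The two-stage thresholding flow of steps 102 to 116 can be sketched as follows; the function parameters, threshold values and action label are hypothetical stand-ins for the model and actions described in the text, not the patented implementation:

```python
# Hypothetical sketch of the control flow in Figure 1. The callables
# recognise, identify_components, lookup and modify stand in for the
# scenario recognition model (step 104), component identification
# (step 108), data access (step 110) and value modification (step 112).

FIRST_THRESHOLD = 0.05
SECOND_THRESHOLD = 0.08

def process(sensor_data, recognise, identify_components, lookup, modify):
    first_value = recognise(sensor_data)            # step 104
    if first_value <= FIRST_THRESHOLD:              # step 106
        return None                                 # keep monitoring
    components = identify_components(sensor_data)   # step 108
    info = lookup(components)                       # step 110
    second_value = modify(first_value, info)        # step 112
    if second_value > SECOND_THRESHOLD:             # step 114
        return "trigger_second_action"              # step 116
    return None

# Usage with stub callables reproducing the worked example in the text:
result = process(
    "images",
    recognise=lambda data: 0.06,
    identify_components=lambda data: ["person"],
    lookup=lambda components: {"criminal_record": True},
    modify=lambda p, info: 0.10 if info["criminal_record"] else 0.02,
)
print(result)  # trigger_second_action
```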
The method comprises, in step 108, analysing the first sensor data to identify one or more of the components of the scenario represented by the first sensor data.
In some examples, such as that illustrated in Figure 1, analysing the first sensor data to identify one or more of the components is responsive to a determination that the first value is above the first threshold. However, analysing the first sensor data to identify one or more of the components may not be responsive to a determination that the first value is above a first threshold. For example, the scenario recognition model applied in step 104 may identify one or more of the components of the scenario represented by the first data. In these examples, the analysis of the first sensor data to identify one or more of the components need not necessarily be responsive to a determination that the first value is above the first threshold.
In some examples, analysing the first sensor data to identify one or more of the components comprises determining an identifier of a component recognised by the scenario recognition model. For example, the first sensor data may be images from a camera. The scenario recognition model may detect or recognise that a component of the scenario represented by the first data is a person. Analysing the first sensor data to identify the person may comprise applying facial recognition to the image to identify the person, for example the name of the person. As another example, the scenario recognition model may detect or recognise that a component of the scenario represented by the first data is a car. Analysing the first sensor data to identify the car may comprise applying number plate recognition to the image to identify the car, for example the registration number of the car. As another example, the first sensor data may be sound data. The scenario recognition model may detect or recognise that a component of the scenario represented by the sound data is a person speaking. Analysing the sound data to identify the person may comprise applying speaker recognition to the sound data to identify the person, for example the name of the person.
The method comprises, in step 110, accessing data for information on the identified one or more components. Accessing data for information on the identified one or more components is responsive to a determination that the first value is above the first threshold.
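The per-component identification of step 108 (facial recognition for a person, number-plate recognition for a car) can be sketched as a dispatch from component type to recogniser; the recognisers here are hard-coded stubs and the returned identifiers are invented:

```python
# Hypothetical dispatch from a recognised component type to an
# identification routine. Real systems would call facial recognition,
# number-plate recognition or speaker recognition here; these stubs
# simply return invented identifiers.

def identify_person(sensor_data):
    return "Jane Doe"       # stand-in for facial/speaker recognition

def identify_car(sensor_data):
    return "AB12 CDE"       # stand-in for number-plate recognition

IDENTIFIERS = {"person": identify_person, "car": identify_car}

def identify(component_type, sensor_data):
    recogniser = IDENTIFIERS.get(component_type)
    return recogniser(sensor_data) if recogniser else None

print(identify("car", b"raw-image-bytes"))  # AB12 CDE
```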
In some examples, the data that is accessed may be stored in a database, and accessing data for information on the identified one or more components may comprise performing a look-up in the database for information on the identified one or more components. Indeed, for ease of explanation, in examples described herein, reference is made to performing a look-up in a database for information on the identified one or more components. However, it will be appreciated that in each of these examples the data need not necessarily be stored in a database or accessed by performing a look-up, and may instead be stored in other forms and/or accessed in other ways in order to obtain information on the identified one or more components.
In some examples, the information on an identified component may be or comprise historical information associated with the identified component. For example, the information may relate to previous actions associated with the component. For example, where the identified component is a person, the information may be or comprise a criminal record of the person. As another example, where the identified component is a car, the information may be or comprise a log of previously observed or recognised suspicious activity around the car, such as people loitering near the car.
In some examples, the database in which the look-up is performed may be a relational database that relates component identifiers to respective component information. In some examples, performing the look-up in a database for information on an identified component may comprise using the determined identifier of the component to query the database. In some examples, the method may comprise determining the database in which to perform the look-up and/or may comprise determining a field of the database from which to extract information. For example, if the identified component is a person, the method may comprise determining that a criminal record database is to be queried using the identifier (e.g. name) of the person to extract criminal record information associated with the person.
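Such a relational look-up can be sketched with an in-memory SQLite database; the table name, schema and records are invented for illustration:

```python
import sqlite3

# Illustrative relational look-up: a hypothetical criminal-record table
# keyed on a person identifier, queried with the identifier determined
# in step 108. A parameterised query avoids SQL injection.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE criminal_records (person_id TEXT, offence TEXT)")
conn.execute("INSERT INTO criminal_records VALUES ('Jane Doe', 'vehicle theft')")

def lookup_record(person_id):
    rows = conn.execute(
        "SELECT offence FROM criminal_records WHERE person_id = ?",
        (person_id,),
    ).fetchall()
    return [offence for (offence,) in rows]

print(lookup_record("Jane Doe"))   # ['vehicle theft']
print(lookup_record("John Roe"))   # []
```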
In some examples, the database in which the look-up is performed may be external to the monitoring system 220. For example, the database may be communicatively connected to the monitoring system 220 via the network 221 such as the internet. Performing the look-up in the database may comprise transmitting a request over the network 221 for a look-up to be performed in a database for information on the identified one or more components, and receiving in response to the request the requested information on the identified one or more components.
In some examples, analysing the first sensor data to identify one or more of the components comprises analysing the first sensor data to identify at least two of the components, and the look-up in the database for information on the identified one or more components comprises a look-up in the database for information on a relationship between the at least two components. For example, a first component may be a person (e.g. identified by name), and a second component may be a car (e.g. identified by registration number). For example, a look-up in a car ownership registration database may be performed to obtain information on whether the identified person is or is not the registered owner or user of the identified car. As another example, a database may store information on a list of authorised users of a car, and a look-up in this database may be performed to obtain information on whether the identified person is or is not the registered owner or an authorised user of the identified car.
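A look-up on the relationship between two identified components, such as whether an identified person is an authorised user of an identified car, might be sketched as follows; the schema and data are again invented:

```python
import sqlite3

# Hypothetical relationship look-up between two identified components:
# the person (identified by name) and the car (identified by its
# registration number) are joined through an authorised-users table.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE authorised_users (registration TEXT, person TEXT)")
conn.execute("INSERT INTO authorised_users VALUES ('AB12 CDE', 'Jane Doe')")

def is_authorised(person, registration):
    row = conn.execute(
        "SELECT 1 FROM authorised_users WHERE registration = ? AND person = ?",
        (registration, person),
    ).fetchone()
    return row is not None

print(is_authorised("Jane Doe", "AB12 CDE"))   # True
print(is_authorised("John Roe", "AB12 CDE"))   # False
```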
The method comprises, in step 112, modifying the first value based on the results of the data access to determine a second value indicative of a second probability that the first scenario is represented by the first sensor data.
In some examples, the first value may be the first probability that the first scenario is represented by the first data, and the second value may be the second probability that the first scenario is represented by the first sensor data. Indeed, for ease of explanation, in examples described herein, reference is made to modifying the first probability based on the results of the data access to determine the second probability. However, it will be appreciated that in each of the examples described herein, the first value that is modified need not necessarily be the first probability itself but may instead be a first value indicative of the first probability, and/or the second value that is determined need not necessarily be the second probability itself but may instead be a second value indicative of the second probability.
In some examples, modifying the first probability may comprise adding data representing the results of the data access (i.e. data, if any, representing the information on the identified one or more components) as an input to the scenario recognition model.
For example, as mentioned above, the scenario recognition model may comprise a machine learning or artificial intelligence computing algorithm trained to map features inferred from the first sensor data onto a given scenario, and provide as output a probability that the given scenario is represented thereby. In some examples, the machine learning or artificial intelligence computing algorithm may be trained to map features inferred from the first sensor data and data representing the accessed information on the one or more components onto a given scenario, and provide as output a probability that the given scenario is represented thereby. In these cases, modifying the first probability may comprise providing, in addition to the features inferred from the first sensor data, the data representing the accessed information as input to the trained machine learning or artificial intelligence computing algorithm, and running the algorithm again to determine an updated, second, probability that the first scenario is represented by the first sensor data.
Other algorithms or methods for modifying the first probability are possible and may be used. For example, rule-based algorithms may be used. For example, an algorithm may take as input the first probability and the results of the data access (i.e. the data representing the information on the identified one or more components) and may output the second probability. This algorithm may, for example, be based on one or more rules defining how the first probability is to be modified based on the results of the data access. For example, one such rule may specify that, if the results of the data access indicate that the identified person has a criminal record for breaking into cars, the first probability is to be multiplied by 1.2. The use of a rule-based algorithm may be relatively computationally inexpensive and the results provided thereby tractable. On the other hand, the use of a trained machine learning or artificial intelligence computing algorithm may be more adaptive to different inputs and may provide for more accurate results.
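Such a rule-based modification may be sketched as follows. The 1.2 multiplier for a relevant criminal record is taken from the text; the owner-of-the-car rule and its 0.1 multiplier are illustrative assumptions.

```python
def modify_probability_rule_based(first_probability, lookup_results):
    """Rule-based modification of the first probability using the
    results of the data access. Rules and multipliers other than the
    1.2 example are illustrative, not prescribed."""
    p = first_probability
    if lookup_results.get("relevant_criminal_record"):
        p *= 1.2   # rule from the text: a relevant record raises the probability
    if lookup_results.get("is_owner"):
        p *= 0.1   # assumed rule: an owner touching their own car lowers it
    return min(p, 1.0)  # keep the result a valid probability

# 6% first probability, look-up finds a relevant criminal record:
p2 = modify_probability_rule_based(0.06, {"relevant_criminal_record": True})
# p2 is approximately 0.072 (0.06 x 1.2)
```

Being a handful of explicit rules, such an algorithm is cheap to evaluate and its output is straightforward to trace back to the inputs, which is the tractability advantage noted above.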
The method comprises, in step 114, determining whether the second value indicative of the second probability is greater than a second threshold. Responsive to a determination that the second value is less than the second threshold (i.e. ‘N’ in Figure 1), no further action may be taken by the monitoring system 220, and the method may return to step 102 in which first sensor data is received in order that the scenario recognition model may be applied to it. Responsive to a determination that the second value is higher than the second threshold, the method comprises, in step 116, triggering a second action to be performed by the monitoring system 220. For example, the second action may be a second monitoring action by the monitoring system 220, such as triggering the second sensor 226 of the monitoring system 220 to capture second sensor data representing the scenario.
As an example, the first value may be the first probability that the first scenario is represented by the first sensor data, and the second value may be the second probability that the first scenario is represented by the first sensor data. The first scenario may be a car being broken into. The scenario recognition model may analyse images from a first sensor to determine that a person is touching a car, that the first probability that the first scenario is represented by the first sensor data is 6%, and that this is higher than the first threshold. As a result, the first sensor data may be analysed to identify the person touching the car, and a look-up in a criminal record database for information on the criminal record of the identified person may be performed. Based on this information (i.e. based on the results of the look-up), the first probability is modified to determine the second probability. For example, if the results of the look-up indicate that the identified person has no criminal record or no relevant criminal record, the first probability may be reduced to determine the second probability of 2%. If the results of the look-up indicate that the identified person has a relevant criminal record, the first probability may be increased to determine the second probability of 10%.
As another example, which may be an alternative or in addition to the above example, as a result of the first probability being above the first threshold, the first sensor data may be analysed to identify the car and the person touching the car, and a look-up in a car ownership database may be performed for information on whether the identified person is the owner of the identified car. Based on this information (i.e. based on the results of the look-up), the first probability is modified to determine the second probability. For example, if the results of the look-up indicate that the identified person is the owner of the identified car, the first probability may be modified to determine a second probability, e.g. 0.5%, that is lower than the first probability. On the other hand, if the results of the look-up indicate that the identified person is not the owner of the car, the first probability may be increased to determine the second probability of 12%, for example.
In any case, responsive to the second value being higher than the second threshold, a second action to be performed by the monitoring system 220 is triggered. For example, the second value may be the second probability, and the second threshold may be 8%. Therefore, responsive to the second probability being higher than this (e.g. 10% or 12%), a second action is triggered to be performed by the monitoring system 220. For example, a second monitoring action, such as capture of sensor data by the second sensor 226, may be triggered.
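The end-to-end flow of this worked example, from the first threshold through the look-up to the triggering decision, may be sketched as follows. The threshold values (5% and 8%) and the callback structure are illustrative assumptions; the look-up and modification steps are passed in as callables so the sketch stays agnostic about the database and rule set used.

```python
def run_monitoring_step(first_probability, lookup, modify,
                        first_threshold=0.05, second_threshold=0.08):
    """Sketch of the decision flow: gate the data access on the first
    threshold, then gate the second action on the second threshold.
    Threshold values are illustrative, not taken from the text."""
    if first_probability <= first_threshold:
        return "no_action", first_probability
    results = lookup()                              # e.g. ownership look-up
    second_probability = modify(first_probability, results)
    if second_probability > second_threshold:
        return "trigger_second_action", second_probability
    return "no_action", second_probability

# Worked example from the text: 6% first probability, a look-up showing
# the person is not the owner, giving a modified second probability of 12%.
action, p2 = run_monitoring_step(
    0.06,
    lookup=lambda: {"is_owner": False},
    modify=lambda p, r: 0.12 if not r["is_owner"] else 0.005,
)
assert action == "trigger_second_action" and p2 == 0.12
```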
Determining the second value indicative of the second probability that the first scenario is represented by the first sensor data, by modifying the first value indicative of the first probability based on the results of data access for information on identified components of the scenario represented by the first sensor data, may provide for an informed scenario recognition and hence for the second value to be accurate and/or reliable. Providing for a more informed scenario recognition may, in turn, allow for a more informed and hence accurate determination of the action the monitoring system 220 is to take. This may, in turn, allow for a more appropriate action to be taken by the monitoring system 220. Triggering actions of the monitoring system 220, such as further monitoring action, based on an accurate and/or reliable second value, may allow for the operational efficiency of the monitoring system 220 to be improved. For example, this may be as compared to performing the first and second actions at all times, as opposed to triggering the second action only in response to the second value being higher than the second threshold; or, for example, as compared to triggering the second action based on the first value determined by the scenario recognition model without the benefit of the results of the data access for information on identified components of the scenario represented by the first sensor data. For example, the improved operational efficiency may take the form of a reduced energy consumption and/or storage burden that may be associated with performing the second action.
As mentioned above, in some examples, the results of the data access may be information on a relationship between at least two of the identified components of the scenario (e.g. whether an identified person is an authorised user of an identified car). Modifying the first value based on such relationship information may provide for a more informed and hence accurate scenario recognition and/or determination of the second value. A more accurate scenario recognition may, in turn, allow for a more informed and hence accurate determination of the action the monitoring system 220 is to take. Triggering actions of the monitoring system 220 based on an accurate and/or reliable second value, may allow for the operational efficiency of the monitoring system 220 to be improved.
In some examples, triggering the second action may comprise triggering the monitoring system 220 to perform a more active recognition of the first scenario, in which further analysis is performed to recognise from the first sensor data, or to recognise from data of further sensors triggered to operate, further features or evidence expected to be associated with the first scenario. For example, in the example first scenario of a car being broken into, the active recognition may comprise the triggering of the second sensor 226 to capture sound data and analysing the sound data for the sound of broken glass, which may be a feature expected to be associated with the first scenario of a car being broken into, as described in more detail below. As another example, the active recognition may comprise the application of an additional analysis to the first sensor data, such as intent recognition, to determine whether a recognised intent of the person is to enter the car, which may be a feature expected to be associated with the first scenario of a car being broken into, as described in more detail below. The active recognition may allow for the recognition of a given scenario more quickly and/or more accurately, for example as compared to without the more active recognition being applied. This may, in turn, allow for a more informed determination of an action that the monitoring system is to take, and hence for a more appropriate action to be taken by the monitoring system 220. Similarly to as described above, triggering the more active recognition of the first scenario only when needed may improve the operational efficiency of the monitoring system 220.
In some examples, the second action, such as a second monitoring action, may be triggered to be performed concurrently with the first action. For example, this may allow for a monitoring level of the monitoring system 220 to be dynamically adjusted in response to the second probability being higher than the second threshold. This may allow for certain monitoring actions to be triggered to be performed only when needed, for example as opposed to running continuously. This may improve the operational efficiency of the monitoring system 220.
As mentioned above, the first sensor data is captured by the first sensor 224 as part of the first action by the monitoring system 220. Triggering the second monitoring action may comprise triggering the second sensor 226 of the monitoring system 220 to capture second sensor data representing the scenario. Triggering the second sensor 226 to start capturing second data only when needed may reduce the operational costs of the second sensor, for example as compared to running the second sensor continuously.
In some examples, the second sensor 226 and second sensor data may be of a different modality to the first sensor 224 and the first sensor data.
For example, the first sensor 224 may be a camera and the first sensor data may be images representing the scenario, and the second sensor 226 may be a sound sensor and the second action may be to capture sound data using the sound sensor. For example, in the example first scenario of a car being broken into, the sound data may represent the sound of glass being broken, which may be used to further inform the probability that the first scenario is represented by the first data.
As another example, triggering the second monitoring action may comprise triggering an application of additional analysis to the first sensor data to determine one or more further characteristics of the scenario represented by the first data. For example, the scenario recognition model may have analysis modules that can be dynamically applied to the first sensor data. For example, the additional analysis may comprise applying intent recognition to the first sensor data to infer an intent of a person represented by the first sensor data, for example based on an analysis of their actions. For example, it may be determined that the actions of the person correspond to a repeated pulling on a handle of the car, and an intent recognition module may be applied to infer that the intent of the person is to enter the car. This may be used to further inform the probability that the first scenario is represented by the first data. Triggering the additional analysis to be applied to the first sensor data only when needed may reduce the operational costs of the scenario recognition, for example as compared to running the additional analysis continuously.
In some examples, triggering the application of the additional analysis may comprise triggering the loading, from a repository, of an additional analysis module to perform the additional analysis. For example, there may be a repository of analysis modules, i.e. code scripts, that the scenario recognition model may load and execute. For example, the repository may be located within the monitoring system 220. This may help reduce the processing load and/or operational memory used by the scenario recognition model. As another example, the repository may be located remotely of the monitoring system 220, and triggering the application of the additional analysis may comprise triggering the loading of the additional analysis module from the remote repository. For example, triggering the loading may comprise transmitting a request for the additional analysis module to the repository, for example over the computer network 221, and receiving the additional analysis module from the repository. This may help reduce the storage space required to be provided at the network node 222. This may be particularly beneficial, for example, where the network node 222 is an edge device such as a gateway for one or more of the sensors 224, and/or is implemented within a sensor 224, 226 itself, which device or sensor may have limited storage capacity.
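On-demand loading of an analysis module from a repository, with caching so a module is only fetched and held when needed, may be sketched as follows. The repository contents, the module name and the intent-recognition behaviour are hypothetical; in the remote-repository case, the commented-out fetch would instead retrieve the script over the computer network 221.

```python
# Hypothetical repository of analysis modules; here plain callables
# stand in for the code scripts described in the text.
ANALYSIS_REPOSITORY = {
    "intent_recognition": lambda actions: (
        {"intent": "enter_car"} if "repeated_handle_pull" in actions
        else {"intent": None}
    ),
}

_loaded = {}  # cache: modules are fetched and held only on first use

def load_analysis_module(name):
    """Return a cached analysis module, loading it from the repository
    only when first requested, to limit processing load and memory."""
    if name not in _loaded:
        # For a remote repository: request the module over the network here.
        _loaded[name] = ANALYSIS_REPOSITORY[name]
    return _loaded[name]

# Apply the dynamically loaded intent-recognition module to inferred actions:
intent = load_analysis_module("intent_recognition")(["repeated_handle_pull"])
assert intent["intent"] == "enter_car"
```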
In some examples, the method may further comprise modifying the second value based on data from the triggered second monitoring action to determine a third value indicative of a third probability that the first scenario is represented by the first sensor data or by the first data and the second data; and responsive to a determination that the third value is above a third threshold, triggering a third action to be performed by the monitoring system 220.
In some examples, the second value may be the second probability that the first scenario is represented by the first sensor data, and/or the third value may be the third probability that the first scenario is represented by the first sensor data or by the first data and the second data. Indeed, for ease of explanation, in examples described herein, reference is made to the second probability being modified to determine the third probability. However, it will be appreciated that in each of the examples described herein, the second value that is modified need not necessarily be the second probability itself and may instead be a value indicative of the second probability, and/or the determined third value need not necessarily be the third probability itself and may instead be a value indicative of the third probability. In some examples, the data from the triggered second monitoring action may be the second sensor data from the second sensor 226 and/or for example the one or more characteristics determined from the additional analysis applied to the first sensor data.
In some examples, the machine learning or artificial intelligence computing algorithm may be trained to map features inferred from the first sensor data, features inferred from the data from the triggered second monitoring action, and/or data representing the accessed information on the one or more components, onto a given scenario, and provide as output a probability that the given scenario is represented thereby. In these cases, modifying the second probability may comprise providing, in addition to the features inferred from the first sensor data and/or the data representing the looked-up information, features inferred from the data from the triggered second monitoring action as input to the trained machine learning or artificial intelligence computing algorithm, and running the algorithm again to determine an updated, third, probability that the first scenario is represented by the first sensor data or by the first data and the second data.
Similarly to as mentioned above, other algorithms or methods for modifying the second probability are possible and may be used. For example, rule-based algorithms may be used. For example, an algorithm may take as input the second probability and features inferred from the second sensor data, and may output the third probability. For example, if inferred features of the second sensor data indicate that the sound of glass breaking has been captured and/or that an inferred intent of the person is to enter the car, then the second probability may be increased by a predefined multiplier, for example.
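A rule-based update of the second probability from features of the second monitoring action may be sketched as follows. The predefined multiplier value and the feature names are illustrative assumptions; the glass-breaking sound and the inferred enter-car intent are the example features named in the text.

```python
def modify_second_probability(second_probability, second_features,
                              multiplier=1.5):
    """Rule-based determination of the third probability: if features
    inferred from the second monitoring action support the scenario,
    scale the second probability by a predefined multiplier
    (1.5 here is an illustrative value, not from the text)."""
    if (second_features.get("glass_breaking_sound")
            or second_features.get("intent") == "enter_car"):
        return min(second_probability * multiplier, 1.0)
    return second_probability

# 12% second probability, and the second sensor captured breaking glass:
p3 = modify_second_probability(0.12, {"glass_breaking_sound": True})
# p3 is approximately 0.18 (0.12 x 1.5)
```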
Determining the third value indicative of the third probability that the first scenario is represented by the first sensor data, by modifying the second value indicative of the second probability based on data from the second monitoring action, may provide for a yet further informed scenario recognition and hence for the third value to be yet more accurate and/or reliable. Triggering a third action of the monitoring system 220 based on a yet more accurate and/or reliable third value, may allow for the operational efficiency of the monitoring system 220 to be further improved.
For example, the triggered third action may be an additional monitoring action, for example a monitoring action similar to those described above. Similarly to as described above, triggering the additional monitoring action only when needed may make more efficient use of monitoring resources, for example as compared to performing the monitoring actions at all times.
As another example, triggering the third action may comprise triggering a recording of the first sensor data and/or the second sensor data in a storage medium. For example, recording of the first sensor data and/or second sensor data in a storage medium may be used as evidence of the first scenario occurring. Triggering the recording of the first sensor data and/or the second sensor data in response to the third value being higher than the third threshold may provide that the first sensor data and/or second sensor data is only recorded when needed, which may reduce the storage burden associated with recording all of the first sensor data and/or second sensor data.
As another example, triggering the third action may comprise triggering the issuance of an alert or notification indicating that the first scenario is likely to be occurring or about to occur. For example, the method may comprise determining, based on the second value and/or the modified second value (i.e. the third value), that the first scenario is occurring or is about to occur; and responsive to the determination that the first scenario is occurring or about to occur, issuing an alert that the first scenario is occurring or is about to occur. For example, the alert or notification may be issued to a system or device over the network 221. For example, the alert or notification may be issued to a security system to implement one or more security measures, such as disabling a car, in response to receiving the alert or notification. Triggering the issuance of an alert or notification in response to the third probability being higher than the third threshold may provide that the alert or notification is only issued when needed, which may reduce the communication burden of the monitoring system 220 associated with issuing alerts and/or the burden of taking action based on an issued alert.
In some examples, triggering the second action or triggering the third action may comprise triggering a decrease in a privacy level implemented by the monitoring system 220. For example, the decrease in privacy level may provide for one or more additional functions to be performed by the monitoring system 220 which otherwise would not have been permitted to be performed for privacy policy reasons. For example, the additional functions may comprise one or more of recording of the first sensor data in a storage medium, triggering of additional sensors to capture additional sensor data, and/or notifying of authorities of the likelihood that the first scenario is occurring or is about to occur. For example, one or more of the additional functions may not be permitted under a given privacy level associated with a privacy policy normally implemented by the monitoring system 220, but triggering of a lowering of the privacy level in response to the second probability that the first sensor data represents the first scenario (e.g. a crime) being higher than the second threshold (and/or in response to the third probability being higher than the third threshold) may invoke a different privacy policy under which the additional functions are permitted. For example, the privacy policies may be in accordance with the General Data Protection Regulation (GDPR), which applies constraints to the type of data that can be captured and/or recorded in different situations.
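Such threshold-driven privacy-level gating may be sketched as follows. The level names, the sets of permitted functions and the threshold values are illustrative assumptions; the point of the sketch is that each level maps to an explicit set of permitted functions, and crossing a threshold lowers the level in force.

```python
# Hypothetical privacy policies: each level permits a set of functions.
PRIVACY_POLICIES = {
    "normal":   {"analyse_live"},
    "elevated": {"analyse_live", "extra_sensors"},
    "incident": {"analyse_live", "extra_sensors",
                 "record_data", "notify_authorities"},
}

def privacy_level(second_probability, third_probability,
                  second_threshold=0.08, third_threshold=0.15):
    """Select the privacy level in force; thresholds are illustrative.
    A higher scenario probability lowers the privacy level, permitting
    functions otherwise barred for privacy policy reasons."""
    if third_probability is not None and third_probability > third_threshold:
        return "incident"
    if second_probability > second_threshold:
        return "elevated"
    return "normal"

def is_permitted(function, level):
    return function in PRIVACY_POLICIES[level]

level = privacy_level(0.12, None)
assert level == "elevated"
assert is_permitted("extra_sensors", level)
assert not is_permitted("record_data", level)  # still barred at this level
```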
It will be appreciated that the triggered second action need not necessarily be performed concurrently with the first action. For example, the triggered second action may replace the first action. For example the triggered second action may replace the first action for a period of time, for example a predetermined period of time, after which the first action will resume. Similarly, the triggered third action may replace the triggered second action and/or the first action.
It will be appreciated that the triggered second action of the monitoring system 220 need not necessarily be a monitoring action. Other examples of the triggered second action of the monitoring system 220 may include recording the first sensor data in a storage medium, and/or issuing an alert or notification indicating that the first scenario is likely to be occurring or about to occur. For example, recording of the first sensor data in a storage medium may be used as evidence of the first scenario occurring. Triggering the recording of the first sensor data in response to the second probability being higher than the second threshold may provide that the first sensor data is only recorded when needed, which may reduce the storage burden associated with recording all of the first sensor data. As another example, the alert or notification may be issued to a security system to implement one or more security measures, such as disabling a car. Triggering the issuance of an alert or notification in response to the second probability being higher than the second threshold may provide that an alert is only issued when needed, which may reduce the communication burden of the monitoring system 220 associated with issuing alerts and/or the burden of taking action based on an issued alert.
Example implementations of the method given above make reference to components of a scenario represented by the first sensor data being a car and a person in a parking lot, the first scenario being a person breaking into a car, the results of the data access being information on the criminal record of the person and/or the ownership status of the car with respect to the person, and/or the triggered second action comprising triggering a sound sensor to listen for the sound of breaking glass or triggering additional intent recognition analysis to be performed on the first sensor data to determine if an intent of the person is to enter the car, for example.
However, many additional example implementations of the method exist. Three such further example implementations are provided below.
In a first further example implementation, the first scenario may be a person drowning. The first sensor 224 may be a camera located at a beach, the first sensor data may be a series of images of a swimming area of the beach, and the first sensor data may represent a scenario involving the swimming area and a person swimming in the swimming area as components.
The scenario recognition model may be applied to the images to determine a first probability that the given first scenario of a person drowning is represented by the images. For example, object and activity recognition may be applied to the images to determine that the images include the features of a person in the sea who is waving, with no other people or objects nearby the person. These features may be used, for example as input to a trained machine learning algorithm, to determine that the probability that the first sensor data represents the first scenario of a person drowning is higher than a first threshold.
The first sensor data may be analysed to identify the swimming area or beach. For example, the first sensor data may include data on the location of the camera, the name of the beach at which the camera is located, and/or an indication of the particular swimming area of which the camera is configured to capture images. As another example, feature recognition may be applied to the images to determine the identity, for example the name, of the beach at which the swimming area is located. Responsive to a determination that the first probability is above the first threshold, a look-up in a database for information on the identified component is performed. For example, the name or identifier of the beach or swimming area may be used to perform a look-up in a database logging historical drownings or difficulties at different beaches. The results of the lookup may be information indicating that the particular beach that is a component of the scenario represented by the first data is associated with a relatively large number of historical drownings or difficulties.
The first probability is modified based on the results of the look-up to determine a second probability that the first scenario is represented by the first sensor data. For example, the information indicating that the identified beach is associated with a relatively large number of historical drownings may be used as an additional input to the trained machine learning algorithm, and the algorithm run again to determine a second probability (which in this example would be higher than the first probability) that the first scenario (i.e. a person is drowning) is represented by the first data.
Responsive to a determination that the second probability is above a second threshold, a second action is triggered to be performed by the monitoring system 220. For example, similarly to as described above, an active, or a more active, recognition of the ‘person drowning’ scenario may be triggered. For example, additional analysis may be applied to the first sensor data to determine further features, for example whether there is another person or boat approaching the person, or whether the person is disappearing under the water every so often. As another example, further sensors may be triggered to operate, such as a microphone, and analysis may be applied to the sensor data from the further sensors, such as an analysis of whether sound data captured by the microphone represents distressed shouting or screaming for help. As another example, further information may be obtained from one or more sources, such as information on the current sea condition at the beach, wave size or under-current strength at the beach, and/or information on the current weather conditions at the beach. This further information may be obtained from sensors or from other data sources such as weather and sea state services. In an example, the additional data obtained from the more active recognition may be used to modify the second probability, similarly to as described above, to determine a third probability. Responsive to a determination that the second probability is higher than the second threshold, or responsive to a determination that the third probability is higher than a third threshold, an alert may be issued to the emergency services indicating that the scenario of ‘a person drowning’ is occurring or is about to occur at the beach, for example.
In a second further example implementation, the first scenario may be ‘a child in danger in a public changing rooms’. The first sensor 224 may be a camera located at the entrance to, but outside of, the public changing rooms, the first sensor data may be a series of images of the entrance, and the first sensor data may represent a scenario involving people entering the public changing rooms.
The scenario recognition model may be applied to the series of images to determine a first probability that the given first scenario is represented by the series of images. For example, the public changing rooms may be known to be empty. Object recognition may be applied to the images to determine the features of a child entering the changing room alone and a few minutes later an adult entering the changing room. These features may be used, for example as input to a trained machine learning algorithm, to determine that the probability that the first sensor data represents the first scenario of ‘a child in danger in a public changing rooms’ is higher than a first threshold.
The first sensor data may be analysed to identify the adult. For example, facial recognition of the adult may be applied to the first sensor data to determine an identity, for example a name, of the adult.
Responsive to a determination that the first probability is above the first threshold, a look-up in a database for information on the identified component is performed. For example, the name of the adult may be used to perform a look-up in a criminal record database. The results of the lookup may be information indicating that the identified adult has a relevant criminal record.
The first probability is modified based on the results of the look-up to determine a second probability that the first scenario is represented by the first sensor data. For example, the information indicating that the adult has a relevant criminal record may be used as an additional input to the trained machine learning algorithm, and the algorithm run again to determine a second probability (which in this example would be higher than the first probability) that the first scenario (i.e. ‘a child is in danger in a public changing rooms’) is represented by the first data.

Responsive to a determination that the second probability is above a second threshold, a second action is triggered to be performed by the monitoring system 220. For example, similarly to as described above, an active, or a more active, recognition of the ‘child is in danger in a public changing rooms’ scenario may be triggered. For example, an additional camera and microphone inside the changing rooms may be triggered to operate and an analysis applied to the data from these sensors to determine further features, for example an inappropriate proximity of the adult and child, or concerning sounds. In an example, the additional data obtained from the more active recognition may be used to modify the second probability, similarly to as described above, to determine a third probability. Responsive to a determination that the third probability is higher than a third threshold, a third action of the monitoring system 220 may be triggered, such as recording the sensor data from the additional camera and microphone inside the changing rooms to a storage medium. In this example, there is a decrease in the privacy level implemented by the monitoring system 220 both in response to the second probability being higher than the second threshold (e.g. monitoring inside of a changing room as opposed to just outside the changing room) and again in response to the third probability being higher than the third threshold (e.g. recording the sensor data from inside the changing room in a storage medium as opposed to just applying analysis to it).
In a third further example implementation, the first scenario may be ‘a shopper not finding a stocked product’. The first sensor 224 may be a camera located in a shop, the first sensor data may be a series of images of an aisle of the shop, and the first sensor data may represent a scenario involving a person looking for a product in the aisle of the shop.
The scenario recognition model may be applied to the series of images to determine a first probability that the given first scenario is represented by the series of images. For example, object recognition may be applied to the images to determine that a person is looking at or interacting with a portion of an aisle in which a product is missing. These features may be used, for example as input to a trained machine learning algorithm, to determine that the probability that the first sensor data represents the first scenario of ‘a shopper not finding a stocked product’ is higher than a first threshold. The first sensor data may be analysed to identify the product which is missing. For example, the portion of the aisle that is empty may be identified, and the product that is missing may be inferred from the identified portion of the aisle. As another example, text recognition may be applied to a portion of the first image showing the shelf label, and the missing product may be identified from the shelf label.
Responsive to a determination that the first probability is above the first threshold, a look-up in a database for information on the identified component is performed. For example, the name of the missing product may be used to perform a look-up in a stock database of the shop. The results of the look-up may be information indicating that the missing product is in stock but is located in a store room and has not yet been delivered to the aisle.
The first probability is modified based on the results of the look-up to determine a second probability that the first scenario is represented by the first sensor data. For example, the information indicating that the missing product is in fact in stock may be used as an additional input to the trained machine learning algorithm, and the algorithm run again to determine a second probability (which in this example would be higher than the first probability) that the first scenario (i.e. ‘a shopper not finding a stocked product’) is represented by the first data.
Responsive to a determination that the second probability is above a second threshold, a second action is triggered to be performed by the monitoring system 220. For example, one or more additional store cameras may be triggered to track the person through the shop, for example to determine when the shopper is nearing the exit of the shop. For example, when it is determined that the shopper is nearing the exit of the shop, an alert may be issued for a store assistant to bring the identified missing product to the shopper.
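The shopper example above follows the same identify / look-up / modify pattern, and can be sketched as follows. The product names, the stand-in stock database, and the additive uplift are assumptions for illustration; the disclosure instead describes re-running a trained model with the look-up result as an additional input.

```python
# Hypothetical sketch: the identified missing product keys a look-up in a
# stock database, and a result showing the product in stock but not on the
# shelf raises the probability of the 'shopper not finding a stocked
# product' scenario. Database contents and update rule are assumptions.

STOCK_DB = {"cereal": {"in_stock": True, "location": "store room"}}

def modify_probability(first_probability, product):
    """Return a second probability after the stock-database look-up."""
    record = STOCK_DB.get(product)
    if record and record["in_stock"] and record["location"] != "aisle":
        return min(1.0, first_probability + 0.2)  # assumed uplift
    return first_probability  # look-up found nothing relevant

p2 = modify_probability(0.65, "cereal")  # higher than the first probability
```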
Referring to Figure 3, there is a schematic diagram showing functional components of an example network node 222 in which examples of the method described with reference to Figures 1 and 2 may be implemented. As illustrated in Figure 3, the example network node 222 comprises the following functional components: a first interface 331, a second interface 332, a third interface 358, a scenario processing engine 330, an event data store 346, a scenario data store 348, a probability generator 341, an intent data store 344, an intent builder 342, a streaming context data store 340, a streaming context builder 338, a static context data store 336, a static context builder 334, a user interface 350, a subscribed scenario data store 352, and an alert store 356.
The scenario processing engine 330, the event data store 346, the scenario data store 348, and the probability generator 341 are bounded by box A in Figure 3 and represent functional components implementing the scenario recognition model described above with reference to Figures 1 and 2, according to an example.
The first interface 331 is configured to receive the first sensor data from a first sensor (e.g. the first sensor 224 in Figure 2) and pass the first sensor data to the scenario processing engine 330.
The scenario processing engine 330 is configured to analyse the first sensor data to infer information about the scenario represented by the data. For example, the scenario processing engine 330 may analyse the first sensor data to recognise or infer one or more objects or events or other features represented by the first sensor data. The scenario processing engine 330 is configured to output data indicating the recognised or inferred features for logging in the event data store 346 and/or the scenario data store 348. For example, when the scenario processing engine 330 recognises in or infers from the first sensor data an event (e.g. there is a person and a car in the image), information relating to the event is output to the event data store 346. As another example, when the scenario processing engine 330 recognises in or infers from the first sensor data one or more other features relevant to the first scenario (e.g. the person is in physical contact with the car) information relating to the feature of the scenario is output to the scenario data store 348.
In some examples, the events or other features that the scenario processing engine 330 is configured to recognise or infer from the first sensor data are defined with reference to the subscribed scenario data store 352. For example, a user may specify via the user interface 350 one or more first scenarios, including features or events of the or each first scenario to be recognised, and this may be stored in the subscribed scenario data store 352. The scenario processing engine 330 may then refer to the subscribed scenario data store 352 to determine the given first scenario and/or the analysis to apply to the first sensor data.

The probability generator 341 is configured to access the event data store 346 and/or the scenario data store 348, and retrieve the information inferred from the first sensor data by the scenario processing engine 330. The probability generator 341 is configured to use the retrieved information to determine the probability that the given first scenario is represented by the first sensor data. For example, the probability generator 341 may implement a machine learning or artificial intelligence computing algorithm trained to map the retrieved information onto the given first scenario, and provide as output a first probability that the given first scenario is represented thereby. Other algorithms may be used to determine the first probability.
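The mapping performed by the probability generator 341 can be sketched as a simple weighted model. The feature names, weights and logistic squashing below are illustrative assumptions: the disclosure leaves the algorithm open (a trained machine learning model, or other algorithms).

```python
import math

# Minimal sketch of a probability generator mapping recognised events and
# features (retrieved from the event and scenario data stores) onto a
# first probability for a given first scenario. Weights are hypothetical.

FEATURE_WEIGHTS = {
    "person_present": 1.0,       # event, e.g. logged in the event data store
    "car_present": 1.0,          # event, e.g. logged in the event data store
    "person_touching_car": 2.5,  # feature, e.g. logged in the scenario data store
}

def first_probability(features, bias=-3.0):
    """Combine retrieved features into a probability in [0, 1]."""
    score = bias + sum(FEATURE_WEIGHTS.get(f, 0.0) for f in features)
    return 1.0 / (1.0 + math.exp(-score))  # logistic squash

p1 = first_probability({"person_present", "car_present", "person_touching_car"})
```

More evidence relevant to the scenario pushes the score, and hence the probability, upwards; the same function can be re-run with extra inputs to produce the second and third probabilities.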
The scenario processing engine 330 is also configured to analyse the first sensor data to identify one or more components of the scenario represented by the first sensor data. For example, the scenario processing engine 330 may be configured to apply facial recognition to a person to identify the person (e.g. the name of the person), or apply number plate recognition to a car to identify the car (e.g. the registration number of the car). The identity of the one or more components may be output to the scenario data store 348, for example.
The probability generator 341 is configured to compare the first probability to a first threshold. For example, the first threshold may be set by a user via the user interface 350. The probability generator 341 is configured to, responsive to a determination that the first probability is higher than the first threshold, perform a lookup in a database for information on the identified one or more components; and modify the first probability based on the results of the look-up to determine a second probability that the first scenario is represented by the first sensor data.
For example, in response to the first probability being greater than the first threshold, the probability generator 341 may activate the static context builder 334. For example, the probability generator 341 may pass to the static context builder 334 the identity of one or more components of the scenario represented by the first data. For example, the probability generator 341 may retrieve the identity (e.g. name) of a recognised person in the first sensor data from the scenario data store 348. The probability generator 341 may pass the identity to the static context builder 334. The static context builder 334 is configured to use the identity (e.g. name) to perform a look-up in a database (e.g. a criminal record database, not shown in Figure 3) for information (e.g. a criminal record) on the identified one or more components (e.g. the recognised person) via the second interface 332. For example, the database (not shown in Figure 3) may be part of the internet (e.g. the network 221 in Figure 2). The static context builder 334 is configured to analyse this information to determine one or more properties of the information (e.g. that the person does have a relevant criminal record). The static context builder 334 stores the information and/or the one or more determined properties in the static context data store 336. The probability generator 341 is configured to access the static context data store 336 and retrieve the looked-up information or properties derived from the looked-up information. The probability generator 341 is configured to use the retrieved additional information to modify the first probability to determine a second probability that the first scenario is represented by the first sensor data. For example, the retrieved additional information may be used as an additional input to the trained machine learning algorithm, and the algorithm run again to determine the second probability.
As another example, in response to the first probability being greater than the first threshold, the probability generator 341 may activate the streaming context builder 338. For example, the probability generator 341 may pass to the streaming context builder 338 the identity of one or more components of the scenario represented by the first data. For example, the probability generator 341 may retrieve the identity (e.g. name) of a recognised location in the first sensor data from the scenario data store 348. The probability generator 341 may pass the identity to the streaming context builder 338. The streaming context builder 338 is configured to use the identity (e.g. name of the location) to perform a look-up in a database (e.g. a weather database) for information on (e.g. the current weather at) the identified one or more components (e.g. the identified location) via the second interface 332. The database (not shown) may be part of the internet (e.g. the network 221 in Figure 2). The streaming context builder 338 is configured to analyse this information to determine one or more properties of the information (e.g. that the weather is poor). The streaming context builder 338 stores the information and/or the one or more determined properties in the streaming context data store 340. The probability generator 341 is configured to access the streaming context data store 340 and retrieve the looked-up information or properties derived from the looked-up information. The probability generator 341 is configured to use the retrieved additional information to modify the first probability to determine a second probability that the first scenario is represented by the first sensor data. For example, the probability generator 341 may be configured to use the retrieved additional information as an additional input to the trained machine learning algorithm, and run the algorithm again to determine the second probability.
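The context-builder flow (identity in, looked-up property out, probability modified) can be sketched as follows. The database contents and the additive update rule are hypothetical; the disclosure instead re-runs a trained model with the extra input, and a streaming context builder would follow the same shape with a live source (e.g. current weather) in place of the static record.

```python
# Hypothetical sketch of the static context builder flow: an identity
# recognised in the first sensor data keys a database look-up, and a
# property derived from the result modifies the first probability.
# All names, contents and the update rule are illustrative assumptions.

CRIMINAL_RECORD_DB = {"J. Doe": {"relevant_record": True}}  # stand-in database

def build_static_context(identity):
    """Look up the identified component and derive a property of the result."""
    record = CRIMINAL_RECORD_DB.get(identity, {})
    return {"has_relevant_record": record.get("relevant_record", False)}

def second_probability(first_probability, context):
    """Modify the first probability using the looked-up context."""
    delta = 0.25 if context["has_relevant_record"] else -0.1  # assumed effect
    return max(0.0, min(1.0, first_probability + delta))

context = build_static_context("J. Doe")
p2 = second_probability(0.6, context)  # higher than the first probability
```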
The probability generator 341 compares the second probability to a second threshold. For example, the second threshold may be set by a user via the user interface 350. The probability generator 341 is configured to, responsive to a determination that the second probability is higher than the second threshold, trigger a second action to be performed by the monitoring system 220.
For example, responsive to a determination that the second probability is higher than the second threshold, the probability generator 341 may transmit an activation signal, via the third interface 358, to a second sensor (e.g. the second sensor 226 in Figure 2) to capture sensor data and provide it to the scenario processing engine 330.
As another example, responsive to a determination that the second probability is higher than the second threshold, the probability generator 341 may transmit a notification or alert, via the third interface 358, to an external entity (not shown in Figure 3), such as an emergency service or security system. The nature, content and/or recipient of the alert may be derived from the alert store 356, which in turn may be set by a user using the user interface 350.
As another example, responsive to a determination that the second probability is higher than the second threshold, the probability generator 341 may activate the intent builder 342 and intent data store 344. The intent builder 342 requests the scenario processing engine 330 to provide recognised or inferred features or movements of a person represented by the first sensor data. The intent builder 342 uses these features to determine an intent of the person (for example, repeated pulling on a handle of a car may be used to infer an intent of the person to enter the car). The intent builder 342 stores data relating to the determined intent in the intent data store 344. The probability generator 341 is configured to access the intent data store 344, retrieve the stored intent data, and use the retrieved intent data to modify the second probability to determine a third probability that the first scenario is represented by the first sensor data. For example, the retrieved additional information may be used as an additional input to the trained machine learning algorithm, and the algorithm run again to determine the third probability.
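The intent builder's aggregation of repeated observations into an inferred intent can be sketched as a simple rule table. The event names, the repetition threshold and the rule itself are assumptions for illustration, not the disclosed implementation.

```python
from collections import Counter

# Hypothetical sketch of the intent builder: repeated occurrences of the
# same recognised movement (e.g. pulling on a car door handle) are
# aggregated into an inferred intent via an assumed rule table.

INTENT_RULES = {("pull_door_handle", 3): "enter_car"}  # (event, min count) -> intent

def infer_intent(observed_events):
    """Return an inferred intent, or None if no rule is satisfied."""
    counts = Counter(observed_events)
    for (event, min_count), intent in INTENT_RULES.items():
        if counts[event] >= min_count:
            return intent
    return None

intent = infer_intent(["pull_door_handle"] * 4 + ["look_around"])  # "enter_car"
```

The resulting intent data is the kind of additional input that could then feed the next probability update.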
In some examples, the network node 222 is configured to implement the functionality of any one or combination of the examples described above with reference to Figures 1 to 3.
Referring to Figure 4, there is illustrated an apparatus 460 for controlling actions of a monitoring system, for example the monitoring system 220 according to any of the examples described above with reference to Figures 1 to 3. The apparatus 460 comprises a processing unit 462, a memory 464, an input interface 468 and an output interface 470. In some examples, the apparatus 460 is configured to perform the method according to any of the examples described above with reference to Figures 1 to 2, and/or the functional blocks according to any of the examples described above with reference to Figure 3. In some examples, the memory 464 stores a computer program comprising instructions which, when executed by the processing unit 462, cause the processing unit 462 to perform the method according to any of the examples described above with reference to Figures 1 to 2, and/or the functional blocks according to any of the examples described above with reference to Figure 3.
In some examples, the apparatus 460 is embodied by a computer. In some examples, the apparatus 460 is embodied by a network device (e.g. the network node 222 of Figure 2). For example, the apparatus 460 may be embodied by a server 222 positioned between the sensors 224, 226 and a computer network 221 such as the internet. In some examples, the apparatus 460 is embodied by a gateway or gateway device, i.e. a network device that functions as a gateway for communication between one or more of the sensors 224, 226, and the computer network 221. In some examples, the apparatus 460 is part of a sensor, such as the first sensor 224. In these cases, the monitoring system 220 may comprise or in some cases consist of the sensor, such as the first sensor 224. The abovementioned improvement in operational efficiency of the monitoring system 220 provided by examples described herein may be particularly useful in cases where the apparatus 460 is embodied by a device with limited processing power and/or memory, such as a sensor device or a sensor gateway.
As mentioned, in some examples, the apparatus 460 may be part of a sensor 224 or a first sensor gateway (not shown) associated with the first sensor 224. The first sensor 224 or first sensor gateway may trigger further actions within the first sensor 224 or first sensor gateway. Alternatively or additionally, the first sensor 224 or first sensor gateway may trigger actions by a second sensor 226 or a second sensor gateway associated with the second sensor 226. For example, this triggering may occur over a local network connecting the first sensor 224 and second sensor 226. This may avoid the need for communication over the internet or for coordination by a centralised server, for example. Such an architecture may help reduce latency and/or communication overheads associated with the monitoring system 220.
In some examples, the apparatus 460 is embodied by a component of the monitoring system 220. However, it will be appreciated that this need not necessarily be the case, and that the apparatus 460 may be embodied by an entity or device external to the monitoring system 220 and in communication with the monitoring system 220. For example, the apparatus 460 may be located in the computer network 221 and be configured to control the monitoring system 220, such as to trigger the second action of the monitoring system 220.
The above examples are to be understood as illustrative examples of the invention. It is to be understood that any feature described in relation to any one example may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the examples, or any combination of any other of the examples. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the invention, which is defined in the accompanying claims.

Claims

1. A computer implemented method of controlling actions of a monitoring system, the method comprising: receiving first sensor data from a first sensor, the first sensor data representing a scenario involving one or more components, the first sensor data having been captured as part of a first action by the monitoring system; applying a scenario recognition model to the first sensor data to determine a first value indicative of a first probability that a given first scenario is represented by the first sensor data; analysing the first sensor data to identify one or more of the components; responsive to a determination that the first value is above a first threshold: accessing data for information on the identified one or more components; and modifying the first value based on the results of the data access to determine a second value indicative of a second probability that the first scenario is represented by the first sensor data; and responsive to a determination that the second value is above a second threshold, triggering a second action to be performed by the monitoring system.
2. The computer implemented method according to claim 1, wherein the second action is triggered to be performed concurrently with the first action.
3. The computer implemented method according to claim 1 or claim 2, wherein triggering the second action comprises triggering a second monitoring action to be performed by the monitoring system.
4. The computer implemented method according to claim 3, wherein triggering the second monitoring action comprises triggering one or more of: a second sensor of the monitoring system to capture second sensor data; and an application of additional analysis to the first sensor data to determine one or more characteristics of the scenario represented by the first sensor data.
5. The computer implemented method according to claim 4, wherein the second sensor and the second sensor data are of a different modality to the first sensor and the first sensor data.
6. The computer implemented method according to claim 4 or claim 5, wherein triggering the application of the additional analysis comprises triggering the loading, from a repository, of an analysis module to perform the additional analysis.
7. The computer implemented method according to any one of claim 3 to claim 6, wherein the method further comprises: modifying the second value based on data from the triggered second monitoring action to determine a third value indicative of a third probability that the first scenario is represented by the first sensor data or by the first data and the second data; and responsive to a determination that the third value is above a third threshold, triggering a third action in the monitoring system.
8. The computer implemented method according to claim 7, wherein triggering the third action comprises triggering a recording of the first sensor data and/or the second sensor data in a storage medium.
9. The computer implemented method according to any one of claim 1 to claim 8, wherein triggering the second action comprises triggering a decrease in a privacy level implemented by the monitoring system.
10. The computer implemented method according to any one of claim 1 to claim 9, wherein the analysis of the first sensor data to identify the one or more components is performed in response to the determination that the first value is above the first threshold.
11. The computer implemented method according to any one of claim 1 to claim 10, wherein the first sensor is a camera and the first sensor data comprises images captured by the camera.
12. The computer implemented method according to claim 11, wherein a component of the scenario represented by the first sensor data is a person, and the analysis of the first sensor data to identify one or more of the components comprises applying facial recognition to identify the person.
13. The computer implemented method according to any one of claim 1 to claim 12, wherein accessing data for information on the identified one or more components comprises accessing data for information relating to previous actions associated with the identified one or more components.
14. The computer implemented method according to any one of claim 1 to claim 13, wherein analysing the first sensor data to identify one or more of the components comprises analysing the first sensor data to identify at least two of the components, and accessing data for information on the identified one or more components comprises accessing data for information on a relationship between the at least two components.
15. The computer implemented method according to any one of claim 1 to claim 14, further comprising: determining, based on the second value or modified second value, that the first scenario is occurring or is about to occur; and responsive to the determination that the first scenario is occurring or about to occur, issuing an alert that the first scenario is occurring or is about to occur.
16. The computer implemented method according to any one of claim 1 to claim 15, wherein the method is performed by the first sensor or a sensor gateway associated with the first sensor.
17. The computer implemented method according to any one of claim 1 to claim 16, wherein accessing data for information on the identified one or more components comprises performing a look-up in a database for information on the identified one or more components; and modifying the first value based on the results of the data access comprises modifying the first value based on the results of the look-up.
18. The computer implemented method according to any one of claim 1 to claim 17, wherein the first value is the first probability that a given first scenario is represented by the first sensor data; and the second value is the second probability that the first scenario is represented by the first sensor data.
19. Apparatus for controlling actions of a monitoring system, the apparatus being configured to perform the method according to any one of claim 1 to claim 18.
20. The apparatus according to claim 19, wherein the apparatus is part of the first sensor or a sensor gateway associated with the first sensor.
21. A monitoring system comprising the apparatus according to claim 19 or claim 20, and the first sensor.
22. A computer program comprising instructions which, when executed by a computer, cause the computer to perform the method according to any one of claim 1 to claim 18.
PCT/GB2021/052473 2020-12-23 2021-09-23 A Method and Apparatus for Controlling Actions of a Monitoring System WO2022136819A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2020535.7 2020-12-23
GB2020535.7A GB2602790A (en) 2020-12-23 2020-12-23 A method and apparatus for controlling actions of a monitoring system

Publications (1)

Publication Number Publication Date
WO2022136819A1 true WO2022136819A1 (en) 2022-06-30

Family

ID=74221488

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2021/052473 WO2022136819A1 (en) 2020-12-23 2021-09-23 A Method and Apparatus for Controlling Actions of a Monitoring System

Country Status (2)

Country Link
GB (1) GB2602790A (en)
WO (1) WO2022136819A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060209072A1 (en) * 2005-03-21 2006-09-21 Marc Jairam Image-based vehicle occupant classification system
US20200221054A1 (en) * 2013-03-15 2020-07-09 James Carey Video identification and analytical recognition system

Also Published As

Publication number Publication date
GB2602790A (en) 2022-07-20
GB202020535D0 (en) 2021-02-03

Similar Documents

Publication Publication Date Title
US11735018B2 (en) Security system with face recognition
CN110933955B (en) Improved generation of alarm events based on detection of objects from camera images
CN108600202B (en) Information processing method and device and computer readable storage medium
US20200013273A1 (en) Event entity monitoring network and method
US11461441B2 (en) Machine learning-based anomaly detection for human presence verification
JPWO2007138811A1 (en) Suspicious behavior detection apparatus and method, program, and recording medium
CN114218992B (en) Abnormal object detection method and related device
US11688220B2 (en) Multiple-factor recognition and validation for security systems
US10650651B1 (en) Automated geospatial security event grouping
KR101979375B1 (en) Method of predicting object behavior of surveillance video
CN112211496B (en) Monitoring method and system based on intelligent door lock and intelligent door lock
WO2022136819A1 (en) A Method and Apparatus for Controlling Actions of a Monitoring System
US20240046702A1 (en) Deep learning-based abnormal behavior detection system and method using anonymized data
Nishanthini et al. Smart Video Surveillance system and alert with image capturing using android smart phones
KR20220000209A (en) Recording medium that records the operation program of the intelligent security monitoring device based on deep learning distributed processing
US20230005360A1 (en) Systems and methods for automatically detecting and responding to a security event using a machine learning inference-controlled security device
Nandhini et al. IoT Based Smart Home Security System with Face Recognition and Weapon Detection Using Computer Vision
KR102635351B1 (en) Crime prevention system
CN117173847B (en) Intelligent door and window anti-theft alarm system and working method thereof
US20230360402A1 (en) Video-based public safety incident prediction system and method therefor
WO2023084814A1 (en) Communication system, server, communication method, and communication program
Hernandez et al. Community-Based Multi-layer Analytics Architecture for Civic Violations
CN117315818A (en) Intelligent door lock alarm control method and device and electronic equipment
KR20220031258A (en) A method for providing active security control service based on learning data corresponding to counseling event
KR20220031316A (en) A recording medium in which an active security control service provision program is recorded

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21782796

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21782796

Country of ref document: EP

Kind code of ref document: A1