GB2578746A - Monitoring system - Google Patents

Monitoring system

Info

Publication number
GB2578746A
GB2578746A
Authority
GB
United Kingdom
Prior art keywords
edge device
edge
information
sensor
central
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1818096.8A
Other versions
GB201818096D0 (en)
Inventor
Macdonald Harris Andrew
Basil Harrold William
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telensa Holdings Ltd
Original Assignee
Telensa Holdings Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telensa Holdings Ltd filed Critical Telensa Holdings Ltd
Priority to GB1818096.8A priority Critical patent/GB2578746A/en
Publication of GB201818096D0 publication Critical patent/GB201818096D0/en
Priority to GB1916083.7A priority patent/GB2580495A/en
Publication of GB2578746A publication Critical patent/GB2578746A/en
Withdrawn legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/014Alarm signalling to a central station with two-way communication, e.g. with signalling back
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125Traffic data processing
    • G08G1/0129Traffic data processing for creating historical data or processing based on historical data
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0137Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/052Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/056Detecting movement of traffic to be counted or controlled with provision for distinguishing direction of travel

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

A monitoring system comprises at least one monitoring (edge) device (20,22,24,26,28) and a central device 40. Each monitoring device is provided at a location within a coverage area, and comprises at least one sensor generating sensor data. The sensor may be one of a camera, a RADAR/LIDAR sensor, environmental/pollution sensor, an occupancy/presence sensor, an RFID detector, a radio detector or an acoustic/sound sensor. Each monitoring device processes the sensor data to obtain attribute information (e.g. position, identity, condition etc) about an entity in the vicinity of the monitoring device. The monitoring device transmits the attribute information to the central device. The central device combines received attribute information (e.g. from multiple monitoring devices) to obtain information about the attribute of the entity within the coverage area. The system is able to monitor attributes of entities within an area and send them to a central device for collation/reporting purposes. The system may be deployed in a variety of applications including traffic/pedestrian monitoring (figure 1), wind farm/turbine monitoring and supervision of autonomous robots in a warehouse.

Description

Intellectual Property Office Application No. GB1818096.8 RTM Date: 26 April 2019. The following terms are registered trade marks and should be read as such wherever they occur in this document: Bluetooth, Wi-Fi. Intellectual Property Office is an operating name of the Patent Office. www.gov.uk/ipo
MONITORING SYSTEM
This relates to a monitoring system, and in particular to a system for monitoring an environment and entities within it, including, but not limited to, physical objects.
Systems have been described, in which multiple roadside sensors are deployed, in order to obtain information about an environment. For example, the roadside sensors may be cameras, and the system may obtain information about the movement of vehicles within the coverage area of the cameras.
However, when there are a large number of cameras and the image data is analysed at a central site, transmitting all of the image data to the central site requires a significant backhaul infrastructure to carry the data. Moreover, storage of all the image data at a central site may present a greater opportunity for unauthorised access to that data, whereas access can be more easily controlled if the data is stored across multiple roadside installations. One possible prior art solution is to perform the image analysis at the camera sites and return only compressed and selected images, reducing the backhaul demands; however, the camera data is then lost, and is therefore unavailable for later analysis to look for specific features that were not recognised as significant when the image analysis was programmed.
According to an aspect of the invention, there is provided a system comprising: at least one edge device; and a central device, wherein the or each edge device is provided at a respective location within a coverage area, wherein the or each edge device comprises at least one sensor generating sensor data, wherein the or each edge device is configured to manipulate the sensor data from the or each sensor to obtain information about one or more attribute of an entity in the environment in a vicinity of the edge device, wherein the or each edge device is configured to transmit said information about the attribute of the entity to the central device, and wherein the central device is configured to receive said information about the attribute of the entity from the or each respective edge device, and to combine the received information to obtain information about the attribute of the entity within said coverage area.
This has the advantage that only the information about the attribute of the entity need be transmitted to the central device, allowing the central device to obtain the relevant information without a large data transmission requirement.
In some embodiments, the or each edge device comprises a memory, configured for storing the sensor data generated by the at least one sensor, in an encrypted form.
This allows the original sensor data to be retained for possible future use.
In some embodiments, the central device is configured to send search queries to one or more edge device, requesting the one or more edge device to perform a search on the stored sensor data. This allows searches to be performed on sensor data, even when the nature of the search was not defined at the time that the data was gathered.
In some embodiments, the central device is configured to send manipulation requests to one or more edge device, requesting the one or more edge device to perform a specific manipulation on the sensor data. This allows the system to change in response to new requirements.
In some embodiments, the central device is configured to send data requests to one or more edge device, requesting the one or more edge device to transmit some or all of its stored sensor data to the central device. This allows the central device to retrieve sensor data as required.
In some embodiments, the system comprises a plurality of edge devices. This allows the central device to gather information from multiple sources, for example over a wider coverage area.
In some embodiments, at least one edge device is configured to communicate directly with at least one other edge device. This allows the system to react to situations with minimal delay. Specifically, the at least one edge device may be configured to transmit the information about the attributes of entities directly to the at least one other edge device.
In some embodiments, the or each edge device comprises a plurality of sensors generating respective sensor data. This allows the edge device to generate information about one or more attribute using the sensor data from the plurality of sensors.
According to another aspect of the invention, there is provided an edge device, having the edge device features set out above.
According to another aspect of the invention, there is provided a central device, having the central device features set out above.
BRIEF DESCRIPTION OF DRAWINGS
For a better understanding of the invention, and to show how it may be put into effect, reference will now be made, by way of example, to the accompanying drawings, in which:
Figure 1 shows a use of a part of a system in accordance with the disclosure;
Figure 2 shows a part of the system;
Figure 3 illustrates a form of an edge device in accordance with one aspect;
Figure 4 illustrates a form of a central device in accordance with one aspect; and
Figure 5 illustrates communication between two edge devices.
DETAILED DESCRIPTION
Figure 1 shows a possible deployment of a system in accordance with the present disclosure.
Specifically, Figure 1 shows an arrangement of streets 10, 12, 14, 16, 18 in an urban environment. For monitoring the environment, edge devices 20, 22, 24, 26, 28, 30 are provided in the illustrated area. The edge devices are provided at respective locations in an overall coverage area of the system. In some embodiments, the edge devices are provided at respective fixed locations. In other embodiments, some or all of the edge devices may be mounted on mobile platforms. For example, one or more edge device may be mounted on a vehicle.
The number of edge devices can be any convenient number for providing the intended coverage. In a typical urban environment, the number of edge devices may be in the tens, hundreds, or thousands.
In this illustrated example, some or all of the edge devices 20, 22, 24, 26, 28, 30 may be mounted on existing street furniture, for example street lamps or associated poles, brackets or similar, in order to avoid the need for providing additional separate mounting points.
Although the system will be described with reference to its use in monitoring an exterior urban environment, it may be used in interior environments, or in exterior environments away from streets etc.

Figure 2 shows the functional relationships in the system, in which the edge devices, namely the edge devices 20, 22, 24, 26, 28 in Figure 2, have a functional connection to a central device 40.
As described in more detail below, each edge device has one or more sensor, which generates sensor data. The sensor data is processed, in a way which compresses it, to form information about entities within the environment. This compressed information can then be sent to the central device 40.
The edge devices and the central device are described in more detail below.
Thus, Figure 3 shows one of the edge devices 20 by way of illustration, though the other edge devices in the system may be similar.
The edge device 20 includes one or more sensor 50. For example, the sensor(s) 50 may include, but are not limited to: cameras; radar or lidar detectors; environmental sensors; pollution sensors; occupancy or presence detectors; RFID detectors; radio detectors; and acoustic sensors.
Each sensor 50 is configured to generate sensor data. For example:
    • A camera may provide image data and/or video data of an overall scene in the area local to the camera, or may provide image data of a limited area or of specific features or objects in that area; its field of view may be fixed, or it may be steerable and/or zoomable under control of the edge device 20, the central device 40, or another device or user.
    • Radar or lidar detectors may provide data relating to the presence and/or movement of individual objects or assemblages of objects within their coverage area, or to the general level of occupancy of, or movement within, part or all of their coverage area.
    • Environmental sensors may provide data descriptive of weather conditions (for example temperature, wind, cloud cover, rainfall), light level, frost risk (for example by sensing road or other surface temperatures in the vicinity of the sensor), humidity or barometric pressure.
    • Pollution sensors may provide data descriptive of air quality (for example particulate levels, or levels of gases such as CO, CO2 or volatile organic compounds).
    • Occupancy or presence sensors may provide data relating to the presence of people or other objects in the vicinity of the sensor, or to movement of the same; the technologies employed may for example include Passive Infra-Red (PIR) or microwave-based sensors.
    • RFID detectors may provide data relating to the presence and/or movement of objects carrying RFID tags in the vicinity of the detectors.
    • Radio detectors may provide data relating to radio transmissions from nearby wireless transmitter devices.
    • Acoustic sensors may, for example, record data representative of sound in the area of the sensor (with a frequency range similar to, greater than, or less than a typical human hearing range), or may be designed to detect specific acoustic artefacts or events, such as a gunshot; with the use of multiple acoustic sensors, a vector to the source of the noise could also be determined.
Each sensor 50 is connected to a memory 52 in the edge device. The sensor data generated by the or each sensor can therefore be stored in the memory 52. For example, the memory 52 may be a first-in, first-out type memory, storing the data generated during a rolling time window, for example of a few days. In some embodiments, all of the data generated by the sensor(s) 50 is stored by the memory 52. In other embodiments, only a fraction of the sensor data is stored; for example, in the case of image data, it may be sufficient to store only one image frame per second. The image data may be stored together with an associated time stamp indicating a time at which the data was generated. In the case of video image data, this may be stored in a compressed format, for example according to the MPEG H.264 or H.265 formats.
In addition, the or each sensor 50 is connected to a processor 54 in the edge device.
The processor 54 is configured to manipulate the sensor data from the or each sensor to obtain information about one or more entities within the environment in the vicinity of the edge device. For example, when one of the sensors is an imaging device, the processor 54 is configured to manipulate image data from the imaging device to obtain information about objects that are visible in the image. Specifically, the processor 54 may be configured to manipulate image data from the imaging device to obtain information about the movement of objects that are visible in the image.
The processor 54 is connected to the memory 52, so that the obtained information about the one or more entity in the environment can be stored. The obtained information may be stored together with an associated time stamp indicating a time at which the information was generated and/or the time or times at which the sensor data was generated.
In particular, the memory 52 may be configured for storing information (for example the sensor data and the information about one or more entity in the environment) securely, for example in an encrypted form. Any suitable form of encryption may be used.
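By way of illustration only, the following is a minimal Python sketch of such an edge-device store: a rolling (first-in, first-out) window of timestamped records, held in encrypted form. The class name, the retention window, and the use of Fernet from the third-party cryptography package are assumptions made for the sketch; the description does not prescribe any particular encryption scheme, retention policy or data layout.

```python
# Minimal sketch of an edge-device sensor store: a rolling FIFO window of
# timestamped, encrypted records. Names and the Fernet scheme are illustrative
# assumptions; any suitable form of encryption could be used.
import time
from collections import deque
from cryptography.fernet import Fernet  # third-party 'cryptography' package

class RollingSensorStore:
    def __init__(self, retention_seconds=3 * 24 * 3600):  # e.g. a few days
        self._cipher = Fernet(Fernet.generate_key())
        self._records = deque()          # (timestamp, encrypted payload)
        self._retention = retention_seconds

    def append(self, payload: bytes, timestamp=None):
        ts = time.time() if timestamp is None else timestamp
        self._records.append((ts, self._cipher.encrypt(payload)))
        self._evict(ts)

    def _evict(self, now):
        # FIFO eviction: drop records that have left the rolling time window.
        while self._records and self._records[0][0] < now - self._retention:
            self._records.popleft()

    def scan(self, since=0.0):
        # Decrypt and yield records newer than `since`, e.g. for a search.
        for ts, blob in self._records:
            if ts >= since:
                yield ts, self._cipher.decrypt(blob)
```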
In addition, as shown in Figure 3, each edge device 20 includes a transceiver 56, for communication with one or more other devices. In particular, the transceiver 56 is configured for communicating with the central device 40. The transceiver 56 is connected to the sensor(s) 50, and/or the memory 52, and/or the processor 54. If the transceiver 56 is connected to the sensor(s) 50, it is able to transmit raw sensor data to the central device 40 immediately. If the transceiver 56 is connected to the memory 52, it is able to transmit stored data to the central device 40. As mentioned above, the data that is stored in the memory 52 may comprise raw sensor data, and information obtained by the processor 54 about one or more entity in the environment. If the transceiver 56 is connected to the processor 54, it is able to transmit the information about the entity or entities in the environment to the central device 40 immediately.
For example, the transceiver 56 may be configured for communicating with the central device 40 for example by a public mobile cellular network, a fixed broadband data network (e.g. ADSL or fibre) or a radio network deployed for this application or for other smart city applications.
In particular, the edge device 20 may be configured for transmitting information to the central device 40 in an encrypted form. Any suitable form of encryption may be used.
In addition, the edge device 20 may be configured for transmitting information (that is, for example, the sensor data and/or the information about one or more entity in the environment) to the central device together with an associated time stamp indicating a time at which the information was generated.
The edge device 20 is also configured for receiving information from the central device 40. For example, the edge device 20 may be configured for receiving from the central device 40 information regarding the processing to be performed by the processor 54. Thus, the central device 40 may send instructions to the edge device, indicating what form of processing should be applied to the sensor data, in order to identify entities in the environment. Moreover, the instructions may indicate what form of processing the processor 54 should perform in order to detect specified attributes of the identified entities. In addition, the central device 40 may send instructions to the edge device 20, controlling which information about the one or more entity in the environment, including information about one or more specified attribute of the entity, should be transmitted to the central device.
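As a concrete but hypothetical illustration of this control path, the sketch below shows one possible shape for such an instruction message; every field name, and the use of JSON, is an assumption made for the sketch, since the description does not define a wire format.

```python
# Sketch of an instruction message from the central device to an edge device,
# controlling what processing is applied and which attribute information is
# reported back. All field names are hypothetical.
import json

instruction = {
    "edge_id": "edge-20",
    "detect": ["vehicle"],                    # entity classes to identify
    "attributes": ["position", "speed", "heading"],
    "report": {
        "mode": "on_event",                   # or "continuous"
        "criterion": {"speed_over_kmh": 50},  # only report when this is met
    },
}
wire = json.dumps(instruction).encode()       # e.g. sent over the backhaul
```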
In other embodiments, as described in more detail below, the transceiver 56 is additionally configured for communicating with other edge devices. Specifically, two edge devices may be configured for communicating with each other, for example by means of a public mobile cellular network, a fixed broadband data network (e.g. ADSL or fibre), a radio network deployed for this application or for other smart city applications, or by a short-range radio technology such as Bluetooth or Wi-Fi.
Figure 4 shows a central device 40 by way of illustration. Although Figure 2 shows only one central device 40, any suitable number of central devices may be provided in the system depending on the required system capacity, and other central devices in the system may be similar to that shown in Figure 4. The or each central device 40 may for example be provided in a cloud computing environment.
The central device 40 includes a memory 70. Information received from the or each edge device can therefore be stored in the memory 70. For example, the memory 70 may be considerably larger than that provided in the respective edge devices, with the capacity to store several days, weeks, or months of data from each edge device that is connected to it.
The central device 40 also includes a processor 72. The processor 72 is configured to manipulate the data received from the or each edge device to obtain information about one or more entity in the environment in the overall coverage area.
For example, when one of the sensors in each of the edge devices is a respective imaging device, and hence the entities being considered by that edge device are objects that are visible in the image, the processor 54 in the edge device 20 is configured to manipulate image data from the imaging device to obtain information about attributes of those objects. The processor 72 may then be configured to manipulate the information received from multiple edge devices to obtain information about attributes of objects in the coverage area of multiple edge devices.
More specifically, when the processor 54 in each edge device 20 is configured to manipulate image data from the respective imaging device to obtain information about movement of objects that are visible in the image generated by that imaging device, the processor 72 in the central device 40 may be configured to obtain information about the movement of objects between locations where they are visible by different imaging devices.
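A minimal sketch of this central-side combination follows. Matching observations by number plate, and the record layout, are illustrative assumptions; any attribute that reliably identifies the same entity could serve as the key.

```python
# Sketch: the central device stitches per-edge observations of the same entity
# (matched here by number plate) into a movement record spanning the coverage
# areas of several edge devices. Layout and matching key are assumptions.
observations = [
    {"edge": "edge-20", "t": 100.0, "plate": "AB12CDE", "pos": (52.20, 0.11)},
    {"edge": "edge-28", "t": 135.0, "plate": "AB12CDE", "pos": (52.21, 0.13)},
    {"edge": "edge-30", "t": 172.0, "plate": "AB12CDE", "pos": (52.22, 0.15)},
]

tracks = {}
for obs in sorted(observations, key=lambda o: o["t"]):
    tracks.setdefault(obs["plate"], []).append((obs["t"], obs["edge"]))

for plate, path in tracks.items():
    print(plate, "->", " -> ".join(f"{edge}@{t:.0f}s" for t, edge in path))
```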
The processor 72 is connected to the memory 70, so that the obtained information about the one or more entity in the environment can be stored.
In addition, as shown in Figure 4, the central device 40 includes a transceiver 74, for communication with other devices, and in particular for communicating with the edge devices 20. The transceiver 74 is connected to the memory 70, and/or the processor 72.
As mentioned above, the transceiver 74 may be configured for communicating with the edge device 20 for example by a public mobile cellular network, a fixed broadband data network (e.g. ADSL or fibre) or a radio network deployed for this application or for other smart city applications.
Figure 5 illustrates the direct communication between edge devices mentioned above. Specifically, Figure 5 illustrates a situation in which two edge devices 20, 22 are able to communicate with each other directly, over a communications link 80. For example, in some embodiments, two or more edge devices may be configured (for example by control information transmitted from the central device) to communicate information about entities that are detected, or about attributes of those entities.
A system as described above may be deployed in any environment that is to be monitored, for example an exterior urban environment, an interior environment, or an exterior environment away from streets etc. For example, a system may be deployed in a warehouse or similar industrial facility.
The sensors 50 may include imaging devices, but may additionally or alternatively include other sensors that are triggered by locating devices (such as RFID tags) on autonomous robotic or other devices within the warehouse.
The relevant entities in the environment may then relate to the location of the robotic or other devices within the warehouse, or other aspects of the operational state of such devices. In that case, the processing devices 54 may be configured to process the raw data from the sensor(s) 50 to obtain relevant information about the locations of the devices, or about the other aspects of the operational state.
As another example, a system may be deployed in a wind farm, where the sensors 50 may include devices for monitoring the state of the wind turbines, and devices for monitoring the environment (for example monitoring the wind speed and direction). In that case, the processing devices 54 may be configured to process the raw data from the sensor(s) 50 to obtain relevant information about attributes of the wind turbines and/or about their interaction with the wind conditions. For example, each turbine may be provided with sensors 50 which continuously report such conditions as temperature, vibration, and noise. The entity being considered may then be the turbine, and the respective processing device 54 may be configured to process the raw data from the sensors to obtain relevant information about attributes of the turbine, such as the state of wear of the bearings, the condition of the blades, etc.

As another example, a system may be deployed with an edge device on each of multiple vehicles (for example autonomous vehicles), where the sensors 50 may include imaging devices or other sensors, and the processing devices 54 may be configured to process the raw data from the sensors to obtain information about attributes of the vehicle and its environment, such as the location and velocity of the vehicle on which the respective edge device is mounted, and information about the location, velocity and classification of other objects (such as vehicles, pedestrians, etc) in the vicinity of the vehicle, for the purposes of navigation and safety.
As a further example, as mentioned above, Figure 1 shows a system deployed in an urban environment, with streets 10, 12, 14, 16, 18. Edge devices 20, 22, 24, 26, 28, 30 are provided at respective locations (which in this example are fixed locations). Specifically, the edge devices 20, 22, 24, 26, 28, 30 may be mounted on street lights, and the exact locations of the edge devices may be known from data provided by a Global Navigation Satellite System (GNSS), for example the Global Positioning System (GPS). Providing each edge device with a GPS receiver, and requiring it to report its location to the central device, means that installation is simplified, because the edge device can report its exact location, without requiring any manual intervention.
The system can be scaled as desired, from a system with just a single edge device, to a system with a large number of edge devices, where Figure 1 shows a small part of a very much larger system. For example, the edge devices of the system may have contiguous or non-contiguous coverage areas, with an overall coverage area that covers a part of one street, a whole street, a neighbourhood, a borough, a city, or a country, or is world-wide.
A single central device may be connected to every edge device, or separate central devices may be connected to respective groups of edge devices. Moreover, where there are multiple central devices, these may also be connected in a hierarchical structure, with the data available in the lowest layer of central device being organised and merged into larger federated data sets at a higher level, for example on a citywide or national level.
In this illustrative example, the sensors 50 on each edge device include a video image sensor, which may for example be a standard video camera, or which may additionally be provided with Automatic Number Plate Recognition (ANPR) functionality. The sensors on each device also include a radar based object detection sensor, for example using millimetre wave spectrum (i.e. in the frequency range between about 30GHz and 300GHz). Thus, the data generated by the sensors 50 includes raw image data and raw radar data about objects that are in the field of view of those sensors.
The sensor package is such that it can accurately detect objects in 2D or 3D space at as high a sampling rate as possible.
The processor 54 in each of the edge devices may then identify objects that are within the field of view. For example, returning to Figure 1, a vehicle 100 (for example, a car, truck, motorcycle, etc) can be identified in the field of view of the edge device 20. The identification may for example be performed by an artificial neural network (ANN), for example a convolutional neural network (CNN) or a deep neural network (DNN). The processor 54 is able to identify the vehicle 100 in images taken at different times, and is therefore able to track the vehicle as it moves over time. Further, additional data properties can be added over time. For example, when a moving object is first identified by a radar sensor, it may be possible to determine from its speed that it is a vehicle, but impossible to determine what type of vehicle it is. As it comes closer to the edge device, a camera may determine that it is a car, for example. Then, as the car comes closer still to the edge device, the Automatic Number Plate Recognition feature of the sensor may be able to read the number plate of the car.
Thus, the vehicle 100 is an entity in the environment around the edge device 20 and its position and movement are, amongst others, its attributes. The information about the entity and these attributes is therefore stored in the memory 52.
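The sketch below illustrates this progressive enrichment of a tracked entity's attributes, with a coarse radar detection being refined by later camera and ANPR observations; the Track class and all field names are hypothetical.

```python
# Sketch of progressive attribute enrichment: a radar detection opens the
# track, and later camera/ANPR observations add or refine attributes.
# All names are illustrative, not taken from the patent.
from dataclasses import dataclass, field

@dataclass
class Track:
    track_id: str
    attributes: dict = field(default_factory=dict)

    def update(self, source: str, **attrs):
        # Later, more specific observations overwrite earlier coarse ones.
        self.attributes.update(attrs)
        self.attributes.setdefault("sources", []).append(source)

track = Track("trk-0001")
track.update("radar", kind="vehicle", speed_kmh=42.0)   # speed only, far away
track.update("camera", kind="car", colour="red")        # closer: vehicle type
track.update("anpr", number_plate="AB12CDE")            # closest: number plate
print(track.attributes)
```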
The processor 54 may similarly obtain information about other attributes of entities in the environment. As one example, the processor 54 may obtain information about the position and movement of pedestrians or other people, in the field of view of the sensors. As another example, the processor 54 may obtain information about the position of fixed objects such as benches and litter bins. The positions may be expressed in terms of co-ordinates (for example in the GPS system or in World Geodetic System 1984 (WGS84) projections).
The information that is stored in each edge device can be regarded as a "digital twin" of the environment itself. That is, the stored information acts as a digital representation of the physical street environment with the various tracked objects in it.
As described above, some or all of the raw data generated by the sensors 50, and the synthesized information about the entities in the environment, such as the movement of the objects that are detected by the sensors, is stored securely in the memory 52.
However, the information is also made available under specified conditions to the central device 40. Restrictions on what data can be transmitted from an edge device to the central device may be imposed by policies around trust and privacy and relevant laws, as well as by any backhaul bandwidth limitations that may exist.
One advantage of deriving the information about the entities in the environment in the edge devices, and only transmitting that derived information to the central device 40, is that this requires very much less bandwidth than transmitting the entirety of the sensor data. This means that the cost of transmission is reduced, and the latency and quality of service are improved. Moreover, there is less reliance on the cloud computing environment when this is used, and in particular there is less reliance placed in the security of the cloud computing environment, because the raw sensor data will in general not be stored in the cloud. Moreover the system can still operate when the backhaul data link is interrupted, because the data can be transmitted when the link is restored.
If the information that is stored in each edge device is regarded as a "digital twin" of the environment around the respective edge device, acting as a digital representation of the physical street environment with the various tracked objects in it, then the information that is stored in the central device can be regarded as a higher layer, or federated, "digital twin" of the wider environment, acting as a digital representation of that wider environment and the various tracked objects.
As mentioned above, the edge device may be controlled from the central device 40. For example, the central device 40 may instruct each edge device to transmit information identifying each vehicle that it identifies, and information about the speed and direction of travel of that vehicle. Alternatively, the central device 40 may instruct each edge device to transmit information identifying each truck or van that it identifies, and information about the speed and direction of travel of each such vehicle.
These instructions may be varied or updated as required. For example, after the system has been operating for a period of time, transmitting information identifying each truck or van that it identifies, the central device 40 may instruct the edge device to start transmitting information identifying every vehicle that it identifies thereafter.
Moreover, because each edge device stores some or all of its raw sensor data, as well as the information that it derives about the entities in the environment, the central device may instruct one or more edge device to transmit to it either some or all of that stored data or information. For example, in a typical scenario, the central device may instruct one or more edge device to transmit to it the derived information about attributes of the entities as the information is generated. The central device may also or alternatively instruct one or more edge device to transmit to it some part of the stored sensor data, either immediately or when some criterion is met.
Further, the central device may instruct one or more edge device to perform some additional processing on the stored sensor data, in order to extract information about additional entities in the environment that were not originally considered when the data was generated.
For example, in a case where each edge device is configured to generate and transmit information about the movement of vehicles, a central device may send a request to one or more edge device to perform additional processing on stored data, which relates to a previous time period. For example, the edge device may be instructed to analyse the stored raw image data in order to identify the presence of people in the field of view generally, or to identify the presence of people with more specific characteristics, such as a person wearing red clothing, a lone child, or a person carrying a weapon. When the central device instructs the edge device to repeat a search that it has performed previously, the edge device may search the data only starting from the last time the search was executed. This reduces the processing load on the edge device, and potentially also reduces the backhaul costs from the edge devices to the central device.
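As an illustrative sketch of such an incremental repeated search (reusing the RollingSensorStore sketched earlier), the code below resumes each named query from the time it last ran, so already-searched data is not re-processed; the query-identifier scheme and the predicate interface are assumptions.

```python
# Sketch of an edge-side search handler that resumes a repeated query from
# the last time it was executed. `store` is a RollingSensorStore as sketched
# above; query IDs and the predicate interface are illustrative assumptions.
import time

last_run = {}  # query_id -> timestamp of the last execution

def run_search(query_id, store, predicate):
    since = last_run.get(query_id, 0.0)   # first run scans the whole window
    hits = [(ts, rec) for ts, rec in store.scan(since=since) if predicate(rec)]
    last_run[query_id] = time.time()
    return hits                           # e.g. returned to the central device
```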
Thus, the storage of the detailed raw image data on each edge device means that it is possible to extract information about specific entities in the environment of the edge device, even when the specific entities are only identified at a time that is later than the time at which the data was obtained.
When a central device 40 requests one or more edge device to analyse the sensor data to obtain information about one or more specific entity in the environment, the central device can also specify the way in which it is notified about that entity.
For example, when the attribute being monitored is the movement of an entity such as a vehicle, the edge device may send the relevant information continuously, or only when the attribute meets some specified criterion, for example movement above a specified speed limit, or movement into a particular area. Similarly, when the central device 40 instructs the edge device to analyse the previously stored raw image data in order to identify the presence of, say, a person wearing red clothing, it may also instruct the edge device to notify it as soon as such a person is identified. Alternatively, the central device 40 may instruct the edge device to execute a trigger to inform an external authorised application (via a subscribe/publish interface) of that event, and possibly to transmit the relevant data to it.
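A minimal sketch of such criterion-based reporting follows; the speed threshold, the callback interface and the subscribe/publish hook are all assumptions made for the sketch.

```python
# Sketch of criterion-based notification: attribute information is forwarded
# only when it meets a condition set by the central device (here, a speed
# threshold), optionally also firing a trigger towards an external subscriber.
def make_notifier(speed_limit_kmh, notify_central, publish_event=None):
    def on_attributes(entity_id, attrs):
        if attrs.get("speed_kmh", 0.0) > speed_limit_kmh:
            notify_central(entity_id, attrs)     # report to the central device
            if publish_event is not None:        # subscribe/publish hook
                publish_event("overspeed", {"entity": entity_id, **attrs})
    return on_attributes

notifier = make_notifier(
    50.0, notify_central=lambda eid, a: print("central <-", eid, a))
notifier("trk-0001", {"speed_kmh": 63.2, "position": (52.20, 0.12)})
```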
The central device may also perform a search on the information that has been transmitted to it from the edge devices.
Thus, the process of collecting information about the environment is separated from the applications using the data, which are typically carried out in the central device, or in a further device having access to the data stored in the central device.
Thus, storing the raw sensor data at the edge device means that all of the data is available, and can be searched subsequently, without requiring expensive backhaul of all of the data to the central device. Abstracting from the raw data the attributes of the entities in the environment, and transmitting that information to the central device, means that the backhaul requirement is significantly reduced.
As mentioned above, storing the raw sensor data at the edge device means that new applications can be developed at the central device, and can access the raw sensor data from an earlier time period.
This architecture also means that edge devices can be upgraded piecemeal. For example, more accurate sensor hardware can be introduced at just some of the edge devices, without affecting the overall architecture. For example, some edge devices may be provided with image sensors having increased camera resolution or improved shutter speed. The same edge devices, or other edge devices, may be provided with radar sensors having increased radar object resolution and ranging. Similarly, improved detection and tracking algorithms can be introduced at some or all edge devices while they are in operation.
In addition, the upgrading of the edge devices may include adding new sensor types after initial deployment of the edge device, thereby improving performance, and potentially allowing the possibility of collecting different data, and generating information about additional attributes of entities in the environment.
In the illustrative example above, the operation of the system has been described with reference to image sensors, for tracking physical objects. The environment in the locality of an edge device may also be considered to have aspects which are less physical, such as, but not limited to, an 'environmental sound state' or 'environmental air quality'. These too may be considered to be entities, and may be described by reference to their attributes.
In the context of the less physical entities described above, some or all edge devices may additionally, or alternatively, be provided with audio microphones, for detecting environmental sounds. Then, the processor in the respective edge device may analyse the raw data provided by the microphone(s) to generate information about attributes of the environment. That is, the processor may attempt to detect attributes of the audio environment, for example specific noises, such as human speech, or gunshots, or excessively loud vehicles, or sounds that are typical of specific vehicles.
Additionally, or alternatively, some or all edge devices may be provided with chemical sensors, for detecting pollutants or other chemicals in the environment. Then, the processor in the respective edge device may analyse the raw sensor data to generate information about attributes of, for example, the 'air quality' entity in the environment.
For example, the processor may notify the central device when a level of a particular pollutant exceeds a threshold level. Meanwhile, the raw sensor data remains stored in the edge device. The central device may receive the information about the attributes of the 'air quality' entity from multiple edge devices, and may combine this received information to form a representation of the 'air quality' entity in the wider coverage area.
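By way of illustration, the sketch below shows one way the central device might combine per-edge 'air quality' reports into a coverage-area view; the aggregation chosen (a per-pollutant mean, plus the worst-reporting location) is an assumption rather than anything the description specifies.

```python
# Sketch: the central device combines 'air quality' attribute reports from
# several edge devices into a coverage-area view. Aggregation is illustrative.
from collections import defaultdict

reports = [
    {"edge": "edge-20", "pollutant": "NO2", "ug_m3": 38.0},
    {"edge": "edge-22", "pollutant": "NO2", "ug_m3": 51.5},
    {"edge": "edge-24", "pollutant": "NO2", "ug_m3": 44.2},
]

by_pollutant = defaultdict(list)
for r in reports:
    by_pollutant[r["pollutant"]].append(r)

for pollutant, rs in by_pollutant.items():
    mean = sum(r["ug_m3"] for r in rs) / len(rs)
    worst = max(rs, key=lambda r: r["ug_m3"])
    print(f"{pollutant}: area mean {mean:.1f} ug/m3, worst at {worst['edge']}")
```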
Additionally, or alternatively, some or all edge devices may be provided with radio detectors, for detecting radio transmissions from nearby devices, for example detecting unpaired Bluetooth discovery data. Then, the processor may analyse the raw data to obtain information about elements in the environment, for example in the form of MAC addresses of the transmitting devices.
Returning to Figure 5, this shows a further aspect of the system as it may be used to track vehicles through an urban environment.
As described above, in the system shown in Figure 1, the edge device 20 may be able to detect the presence of the vehicle 100 in the street 10, and can also determine that it is heading in the direction of the junction with the streets 12, 14, 16.
In order to provide low latency object tracking and faster acquisition of fast moving objects between edge device installations, edge devices may be able to communicate directly, for example over Wi-Fi or some other high speed technology, with other nearby edge devices. This provides a much more efficient and less error prone hand-over of detected objects as they leave the range of one edge device and enter the range of another. The edge devices may be deployed so that their coverage areas are overlapping or contiguous, as this may result in better quality and a higher confidence in the accuracy of the object's path, but this is not essential.
This direct communication may be controlled by the central device 40. For example, if one edge device detects movement of a vehicle of interest, and reports this to the central device, the central device may instruct that one edge device to cooperate with nearby edge devices to track that vehicle through their respective coverage areas.
When this direct communication is configured, the originating and receiving edge devices are configured to transmit and receive such notifications. Any device acting as an originating device can establish a communication path with selected adjacent devices, if this has not already been done.
Thus, in Figure 1, when the edge device 20 detects the presence of the vehicle 100, heading in the direction of the junction with the streets 12, 14, 16, it can send a handover to the edge devices 22, 24, 28, which cover the streets along which the vehicle 100 may travel next (as shown at 100a, 100b, 100c). The handover may be sent by broadcasting or multicasting a message to the relevant edge devices, or by sending separate messages to each relevant edge device, describing the attributes of the entity or object. For example, in this illustrated example of a vehicle, the information sent in the handover message may include some or all of: a unique tracking ID; a last known GPS location of the vehicle; a further classification of the vehicle (whether it is a car, truck, etc); the colour of the vehicle; the number plate details of the vehicle; and any other available attributes that might help the receiving edge device to identify the vehicle in its image data. The same data can be passed on to subsequent edge devices as the vehicle passes from one coverage area to another, and subsequent edge devices can also include additional or modified information. For example, if a first edge device identifies a specific vehicle but is unable to read its number plate, and a second edge device is able to read the number plate, the number plate details may be included in the information passed from the second edge device to a third edge device, and so on. Thus, as shown in Figure 5, information about the vehicle 100 is sent from one edge device to one or more other edge devices over the relevant communications link 80.
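The sketch below illustrates one possible encoding of such a handover message and its distribution to neighbouring edge devices. The dataclass fields mirror the attributes listed above, but the structure, the JSON transport and the neighbour list are assumptions made for the sketch.

```python
# Sketch of a handover message one edge device might send to its neighbours
# when a tracked vehicle leaves its coverage area. Fields mirror the attributes
# described above; the message structure and transport are assumptions.
import json
from dataclasses import dataclass, asdict

@dataclass
class Handover:
    tracking_id: str
    last_gps: tuple                 # last known (latitude, longitude)
    classification: str             # e.g. "car", "truck"
    colour: str = ""
    number_plate: str = ""          # may be empty if not yet read

def send_handover(handover, neighbours, send):
    payload = json.dumps(asdict(handover)).encode()
    for n in neighbours:            # separate messages; a real deployment
        send(n, payload)            # might broadcast or multicast instead

h = Handover("trk-0001", (52.205, 0.119), "car", colour="red")
send_handover(h, ["edge-22", "edge-24", "edge-28"],
              send=lambda n, p: print("->", n, p.decode()))
```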
Typically, only one of those edge devices will detect the vehicle 100, and will process it in a way similar to that in which it would process an entity established within its own range; the others will not detect the vehicle, and will therefore not process it as an entity in the environment in their range.
If the edge device 28 detects the presence of the vehicle on the street 14, heading away from the junction with the streets 10, 12, 16, as shown at 100b, it can determine that the vehicle may continue along the street 14 as shown at 100d, and remain within its own coverage area, or may turn along the street 18, as shown at 100e, and so it may send a handover to the edge device 26. If the edge device 28 detects the continued presence of the vehicle on the street 14, as shown at 100d, it can send a handover to the edge device 30, as this covers the street 14 along which the vehicle is expected to continue travelling, as shown at 100f.
To carry out this handover functionality, each edge device must be provided with, or must itself establish (based on communication between edge devices), information about the nearby street layout and/or information about the locations of other edge devices. The relevant information may be specifically downloaded to the respective edge devices from the central device, or the edge devices may be provided with access to map information, which each edge device can then access to determine its own location, with each edge device being further provided with a means to discover other nearby devices and their locations.
It would also be possible to filter the classifications of entities that can be handed over where any privacy concerns exist.

Claims (21)

  1. A system comprising: at least one edge device; and a central device, wherein the or each edge device is provided at a respective location within a coverage area, wherein the or each edge device comprises at least one sensor generating sensor data, wherein the or each edge device is configured to manipulate the sensor data from the or each sensor to obtain information about one or more attribute of an entity in the environment in a vicinity of the edge device, wherein the or each edge device is configured to transmit said information about the attribute of the entity to the central device, and wherein the central device is configured to receive said information about the attribute of the entity from the or each respective edge device, and to combine the received information to obtain information about the attribute of the entity within said coverage area.
  2. A system according to claim 1, wherein the or each edge device comprises a memory, configured for storing the sensor data generated by the at least one sensor, in an encrypted form.
  3. A system according to claim 2, wherein the central device is configured to send search queries to one or more edge device, requesting the one or more edge device to perform a search on the stored sensor data.
  4. A system according to any preceding claim, wherein the central device is configured to send manipulation requests to one or more edge device, requesting the one or more edge device to perform a specific manipulation on the sensor data.
  5. A system according to any preceding claim, wherein the or each edge device is configured to transmit said information about entities in the environment to the central device in an encrypted form.
  6. A system according to any preceding claim, wherein the or each edge device is configured to transmit said information about entities in the environment to the central device together with an associated time stamp.
  7. A system according to any preceding claim, wherein the central device is configured to send data requests to one or more edge device, requesting the one or more edge device to transmit some or all of its stored sensor data to the central device.
  8. A system according to claim 7, wherein the or each edge device is configured to transmit said stored sensor data to the central device in an encrypted form.
  9. A system according to any preceding claim, comprising a plurality of edge devices.
  10. A system according to claim 9, wherein at least one edge device is configured to communicate directly with at least one other edge device.
  11. A system according to claim 10, wherein the at least one edge device is configured to transmit the information about the attributes of entities directly to the at least one other edge device.
  12. A system according to any preceding claim, wherein the or each edge device comprises a plurality of sensors generating respective sensor data.
  13. A system according to any preceding claim, wherein at least one edge device is provided at a respective fixed location.
  14. A system according to any preceding claim, wherein at least one edge device is provided on a respective mobile platform.
  15. A system according to any preceding claim, wherein the at least one sensor is selected from a group comprising: a camera; a radar or lidar detector; an environmental sensor; a pollution sensor; an occupancy or presence detector; an RFID detector; a radio detector; and an acoustic sensor.
  16. A system comprising: a plurality of edge devices; and a central device, wherein the edge devices are provided at respective fixed locations within a coverage area, wherein each edge device comprises a plurality of sensors generating respective sensor data, wherein the sensors include at least one imaging device generating respective image data, wherein each edge device is configured to process and combine the sensor data from the sensors to obtain information about attributes of entities in a vicinity of the edge device, wherein each edge device is configured to transmit said information about the attributes of entities to the central device, and wherein the central device is configured to receive said information about the attributes of entities from the or each respective edge device, and to combine the received information to obtain information about the attributes of entities within said coverage area.
  17. A system according to claim 16, wherein said entities comprise physical objects in the vicinity of the edge device.
  18. A system according to claim 17, wherein said attributes include movement of the physical objects in the vicinity of the edge device.
  19. A system according to one of claims 16 to 18, wherein the sensors comprise a video imaging device and a radar device.
  20. A system according to one of claims 16 to 19, wherein at least one edge device is configured to communicate directly with at least one other edge device.
  21. A system according to claim 20, wherein the at least one edge device is configured to transmit the information about the attributes of entities directly to the at least one other edge device.
GB1818096.8A 2018-11-06 2018-11-06 Monitoring system Withdrawn GB2578746A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1818096.8A GB2578746A (en) 2018-11-06 2018-11-06 Monitoring system
GB1916083.7A GB2580495A (en) 2018-11-06 2019-11-05 Monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1818096.8A GB2578746A (en) 2018-11-06 2018-11-06 Monitoring system

Publications (2)

Publication Number Publication Date
GB201818096D0 GB201818096D0 (en) 2018-12-19
GB2578746A true GB2578746A (en) 2020-05-27

Family

ID=64655593

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1818096.8A Withdrawn GB2578746A (en) 2018-11-06 2018-11-06 Monitoring system
GB1916083.7A Withdrawn GB2580495A (en) 2018-11-06 2019-11-05 Monitoring system

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB1916083.7A Withdrawn GB2580495A (en) 2018-11-06 2019-11-05 Monitoring system

Country Status (1)

Country Link
GB (2) GB2578746A (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112969022B (en) * 2021-01-29 2023-09-01 新东方教育科技集团有限公司 Camera adjustment method, system, storage medium and electronic equipment
CN112949041B (en) * 2021-02-01 2022-08-05 广东工业大学 Automatic stereoscopic warehouse construction method based on digital twinning

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070039030A1 (en) * 2005-08-11 2007-02-15 Romanowich John F Methods and apparatus for a wide area coordinated surveillance system
US20080157983A1 (en) * 2006-10-17 2008-07-03 Designlink, Llc Remotely Operable Game Call or Monitoring Apparatus
US20100254282A1 (en) * 2009-04-02 2010-10-07 Peter Chan Method and system for a traffic management network
US20140368652A1 (en) * 2013-06-18 2014-12-18 Xerox Corporation Methods and systems for efficiently monitoring parking occupancy
WO2015004325A1 (en) * 2013-07-08 2015-01-15 Seppälä, Mikael Parking system
US20180190098A1 (en) * 2011-07-12 2018-07-05 Cerner Innovation, Inc. Method and process for determining whether an individual suffers a fall requiring assistance

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8902085B1 (en) * 2011-05-17 2014-12-02 Raytheon Company Integrated 3D audiovisual threat cueing system


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022122750A1 (en) * 2020-12-07 2022-06-16 Schreder S.A. Network system using fog computing
NL2027061B1 (en) * 2020-12-07 2022-07-07 Schreder Sa Network system using fog computing
EP4192047A1 (en) 2021-12-01 2023-06-07 BIOT Sp. z o.o. Infrastructural lighting system and method for dynamically adjustable on-demand lighting

Also Published As

Publication number Publication date
GB201916083D0 (en) 2019-12-18
GB201818096D0 (en) 2018-12-19
GB2580495A (en) 2020-07-22

Similar Documents

Publication Publication Date Title
GB2580495A (en) Monitoring system
EP3497590B1 (en) Distributed video storage and search with edge computing
US11062606B2 (en) Method and system for vehicle-to-pedestrian collision avoidance
US7385515B1 (en) Surveillance detection system and methods for detecting surveillance of an individual
US10929462B2 (en) Object recognition in autonomous vehicles
EP2780862B1 (en) Mobile and one-touch tasking and visualization of sensor data
Laouira et al. An efficient WSN based solution for border surveillance
KR20180086900A (en) Connected dash cam
CN112073936A (en) System and method for network node communication
GB2583363A (en) Data anonymization
KR20190043396A (en) Method and system for generating and providing road weather information by using image data of roads
US10154393B2 (en) Method, motor vehicle, and system for determining a transmission path
US10999696B1 (en) Distributed geospatial communications system for UAV monitoring
US10388132B2 (en) Systems and methods for surveillance-assisted patrol
CN112789667A (en) System and method for identifying and tracking targets
US20230274647A1 (en) Systems and methods for electronic surveillance
CN113379790B (en) AI early warning positioning method for high-altitude observation object based on 3D model
JP2024517394A (en) Method for creating a map with collision probability
Zhang et al. A Survey on Integrated Sensing and Communication in 3GPP
Thakuriah et al. Data sources and management
McQuiddy Unattended ground sensors for monitoring national borders
WO2020084288A1 (en) Method and apparatus for controlling a mobile camera
RU2228861C1 (en) Regional signaling antitheft system
Bourdon Compact integrated sensor processor: a common sensor processing core for the HYDRA unattended ground sensor system

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)