WO2021161008A1 - Object location status monitoring apparatus and method - Google Patents

Object location status monitoring apparatus and method

Info

Publication number
WO2021161008A1
Authority
WO
WIPO (PCT)
Prior art keywords
location
data
image
defined
utility
Prior art date
Application number
PCT/GB2021/050299
Other languages
French (fr)
Inventor
Yifan Zhao
Arkadiusz DMITRUK
Paul Nigel GREEN
Mark Nicholas James WHARTON
Arjun THIRUNAVUKARASU
Colin EVISON
Original Assignee
Cranfield University
Bam Nuttall Ltd
Iotic Labs Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cranfield University, Bam Nuttall Ltd and Iotic Labs Ltd
Publication of WO2021161008A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/0008Industrial image inspection checking presence/absence
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0426Programming the control sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/08Construction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • G06Q50/265Personal security, identity or safety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q90/00Systems or methods specially adapted for administrative, commercial, financial, managerial or supervisory purposes, not involving significant data processing
    • G06Q90/20Destination assistance within a business structure or complex
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Definitions

  • the present invention relates to a system, method and apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location, and in particular although not exclusively, to apparatus and method to identify displacement of the object from the pre-defined location automatically and/or remotely.
  • the construction industry has undergone developments such as the emergence of more off-site manufacturing-based approaches in which small, medium and large-scale building projects utilise off-site construction and assembly in factory-based environments.
  • the final building/construction site may be considered an extension of the factory environment, with such sites having to align practices, standards and approaches to improve productivity.
  • the inventors provide an electronics and internet-connected or internet-of-things implemented system in which a local electronic station is configured with a suitable camera device to generate image data that may be processed via suitable modules, utilities, processes and/or algorithms to generate a live location and/or positional status alert or notification to a user that an object is at its desired pre-defined location or has been moved, is misplaced, lost or otherwise displaced from the original and intended pre-defined location.
  • Reference within this specification to the location status refers to the location of an object at or close to the intended pre-defined location, such as an object mounted on a resource support (i.e. a health and safety item located at a safety board typically found at a construction site).
  • Reference within this specification to a position status of an object encompasses a positional movement of the object partially or fully from the intended and desired pre-defined location with such positional movement being negligible, small, partial or complete, where complete positional movement may be considered to be an external boundary or edge of an object being entirely displaced from an original position when a first 2D image of the object is compared to a second 2D image of the object.
  • Reference within this specification to a module encompasses electronic hardware components including a printed circuit board, a processor, a storage utility, a power supply, a network communication device and/or software running on such components.
  • Reference within this specification to a utility encompasses software, algorithms, computer programs and the like running on electronic components as described herein and configured for analogue and/or digital data processing in which data is processed by at least one processor with data transferred to and from electronic components, data storage utilities, reference libraries, data libraries, local area networks, local electronics stations, servers, cloud and internet-connected devices.
  • Reference within this specification to an object ‘proximate’ to a location encompasses an object at a general location such as a mounting or display board. This includes the object being at a pre-defined location on the board but also includes the object being moved/displaced slightly from an original position based on a comparison of 2D images of the object at the general location.
  • the term ‘proximate’ encompasses the object at a second position or location being close or near to a first original position or location.
  • apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: a local electronic station positionable opposed to the object to be monitored at the pre-defined location, the station having a camera device to generate image data of the object at and/or proximate to the pre-defined location; a user interface module to enable a user to manage at least one of generation, processing and output of data based on the image data generated by the camera device; an object monitoring module to process the image data generated by the camera device; and an alerts module to generate and/or issue an alert notification of a location and/or position of the object relative to the pre-defined location based on the image data processed by the object monitoring module; the user interface module, object monitoring module and/or alerts module provided at the station and/or at least one server located remote from the station.
  • the apparatus further comprises a learning module that utilises the user interface module, the learning module having an identification utility and/or a marking utility to allow a user to identify and/or mark the at least one object at the pre-defined location within the image data.
  • the learning module further comprises an alert configure utility to allow a user to configure a type, frequency, format and/or delivery destination of an alert notification of a location and/or position status of the object.
  • the object monitoring module is provided with an object tracking utility to process the image data generated by the camera device and to monitor and/or track a location and/or a position of the object relative to the pre-defined location.
  • the object tracking utility comprises a Kernelized Correlation Filters algorithm to monitor and/or track a location and/or a position of the object relative to the pre-defined location.
  • the object tracking utility is operable with a first data set comprising at least one image of the object at the pre-defined location and at least a second data set comprising at least one image of the object moved from the pre-defined location to determine a displacement direction of the object from the pre-defined location.
  • the object tracking utility further comprises containers containing a plurality of trackers associated with a location and/or a position of the object, each tracker configured to identify the object within an image generated by the camera device.
  • the present apparatus further comprises a pixel-based feature detection utility, the object tracking utility operable with the pixel-based feature detection utility to determine features and/or edges of an image based on the image data and configured to locate and/or confirm a position of the object relative to the pre-defined location.
  • the present apparatus further comprises a recovery module provided with or operable with a pixel-based feature detection utility to determine features and/or edges of an image based on the image data and configured to locate and/or confirm a position of the object relative to the pre-defined location.
  • the recovery module further comprises edge and/or corner measurement and/or detection utilities to identify a difference in intensity of pixels of at least a first image of the object at a first location and/or position and at least a second image of the object moved from the first location and/or position to a second location and/or position.
  • the pixel-based feature detection utility comprises an oriented Features from Accelerated Segment Test (FAST) algorithm and/or a rotated Binary Robust Independent Elementary Features (BRIEF) algorithm.
  • FAST Features from Accelerated Segment Test
  • BRIEF Binary Robust Independent Elementary Features
  • the recovery module further comprises a Fast Library for Approximate Nearest Neighbours (FLANN) algorithm to compare feature vectors from an image of the object generated by the camera device with a library/template image of the object at the pre-defined location based on image data generated by the camera device.
  • FLANN Fast Library for Approximate Nearest Neighbours
  • the object tracking utility is provided in data communication with the recovery module to locate a position and/or location of the object relative to the pre-defined location.
  • the local electronic station further comprises: a proximity sensor to generate object distance data based on a distance measured between the sensor and the object; and/or a GPS sensor to generate GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location.
  • the apparatus may further comprise a data acquisition module to collect data from any one or a combination of the camera device, the proximity sensor and the GPS sensor.
  • the apparatus may further comprise an object distance utility to process the object distance data and compare measured distances to identify if the object has moved from the pre-defined location and/or is obstructed from detection by the proximity sensor.
  • the object distance utility is further configured to process the object distance data and calculate a reference threshold distance value.
  • the object distance utility is provided in data communication with the recovery module and operative to prompt triggering of the recovery module if the object distance measured by the proximity sensor is less than the reference threshold distance value.
  • the apparatus may further comprise an analysis utility in data communication with the camera device to convert an image of the object and/or support structure on which the object is mounted at the pre-defined location from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
  • RGB red-green-blue
  • HSV hue-saturation-value
  • the apparatus may further comprise an automatic start-up module connected to a power supply and configured to monitor an illumination intensity at or proximate to the object and to prompt the apparatus to enter a sleep mode if the illumination intensity is below a pre-set illumination intensity threshold.
  • the modules and utilities as described herein may be located at the station and/or the at least one server.
  • the apparatus may further comprise at least one or a combination of the following electronic components: at least one processor; at least one data storage; at least one reference data library; at least one network communication module, including a transceiver, transmitter and/or receiver; an electronics board, a printed circuit board and/or a programmable logic device; wherein said electronic components are provided at the local electronic station and/or the server.
  • a method of monitoring and reporting a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: generating image data of the object using a camera device provided at a local electronic station positioned or positionable opposed to the object to be monitored at the pre-defined location; managing at least one of generation, processing and output of data associated with the location and/or position of the object based on the image data generated by the camera device using a user interface module; processing the image data generated by the camera device using an object monitoring module to monitor the location and/or position of the object at and/or proximate to the pre-defined location; and generating and/or issuing an alert notification based on the location and/or position of the object relative to the pre-defined location using an alerts module based on the data processed by the object monitoring module.
  • the user interface module, the object monitoring module and/or the alerts module are provided at the station and/or at least one server.
  • the method may further comprise identifying and/or marking the object within the image data generated by the camera device using an identification utility and/or a marking utility being part of a learning module that forms part of or is operable with the user interface module.
  • the method may further comprise enabling a user to configure a type, frequency, format and/or delivery destination of an alert notification of a location and/or position status of the object via an alert configure utility.
  • the method may further comprise processing the image data generated by the camera device to monitor and/or track a location and/or a position of the object relative to the pre-defined location using an object tracking utility.
  • the object tracking utility comprises a Kernelized Correlation Filters algorithm to monitor and/or track a location and/or a position of the object relative to the pre-defined location.
  • the object tracking utility is operable with a first data set comprising at least one image of the object at the pre-defined location and at least a second data set comprising at least one image of the object moved from the pre-defined location to determine a displacement direction of the object from the pre-defined location.
  • the object tracking utility further comprises containers containing a plurality of trackers associated with a location and/or a position of the object, each tracker configured to identify the object within an image generated by the camera device.
  • the method may further comprise determining features and/or edges of an image based on the image data using a pixel-based feature detection utility to locate and/or confirm a location and/or position of the object relative to the pre-defined location.
  • the method may further comprise identifying a difference in intensity of pixels of at least a first image of the object at a first location and/or position and at least a second image of the object moved from the first location and/or position to a second location and/or position using edge and/or corner measurement and/or detection utilities.
  • the method comprises utilising a pixel-based feature detection utility that comprises an oriented Features from Accelerated Segment Test (FAST) algorithm and/or a rotated Binary Robust Independent Elementary Features (BRIEF) algorithm.
  • FAST Features from Accelerated Segment Test
  • BRIEF Binary Robust Independent Elementary Features
  • the step of locating and/or confirming a position of the object relative to the pre-defined location further comprises comparing feature vectors from an image of the object generated by the camera device with a library/template image of the object at the pre-defined location based on image data generated by the camera device using a Fast Library for Approximate Nearest Neighbours (FLANN) algorithm.
  • FLANN Fast Library for Approximate Nearest Neighbours
  • the method may comprise generating object distance data based on a distance measured between the object and a proximity sensor provided at the local electronic station; and/or generating GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location using a GPS sensor provided at the local electronic station.
  • a data acquisition module collects data from any one or a combination of the camera device, the proximity sensor and the GPS sensor.
  • an object distance utility processes the object distance data and compares measured distances to identify if the object has moved from the pre-defined location or is obstructed from detection by the proximity sensor.
  • the object distance utility processes the object distance data and calculates a reference threshold distance value.
  • the object distance utility is operative to prompt triggering of a recovery module to locate and/or confirm a position of the object relative to the pre-defined location if the object distance is detected to be less than the reference threshold distance value.
  • the method further comprises converting an image of the object and/or a support structure on which the object is mounted at the pre-defined location from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
  • the method further comprises monitoring an illumination intensity at the object at and/or proximate to the pre-defined location and prompting a system implementing the method to enter a sleep mode if the illumination intensity is below a pre-set illumination intensity threshold.
  • the method further comprises at least one or a combination of the following: processing the data using at least one processor; storing the data in at least one data storage; retrieving data from at least one reference data library; transmitting and/or receiving data via at least one network communication module.
  • apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: a local electronic station positionable opposed to the object to be monitored at the pre-defined location, the station comprising: a camera device to generate image data of the object at and/or proximate to the pre-defined location; a proximity sensor to generate object distance data based on a distance measured between the sensor and the object; and/or a GPS sensor to generate GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location; and at least one server hosting a user interface module to enable a user to manage at least one of generation, processing and output of data based on any one or a combination of the image data generated by the camera, the object distance data generated by the proximity sensor and GPS coordinate-based positional data generated by the GPS sensor; an object monitoring module to process the image data generated by the camera, the object distance data generated by the proximity sensor and/or the GPS coordinate-based positional data generated by the GPS sensor; and an alerts module to generate and/or issue an alert notification of a location and/or position of the object relative to the pre-defined location based on the data processed by the object monitoring module.
  • a method of monitoring and reporting a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: generating image data of the object using a camera device provided at a local electronic station positioned or positionable opposed to the object to be monitored at the pre-defined location; managing at least one of generation, processing and output of data associated with the location and/or position of the object based on the image data generated by the camera device; processing the image data generated by the camera device using an object monitoring module to monitor the location and/or position of the object at and/or proximate to the pre-defined location; and generating and/or issuing an alert notification based on the location and/or position of the object relative to the pre-defined location using the data processed by the object monitoring module.
  • Figure 1 is a schematic illustration of an architecture of the present apparatus/system to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location according to a specific implementation;
  • Figure 2 is a perspective schematic flow diagram of the interaction between components of the system of figure 1;
  • Figure 3 is a schematic illustration of a user interface enabling a user to identify and/or mark at least one object at a pre-defined location according to a specific implementation;
  • Figure 4 is a perspective schematic flow diagram of signal pathways provided and utilised by the system of figure 1;
  • Figure 5 is a schematic illustration of one aspect of a user interface of the system of figure 1;
  • Figure 6 is a perspective schematic flow diagram of object monitoring and tracking steps as part of the system of figure 1;
  • Figure 7 is a perspective schematic illustration of hardware components and layout of a local electronic station forming part of the system of figure 1 according to a specific implementation.
  • the present apparatus and system is configured to identify if an object that is being monitored is present at, removed from, missing or misplaced from a pre-defined location.
  • the present apparatus provides an automated or semi-automated electronics and computer hardware and software implemented system to monitor and report the location and/or positional status of a plurality of objects at and/or proximate to one or a plurality of pre-defined locations.
  • the plurality of objects are generally located at the same location being for example a health and safety board provided at a work, manufacturing, industrial, construction or other commercial site where personnel are operative.
  • Figure 1 illustrates the global architecture of the present system in which a local electronic station 11 is located adjacent or opposed to the objects to be monitored and provided in data communication with at least one remote hub (implemented on at least one server 12 to provide a cloud-based processing and data gathering resource).
  • Local electronic station 11 comprises an analogue or digital camera 13; a proximity sensor 14 and a GPS sensor 15.
  • the station 11 comprises a Raspberry Pi 3B+ 39 implemented with a 1.4 GHz 64-bit quad-core processor, dual-band wireless LAN, Bluetooth 4.2/BLE, ethernet and power-over-ethernet support components and functions.
  • Station 11 further comprises a voltage converter 18 coupled to the GPS sensor 14 via a proximity sensor unit 76.
  • Station 11 further comprises a 4G connection mode 75, an interface (incorporating a voltage converter 18 for proximity sensor 14), and a Cam/Prox integrator 20 coupled to a local agent API 49.
  • Station 11 is also provided with an object tracker module 16 having a computer vision utility 17.
  • Cam/Prox integrator 20 is coupled to the local agent API 49 for communication with hub 12 via advanced message queueing protocols (AMQPs) 22. Additionally, the Cam/Prox integrator 20 provides communication integration with the GPS sensor 15, proximity sensor 14 and object tracker module 16 via inter-process communication sockets 19. Camera 13 is further coupled to the object tracker module 16 via suitable electronics pathways. All the electronics components of station 11 are incorporated within an IP-65 enclosure 74 assembled and sealed to be weather and water resistant for mounting indoors or outdoors proximate to a set of objects for location and/or position monitoring. An illustrative sketch of station-to-hub messaging over AMQP is given at the end of this section.
  • AMQPs advanced message queueing protocols
  • Hub 12 is remote from local station 11 and comprises a plurality of process modules implemented as software and/or algorithms hosted on a server computer.
  • hub 12 comprises an object monitoring module 23; a user interface module 24 and an alert module 25 with all modules 23 to 25 provided in fast data communication with one another and with electronics station 11 and independently with a user interface 36.
  • communication between station 11 and hub 12 is provided by AMQPs with communication between hub 12 and user interface 36 being via WebSocket and REST 35.
  • Object monitoring module 23 is configured to issue to the user interface module 24 data streams/packets including a proximity feed 26 (based on distances between an object to be monitored and proximity sensor 14); objects feed 27 (including data based on 2D images of the objects to be monitored obtained from camera 13); and alerts feed 28 (representing alerts to be issued to a user providing a location and/or position status of the objects - such as an object being located at the desired and pre-defined location or being displaced from this location).
  • the user interface module 24 is adapted to issue feeds to the object monitoring module 23 to allow a user and/or an initial set-up operator to interface with the present location and/or position monitoring system 10 via user interface 36.
  • Such data transfer between modules 24 and 23 includes object control data 29; current data control 30 and alerts control 31.
  • Such data sets enable a user to identify objects (i.e. select and mark objects to be monitored), and to set a frequency, destination for and type of alerts to be issued.
  • Object monitoring module 23 is further configured to issue feeds to alert module 25 including alert feed 32; object feed 33 and proximity feed 34 based on data generated by camera 13, proximity sensor 14 and GPS sensor 15.
  • Alert module 25 is further configured to issue alerts to user 38 optionally via user interface 36 and/or other communication protocols such as email, SMS 37 as desired.
  • the present system 10 is adapted to provide a cloud-based utility to identify missing and misplaced objects and to notify end-users when abnormal events occur.
  • Processes according to the present system 10 may be divided into two categories, allowing for distribution of responsibilities/processing with the categories including processes run locally and processes run in-cloud (at server 12).
  • the cloud-based service along with a dedicated user interface 36, allow for the connecting of multiple local electronics stations 11 that each monitor status of targeted objects independently and output the appearance, position and/or location of the objects to the server hub 12 via computer vision module 17 and camera 13.
  • the GPS sensor 15 and proximity sensor 14 provide additional information to hub 12 to enhance the reliability of decision-making.
  • Hub 12 comprises a Digital Twin representation such that communication between hub 12 and station 11 is secured.
  • an operator/user is capable of defining targeted objects, managing all aspects of camera identification, sending notifications for identified abnormal events and controlling other aspects and functions of the present system 10 remotely via a secure data transfer.
  • the present system 10 is adapted for multiple sub-processes including in particular:
  • a status reporting process to report the status of objects of interest including appearance, location and/or position;
  • an alerting process adapted for sending appropriate alerts according to the pre-set configuration (by the user/operator) applied by the learning process.
  • Figure 2 illustrates the high-level data transfer between the local electronic station 11 and cloud-based hub 12.
  • a plurality of objects 47a, 47b, 47c, 47d and 47e are mounted on a suitable support structure 48 being a health and safety board located at a construction site.
  • Objects 47a to 47e may be health and safety equipment including for example an eyewash, bandage, medicines, health and safety procedure, instructions etc.
  • the location and/or position of the objects 47a to 47e at support 48 is monitored by camera 13.
  • the system comprises a data acquisition process 40 feeding the learning process 41 including a location, label and status of the objects.
  • the monitoring process 42 is fed by the learning process 41, with the monitoring process 42 configured to communicate with the recovery process 43 if any one of the objects 47a to 47e is identified as displaced, missing or removed from the pre-defined location at support 48.
  • the learning process 41, data acquisition 40, status monitoring 42 and recovery process 43 are implemented locally at station 11.
  • Learning process 41 and status monitoring processes 42 are coupled for data communication with hub 12 and the associated object monitoring module 23, user interface module 24 and alert module 25.
  • the system 10 is adapted to be self-learning by the learning process 41 and object monitoring module 23 so as to optimise status monitoring according to any one of changes in object size, appearance, position, location, illumination intensity level etc.
  • a user interface framework 46 allows a user via a user interface 36 to tailor the system settings and the way in which objects 47a to 47e are monitored and object status is alerted 44, 45 via server 12.
  • FIG 3 illustrates aspects of the user interface framework 46 implemented with the user interface module 24 and user interface platform 36.
  • Frame 50 provides a means to configure system 10 for the monitoring of multiple stations 11, each having multiple objects 47a to 47e. Accordingly, a user is enabled to identify objects 47a to 47e at each separate and remote location and to specifically configure the system 10 as required.
  • each separate station (electronics board) 11 is capable of being set up in which the objects are individually identified and marked.
  • frame 77 of user interface 36 (implemented on hub 12) provides the learning process 41 that allows the operator to configure necessary parameters. This includes the marking of objects of interest. In particular, using 2D image data obtained by camera 13 at station 11, object 47a is marked by drawing a rectangle 63 around it.
  • Object labels 65 may then be added, and actions and alerts configured 66. Further object notification, status and information labelling may be configured via tile 67 including configuration of alert messages (e.g. ‘eyewash is missing, please investigate’), an alert type (i.e. repeated) and an alert interval (delay between object anomaly detection and alert issue).
  • a data transfer pathway map is illustrated in figure 4.
  • signal data is generated 52.
  • Such signal data 52 is divided into image data 53 (generated by camera 13); GPS positional coordinate data 54 (generated by GPS sensor 15) and proximity/distance data 55 (generated by proximity sensor 14).
  • the various data sets 53, 54, 55 are relayed to the implementing modules of system 10 to feed the various processes including the learning process 41, the object status/reporting process 57, the object monitoring/tracking process 42 and the recovery process 43.
  • Present system 10 is adapted to provide: a communication pathway 59 between GPS coordinates data 54 and the status process 57; a data transfer pathway 60 between the proximity distance data 55 and the recovery process 43; a data pathway 62 between image data 53 and tracking/monitoring process 42; and a data pathway 61 between the status and recovery processes 57, 43.
  • the system 10 collects all data from the various sensors 13, 14, 15 and feeds them into the hub 12 for processing. As indicated, local processing may occur initially prior to issuing to hub 12.
  • the image data 53 acquired by camera 13 is used for tracking, recovery and status reporting; the GPS coordinates data 54 is sent directly to the status reporting process 57 and the proximity distance data 55 is collected through proximity sensor 14 and is used to improve the efficiency and accuracy of both the decision-making process and the recovery process 43.
  • the learning process 41 enables an operator to configure the various operative parameters via web-based user interface 36.
  • This learning process 41, via example frames 50, 51, 77 and the associated fields, menu selections, labels and tiles 65, 66, 67, includes functions to mark objects of interest by annotating/editing images generated by camera 13 at local station 11, i.e. by drawing a rectangle 63, 64 around such objects on these images captured by camera 13.
  • An identified targeted image may then be labelled via label 65.
  • the customisation of alerts including type, delivery destination, frequency etc., may then be configured via tiles 66, 67 for example enabling a user to configure multiple or single alerts to be sent depending upon how long the object of interest is missing or displaced from a pre-defined location at board 48.
  • the object monitoring process 42 using object monitoring module 23 provides that data is returned from the learning process 41 to the local electronic station 11 and is used to initialise the monitoring process 42.
  • This functionality is implemented using a computer vision-based tracking method and tracker process module 16 and computer vision module 17.
  • the vision-based tracking method utilises the Kernelized Correlation Filters (KCF) algorithm available from OpenCV libraries (an illustrative sketch of such a tracker is given at the end of this section).
  • the recovery process 43 utilises the oriented FAST and rotated BRIEF (ORB) algorithm to locate an object and report a new position when displaced from the original pre-defined location. Accordingly, the supplementary recovery process 43 allows for reinitialisation of a moved or lost object and continued monitoring.
  • the recovery process is triggered whenever an object of interest (as determined by a user) is missing for longer than a pre-defined time period. For example, an object may have been removed from board 48 for a time period and may be placed back on the board 48 at a new location. Accordingly, the recovery process (using image data generated by camera 13) provides reinitialisation of the object's pre-defined location and continued monitoring.
  • the status process 57 involves the sending of all appropriate data from station 11 to server 12. This data contains current image and status data for all tracked objects 47a to 47e, with the status being updated whenever the system detects anomalies associated with each individually tracked object.
  • the alerting process via alert module 25 is a server-based configuration of the system 10 responsible for sending appropriate warnings, notifications and alerts in direct response to a change in position, location and/or appearance of an object at the pre-defined location. Such a location could be the original position of mounting of an object at board 48 or a new position (for example if an object has been removed from board 48, used and then returned but mounted in a different position). If an object of interest is reported lost by the status reporting module for long enough, the alerting process will take appropriate action according to its configuration (as provided and set by an operator during the learning process) to issue a notification or alert.
  • the present system 10 is configured for initial automatic detection of board 48 using an analysis utility in data communication with station 11 and in particular the images generated by camera 13.
  • a first frame is captured and analysed by the analysis utility to detect board 48.
  • the image is then converted from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
  • RGB red-green-blue
  • HSV hue-saturation value
  • the HSV colour space allows for much easier colour separation and detection.
  • the next step is to find the largest green object in the frame after binarising the result, thereby extracting the board position.
  • the step involves creating a mask of the safety board which can be used to crop future images to contain only a pre-defined area of board 48. This is advantageous to reduce computational demand by eliminating from the processing area regions that do not contain objects of interest. If the system 10 is unable to find a large green object, the default setting is to ignore the masking procedure. An illustrative sketch of this board-detection step is given at the end of this section.
  • tracking process 42 utilises KCF to monitor objects of interest by ‘trackers’ using tracking-by-detection methods.
  • An object 47a is marked by a user via the earlier learning process 41 (involving user interface module 24), object monitoring module 23 and selected components of the station (electronics board) 11.
  • the identified objects are fed into the KCF and used as positive samples to train a ‘classifier’. Multiple samples from the rest of the image are fed as negative samples (background).
  • in the present system 10, via the object tracking process 42, a score assigned to each location into which an object is moved helps determine the movement direction of the tracked object.
  • the frame with movement is used as a positive sample to further train the classifier.
  • KCF is advantageous due to its characteristics of being ‘strict’ when processing object movement.
  • Tracking process 42 is utilised to report whether an object of interest is in a scene or not.
  • ‘containers’ are built containing a list of trackers that track each object 47a to 47e individually, with this being created from data received from the cloud server 12 after initialisation and running of the learning process 41 described with reference to figure 2.
  • the next stage is for each tracker to update its status on the provided frame.
  • the process triggers the recovery process 43 to locate the object at board 48 (for example with the object being returned by personnel to a different location at board 48).
  • the tracking process involves initial identification of objects of interest at stage 70 and the creation of a container having multiple trackers at stage 71. This data is generated based on the 2D image data collected by camera 13 involving data acquisition step 40. The tracking process is implemented in direct response to the image data acquisition at stage 72 to then trigger 73 the recovery process 43, as appropriate.
  • the recovery process 43 may be considered supplementary to the primary monitoring process 42 utilising object monitoring module 23.
  • the recovery process 43 is triggered by the monitoring process 42 once an object has been categorised as ‘misplaced’ or ‘lost’ beyond a pre-defined time period.
  • data containing the positions of objects of interest is used to initiate the process of saving a template image for every object individually. Every template image contains features unique to a particular object. Finding the lost object is a matter of finding its features in the query image (current frame seen by the system).
  • the recovery process is based on ORB, which consists of Oriented FAST and Rotated BRIEF algorithms.
  • the first step is to detect features on the image (so-called keypoints). These can be any changes in pixel intensity when changing directions, the strongest being edges and corners.
  • To perform this task the Features from Accelerated Segment Test (FAST) algorithm is used. This applies the Harris Corner Measure to find the best points on the provided image. It finds the difference in intensity for a given displacement, in all directions. It also uses a pyramid technique to construct multi-scale features. This ensures the recovery process is compatible with scaling issues (the object of interest might be closer or further in the scene). Finally, it computes the intensity weighted centroid of the patch with the located corner at the centre.
  • BRIEF Binary Robust Independent Elementary Features
  • FLANN Fast Library for Approximate Nearest Neighbours
  • the recovery process also makes sure that only unique features are being used. Every feature from the template images is compared to at least the two closest features on the query image. If correlations between a template keypoint and the two consecutive, closest features from the query image are too similar, then that particular feature is disregarded as not unique enough. An illustrative sketch of this ORB and FLANN based recovery step is given at the end of this section.
  • proximity sensor 14 is utilised to generate object distance data corresponding to a distance measured between sensor 14 (at station 11) and an object 47a to 47e at board 48.
  • This distance data is utilised as part of the status process 57 and recovery process 43 to identify if an object has moved from the initial pre-defined location at board 48 or is obstructed from detection by the proximity sensor 14, for example by a vehicle or person obstructing the direct ‘line of sight’ path from sensor 14.
  • the proximity sensor distance data is utilised in two different ways. Firstly, it boosts decision-making performance by introducing another measure in the form of fluctuations of the distance signal.
  • the system can deduce that objects of interest are occluded (for example by a large vehicle parked in front of the board 48).
  • the second implementation is quite similar to the recovery system. After the learning procedure, a series of distance signals are stored. Next, these values are processed and a threshold is calculated, representing a reference point for the system. Before issuing a recovery procedure for a lost object, the current distance value is checked and compared to the reference value. If it exceeds the reference threshold, the system does not trigger the recovery procedure due to the high probability of the scene being obstructed. This not only helps avoid running a computationally expensive algorithm whenever there are signs it would not be able to perform its task, but also reduces the number of false positives. An illustrative sketch of this threshold check is given at the end of this section.
  • the present system 10 further comprises energy saving modules, utilities and functions via an automatic start-up module connected to a power supply.
  • the system 10 automatically starts after connecting to a suitable power supply. No further steps are required by an operator.
  • Station 11 detects and connects to server 12 automatically. Once powered, if the illumination environment at board 48 is insufficient, system 10 is configured to go into a ‘sleep mode’ automatically by calculating the average intensity of the images obtained by camera 13. Once the average intensity exceeds a pre-set threshold, system 10 will wake automatically and continue execution of tracking process 42. This functionality allows system 10 to work continuously and without manual start-up. An illustrative sketch of this sleep/wake behaviour is given at the end of this section.
  • the main computation unit is a Raspberry Pi 3B+ working under the Raspbian system.
  • Camera 13 is preferably a Pi Camera V2 to capture images and is advantageous for efficient performance and low cost.
  • GPS sensor may comprise an Adafruit Ultimate GPS sensor model providing high accuracy and stability.
  • Proximity sensor 14 may be implemented as a weatherproof JSN-SR04T-2.0 proximity sensor. Such a sensor may be used in both wet and dry working conditions.
  • internet connection is reliably provided by built-in Wi-Fi or a suitable 4G modem (as will be appreciated other modems are suitable).
  • Both voltage conversion and IP-65 rated enclosures may be customised by appropriate providers to meet set standards and requirements with one specific implementation of the hardware components at station 11 detailed in figure 7, with reference to figure 1.
  • the present system 10, comprising local electronic station 11 and server 12, utilises various electronic components, architectures and functionality including at least one processor; at least one data storage; at least one reference data library; appropriate network communication modules including transceivers, transmitters and/or receivers; electronics boards such as printed circuit boards and/or programmable logic devices.
  • Such components may be provided specifically at local station 11 and/or implemented within a cloud server (hub 12).
  • Data reference libraries may be utilised by the various modules including the object monitoring module 23, user interface module 24, alert module 25 and/or tracking process module 16.
  • Such data libraries may be acquired from third parties and/or built locally as required.
  • Such local building may be a product of the data generated by the sensors including the camera 13, proximity sensor 14 and/or GPS sensor 15.
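
The station-to-hub status reporting described above is specified only as being carried over AMQP; as a minimal, hedged sketch (the pika client, broker host, queue name and JSON payload are assumptions, not details from the disclosure), the local agent might publish per-object status to the hub as follows.

```python
import json
import pika  # a commonly used Python AMQP client; not named in the disclosure


def publish_status(host, station_id, object_status):
    """Send the current per-object status (label -> bounding box or None) to the hub."""
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
    channel = connection.channel()
    channel.queue_declare(queue="station.status", durable=True)   # assumed queue name
    payload = {"station": station_id, "objects": object_status}
    channel.basic_publish(
        exchange="",
        routing_key="station.status",
        body=json.dumps(payload),
        properties=pika.BasicProperties(delivery_mode=2),         # persist the message
    )
    connection.close()
```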
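
The automatic board-detection step (RGB to HSV conversion, binarisation and selection of the largest green region as a mask) could be sketched in Python/OpenCV roughly as below; the HSV bounds for ‘green’ and the OpenCV 4.x findContours signature are assumptions rather than values given in the disclosure.

```python
import cv2
import numpy as np


def detect_board_mask(frame_bgr,
                      lower_green=(35, 60, 40),      # assumed HSV bounds for 'green'
                      upper_green=(85, 255, 255)):
    """Return a mask covering the largest green region (the safety board), or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)                # RGB/BGR image -> HSV image
    binary = cv2.inRange(hsv, np.array(lower_green), np.array(upper_green))
    # OpenCV 4.x signature; OpenCV 3.x returns an extra value.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                                                 # no large green object: skip masking
    board = max(contours, key=cv2.contourArea)
    mask = np.zeros(binary.shape, dtype=np.uint8)
    cv2.drawContours(mask, [board], -1, 255, thickness=cv2.FILLED)
    return mask


# Illustrative usage: restrict future frames to the board area only.
# mask = detect_board_mask(frame)
# roi = cv2.bitwise_and(frame, frame, mask=mask) if mask is not None else frame
```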
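
A minimal sketch of the ‘container’ of per-object KCF trackers used by the monitoring process is given below, assuming the user-drawn rectangles from the learning process are available as bounding boxes; note that the KCF factory function lives in different places in different OpenCV builds.

```python
import cv2


def create_kcf_tracker():
    # The factory function moved between OpenCV releases; try both locations.
    if hasattr(cv2, "TrackerKCF_create"):
        return cv2.TrackerKCF_create()
    return cv2.legacy.TrackerKCF_create()


def build_container(first_frame, marked_objects):
    """marked_objects: {label: (x, y, w, h)} rectangles drawn during the learning process."""
    container = {}
    for label, bbox in marked_objects.items():
        tracker = create_kcf_tracker()
        tracker.init(first_frame, bbox)        # positive sample: the marked object
        container[label] = tracker
    return container


def update_container(container, frame):
    """Return {label: bbox or None}; None flags a missing/misplaced object."""
    status = {}
    for label, tracker in container.items():
        ok, bbox = tracker.update(frame)
        status[label] = bbox if ok else None   # a failed update may trigger the recovery step
    return status
```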
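
The recovery step (ORB keypoints on a saved template image, FLANN matching against the current frame, and the ‘two closest features’ uniqueness test) might look broadly like this; the LSH index parameters, the 0.75 ratio and the minimum match count are common defaults assumed here, not figures from the disclosure.

```python
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)     # oriented FAST keypoints + rotated BRIEF descriptors

# FLANN configured for binary (BRIEF) descriptors via locality-sensitive hashing.
FLANN_INDEX_LSH = 6
flann = cv2.FlannBasedMatcher(
    dict(algorithm=FLANN_INDEX_LSH, table_number=6, key_size=12, multi_probe_level=1),
    dict(checks=50))


def locate_template(template_gray, query_gray, ratio=0.75, min_matches=10):
    """Return an estimated centre of the template in the query frame, or None if not recovered."""
    kp_t, des_t = orb.detectAndCompute(template_gray, None)
    kp_q, des_q = orb.detectAndCompute(query_gray, None)
    if des_t is None or des_q is None:
        return None
    good = []
    for pair in flann.knnMatch(des_t, des_q, k=2):
        if len(pair) < 2:
            continue
        best, second = pair
        if best.distance < ratio * second.distance:   # keep only sufficiently unique features
            good.append(best)
    if len(good) < min_matches:
        return None
    pts = np.float32([kp_q[m.trainIdx].pt for m in good])
    return pts.mean(axis=0)                           # rough new position of the object
```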
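
The proximity-sensor gate placed in front of the recovery procedure can be expressed as a simple threshold check; the disclosure states only that a reference threshold is calculated from stored distance signals, so the mean-plus-deviation formula below is an assumption.

```python
from statistics import mean, stdev


def reference_threshold(baseline_distances, k=3.0):
    """Reference value derived from distance signals stored after the learning phase.
    The exact formula is not specified; mean + k * stdev is an assumption."""
    return mean(baseline_distances) + k * stdev(baseline_distances)


def may_run_recovery(current_distance, threshold):
    # Following the described behaviour: a reading that exceeds the reference
    # threshold suggests the scene is obstructed, so the computationally
    # expensive recovery procedure is skipped; otherwise it may be triggered.
    return current_distance <= threshold
```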
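
The illumination-driven sleep/wake behaviour of the automatic start-up module could be sketched as follows; the intensity threshold, polling interval and camera object are illustrative assumptions only.

```python
import time

import cv2

INTENSITY_THRESHOLD = 40      # assumed mean grey level below which the scene is 'dark'
POLL_SECONDS = 60             # assumed interval between checks while asleep


def average_intensity(frame_bgr):
    """Mean grey level of a frame, used as a proxy for illumination intensity."""
    return cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).mean()


def run(camera, track_one_frame):
    """Sleep while the board is too dark, resume tracking once it is lit again.
    camera is assumed to behave like cv2.VideoCapture; track_one_frame is the tracking step."""
    while True:
        ok, frame = camera.read()
        if not ok:
            time.sleep(POLL_SECONDS)
            continue
        if average_intensity(frame) < INTENSITY_THRESHOLD:
            time.sleep(POLL_SECONDS)          # 'sleep mode': no tracking, low power
        else:
            track_one_frame(frame)            # normal monitoring/tracking step
```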

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Tourism & Hospitality (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Primary Health Care (AREA)
  • Development Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Computer Security & Cryptography (AREA)
  • Educational Administration (AREA)
  • Image Analysis (AREA)
  • Alarm Systems (AREA)

Abstract

Apparatus, a method and a system to monitor and report a location and/or a positional status of an object at and/or proximate to a pre-defined location. The present system is adapted for the automated or semi-automated monitoring of an object to identify if the object has moved relative to its original desired location. The present system utilises an imaging camera device that generates 2D image data which is processed and used to generate notifications and/or alerts to a user that an object has moved, is missing or otherwise not at its intended pre-defined location.

Description

Object Location Status Monitoring Apparatus and Method
Field of invention
The present invention relates to a system, method and apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location, and in particular although not exclusively, to apparatus and method to identify displacement of the object from the pre-defined location automatically and/or remotely.
Background
The construction industry is founded on well-established processes and methods. It is common for this sector, to be reluctant to explore, consider and adopt new technologies, utilised in other sectors, due to external and internal pressures such as time constraints, cost and general attitudes.
More recently, the construction industry has undergone developments such as the emergence of more off-site manufacturing-based approaches in which small, medium and large-scale building projects utilise off-site construction and assembly in factory -based environments. In such situations, the final building/construction site may be considered an extension of the factory environment, with such sites having to align practices, standards and approaches to improve productivity.
Internationally, construction sites are highly regulated according to health, safety and environmental regulations and this has required additional monitoring and management resources. Due to pressures on availability of skilled and experienced construction personnel and professionals, efforts are being made to maximise the time and effectiveness of such personnel. One opportunity for maximising the efficiency of personnel’s time and effort is to provide appropriate tools and procedures to facilitate working practices and avoid high-value personnel wasting time with tasks that are or at least should be considered ancillary to their primary objectives. In particular, commonly construction projects involve a lot of human effort spent checking and verifying on-site situations associated with health, safety and environmental matters. These efforts do not necessarily add value to a project. One such aspect is the requirement to maintain health and safety items at a pre-defmed location such as a safety equipment/medical board. Conventionally, one or a number of responsible site personnel are required to manually monitor and log the presence and/or absence of objects at the safety board to ensure compliance with the appropriate regulations.
Accordingly, there is a need for apparatus and a method to enhance the efficiency of object location and/or position monitoring according to a remote, automated and/or semi-automated system.
Summary of the Invention
It is an objective of the present invention to provide apparatus, a method and/or a system to allow automated, semi-automated and/or remote monitoring of objects at a pre-defined location and to identify and alert a user that an object has been taken, is misplaced and/or has moved relative to the pre-defined location. It is a further objective to provide apparatus, a method and a system offering electronic and internet-connected functionality to monitor and report an appearance, location and/or positional status of at least one object at and/or proximate to a pre-defined location.
It is a further objective to provide apparatus, a method and a system that may be considered fully or partially ‘self-learning’ such that, following an initial ‘set-up’, the system is capable of monitoring and reporting the status of the location and/or position of an object relative to the pre-defined location autonomously and/or with minimal manual input and intervention.
Accordingly, the inventors provide an electronics and internet-connected or internet-of-things implemented system in which a local electronic station is configured with a suitable camera device to generate image data that may be processed via suitable modules, utilities, processes and/or algorithms to generate a live location and/or positional status alert or notification to a user that an object is at its desired pre-defined location or has been moved, is misplaced, lost or otherwise displaced from the original and intended pre-defined location.
Reference within this specification to the location status refers to the location of an object at or close to the intended pre-defined location, such as an object mounted on a resource support (i.e. a health and safety item located at a safety board typically found at a construction site). Reference within this specification to a position status of an object encompasses a positional movement of the object partially or fully from the intended and desired pre-defined location with such positional movement being negligible, small, partial or complete, where complete positional movement may be considered to be an external boundary or edge of an object being entirely displaced from an original position when a first 2D image of the object is compared to a second 2D image of the object.
Reference within this specification to a module encompasses electronic hardware components including a printed circuit board, a processor, a storage utility, a power supply, a network communication device and/or software running on such components. Reference within this specification to a utility encompasses software, algorithms, computer programs and the like running on electronic components as described herein and configured for analogue and/or digital data processing in which data is processed by at least one processor with data transferred to and from electronic components, data storage utilities, reference libraries, data libraries, local area networks, local electronics stations, servers, cloud and internet-connected devices.
Reference within this specification to an object ‘proximate’ to a location encompasses an object at a general location such as a mounting or display board. This includes the object being at a pre-defined location on the board but also includes the object being moved/displaced slightly from an original position based on a comparison of 2D images of the object at the general location. The term ‘proximate’ encompasses the object at a second position or location being close or near to a first original position or location.
According to a first aspect of the present invention there is provided apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: a local electronic station positionable opposed to the object to be monitored at the pre-defined location, the station having a camera device to generate image data of the object at and/or proximate to the pre-defined location; a user interface module to enable a user to manage at least one of generation, processing and output of data based on the image data generated by the camera device; an object monitoring module to process the image data generated by the camera device; and an alerts module to generate and/or issue an alert notification of a location and/or position of the object relative to the pre-defined location based on the image data processed by the object monitoring module; the user interface module, object monitoring module and/or alerts module provided at the station and/or at least one server located remote from the station.
Optionally, the apparatus further comprises a learning module that utilises the user interface module, the learning module having an identification utility and/or a marking utility to allow a user to identify and/or mark the at least one object at the pre-defined location within the image data. Optionally, the learning module further comprises an alert configure utility to allow a user to configure a type, frequency, format and/or delivery destination of an alert notification of a location and/or position status of the object.
Optionally, the object monitoring module is provided with an object tracking utility to process the image data generated by the camera device and to monitor and/or track a location and/or a position of the object relative to the pre-defined location.
Optionally, the object tracking utility comprises a Kernelized Correlation Filters algorithm to monitor and/or track a location and/or a position of the object relative to the pre-defined location. Optionally, the object tracking utility is operable with a first data set comprising at least one image of the object at the pre-defined location and at least a second data set comprising at least one image of the object moved from the pre-defined location to determine a displacement direction of the object from the pre-defined location.
Optionally, the object tracking utility further comprises containers containing a plurality of trackers associated with a location and/or a position of the object, each tracker configured to identify the object within an image generated by the camera device.
Optionally, the present apparatus further comprises a pixel-based feature detection utility, the object tracking utility operable with the pixel-based feature detection utility to determine features and/or edges of an image based on the image data and configured to locate and/or confirm a position of the object relative to the pre-defined location.
Optionally, the present apparatus further comprises a recovery module provided with or operable with a pixel-based feature detection utility to determine features and/or edges of an image based on the image data and configured to locate and/or confirm a position of the object relative to the pre-defined location. Optionally, the recovery module further comprises edge and/or corner measurement and/or detection utilities to identify a difference in intensity of pixels of at least a first image of the object at a first location and/or position and at least a second image of the object moved from the first location and/or position to a second location and/or position. Optionally, the pixel-based feature detection utility comprises an Orientated Features Accelerated Segment Test (FAST) algorithm and/or a rotated Binary Robust Independent Elementary Features (BRIEF) algorithm.
Optionally, the recovery module further comprises a Fast Library for Approximate Nearest Neighbours (FLANN) algorithm to compare feature vectors from an image of the object generated by the camera device with a library/template image of the object at the pre-defined location based on image data generated by the camera device.
Optionally, the object tracking utility is provided in data communication with the recovery module to locate a position and/or location of the object relative to the pre-defined location.
Optionally, the local electronic station further comprises: a proximity sensor to generate object distance data based on a distance measured between the sensor and the object; and/or a GPS sensor to generate GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location.
Optionally, the apparatus may further comprise a data acquisition module to collect data from any one or a combination of the camera device, the proximity sensor and the GPS sensor.
Optionally, the apparatus may further comprise an object distance utility to process the object distance data and compare measured distances to identify if the object has moved from the pre-defined location and/or is obstructed from detection by the proximity sensor. Optionally, the object distance utility is further configured to process the object distance data and calculate a reference threshold distance value. Optionally, the object distance utility is provided in data communication with the recovery module and operative to prompt triggering of the recovery module if the object distance is measured by the proximity sensor and is less than the reference threshold distance value. Optionally, the apparatus may further comprise an analysis utility in data communication with the camera device to convert an image of the object and/or support structure on which the object is mounted at the pre-defined location from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
Optionally, the apparatus may further comprise an automatic start-up module connected to a power supply and configured to monitor an illumination intensity at or proximate to the object and to prompt the apparatus to enter a sleep mode if the illumination intensity is below a pre-set illumination intensity threshold.
Optionally, the modules and utilities as described herein may be located at the station and/or the at least one server. Optionally, the apparatus may further comprise at least one or a combination of the following electronic components: at least one processor; at least one data storage; at least one reference data library; at least one network communication module, including a transceiver, transmitter and/or receiver; an electronics board, a printed circuit board and/or a programmable logic device; wherein said electronic components are provided at the local electronic station and/or the server.
According to a further aspect of the present invention there is provided a method of monitoring and reporting a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: generating image data of the object using a camera device provided at a local electronic station positioned or positionable opposed to the object to be monitored at the pre-defined location; managing at least one of generation, processing and output of data associated with the location and/or position of the object based on the image data generated by the camera device using a user interface module; processing the image data generated by the camera device using an object monitoring module to monitor the location and/or position of the object at and/or proximate to the pre-defined location; and generating and/or issuing an alert notification based on the location and/or position of the object relative to the pre-defined location using an alerts module based on the data processed by the object monitoring module. Optionally, the user interface module, the object monitoring module and/or the alerts module are provided at the station and/or at least one server located remote from the station.
Optionally, the method may further comprise identifying and/or marking the object within the image data generated by the camera device using an identification utility and/or a marking utility being part of a learning module that forms part of or is operable with the user interface module.
Optionally, the method may further comprise enabling a user to configure a type, frequency, format and/or delivery destination of an alert notification of a location and/or position status of the object via an alert configure utility.
Optionally, the method may further comprise processing the image data generated by the camera device to monitor and/or track a location and/or a position of the object relative to the pre-defined location using an object tracking utility.
Optionally, the object tracking utility comprises a Kernelized Correlation Filters algorithm to monitor and/or track a location and/or a position of the object relative to the pre-defined location. Optionally, the object tracking utility is operable with a first data set comprising at least one image of the object at the pre-defined location and at least a second data set comprising at least one image of the object moved from the pre-defined location to determine a displacement direction of the object from the pre-defined location.
Optionally, the object tracking utility further comprises containers containing a plurality of trackers associated with a location and/or a position of the object, each tracker configured to identify the object within an image generated by the camera device.
Optionally, the method may further comprise determining features and/or edges of an image based on the image data using a pixel-based feature detection utility to locate and/or confirm a location and/or position of the object relative to the pre-defined location. Optionally, the method may further comprise identifying a difference in intensity of pixels of at least a first image of the object at a first location and/or position and at least a second image of the object moved from the first location and/or position to a second location and/or position using edge and/or corner measurement and/or detection utilities.
Optionally, the method comprises utilising a pixel-based feature detection utility that comprises an Orientated Features Accelerated Segment Test (FAST) algorithm and/or a rotated Binary Robust Independent Elementary Features (BRIEF) algorithm.
Optionally, the step of locating and/or confirming a position of the object relative to the pre-defined location further comprises comparing feature vectors from an image of the object generated by the camera device with a library/template image of the object at the pre-defined location based on image data generated by the camera device using a Fast Library for Approximate Nearest Neighbours (FLANN) algorithm.
Optionally, the method may comprise generating object distance data based on a distance measured between the object and a proximity sensor provided at the local electronic station; and/or generating GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location using a GPS sensor provided at the local electronic station. Optionally, a data acquisition module collects data from any one or a combination of the camera device, the proximity sensor and the GPS sensor. Optionally, an object distance utility processes the object distance data and compares measured distances to identify if the object has moved from the pre-defined location or is obstructed from detection by the proximity sensor. Optionally, the object distance utility processes the object distance data and calculates a reference threshold distance value. Optionally, the object distance utility is operative to prompt triggering of a recovery module to locate and/or confirm a position of the object relative to the pre-defined location if the object distance is detected to be less than the reference threshold distance value.
Optionally, the method further comprises converting an image of the object and/or a support structure on which the object is mounted at the pre-defined location from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image. Optionally, the method further comprises monitoring an illumination intensity at the object at and/or proximate to the pre-defined location and prompting a system implementing the method to enter a sleep mode if the illumination intensity is below a pre-set illumination intensity threshold.
Optionally, the method further comprises at least one or a combination of the following: processing the data using at least one processor; storing the data in at least one data storage; retrieving data from at least one reference data library; transmitting and/or receiving data via at least one network communication module.
According to a further aspect of the present invention there is provided apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: a local electronic station positionable opposed to the object to be monitored at the pre-defined location, the station comprising: a camera device to generate image data of the object at and/or proximate to the pre-defined location; a proximity sensor to generate object distance data based on a distance measured between the sensor and the object; and/or a GPS sensor to generate GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location; and at least one server hosting a user interface module to enable a user to manage at least one of generation, processing and output of data based on any one or a combination of the image data generated by the camera, the object distance data generated by the proximity sensor and GPS coordinate-based positional data generated by the GPS sensor; an object monitoring module to process the image data generated by the camera, the object distance data generated by the proximity sensor and/or the GPS coordinate-based positional data generated by the GPS sensor; and an alerts module to generate and/or issue an alert notification of a location and/or position of the object relative to the pre-defined location based on the data processed by the object monitoring module.
According to a further aspect of the present invention there is provided a method of monitoring and reporting a location and/or position status of at least one object at and/or proximate to a pre-defined location comprising: generating image data of the object using a camera device provided at a local electronic station positioned or positionable opposed to the object to be monitored at the pre-defined location; managing at least one of generation, processing and output of data associated with the location and/or position of the object based on the image data generated by the camera device; processing the image data generated by the camera device using an object monitoring module to monitor the location and/or position of the object at and/or proximate to the pre-defined location; and generating and/or issuing an alert notification based on the location and/or position of the object relative to the pre-defined location based on the data processed by the object monitoring module.
Brief description of drawings
A specific implementation of the present invention will now be described, by way of example only, and with reference to the accompanying drawings in which:
Figure 1 is a schematic illustration of an architecture of the present apparatus/system to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defined location according to a specific implementation;
Figure 2 is a perspective schematic flow diagram of the interaction between components of the system of figure 1;
Figure 3 is a schematic illustration of a user interface enabling a user to identify and/or mark at least one object at a pre-defined location according to a specific implementation;
Figure 4 is a perspective schematic flow diagram of signal pathways provided and utilised by the system of figure 1;
Figure 5 is a schematic illustration of one aspect of a user interface of the system of figure 1;
Figure 6 is a perspective schematic flow diagram of object monitoring and tracking steps as part of the system of figure 1;
Figure 7 is a perspective schematic illustration of hardware components and layout of a local electronic station forming part of the system of figure 1 according to a specific implementation.
Detailed description of preferred embodiment of the invention
The present apparatus and system is configured to identify if an object that is being monitored is present at, removed from, missing from or misplaced from a pre-defined location.
The present apparatus provides an automated or semi-automated electronics and computer hardware and software implemented system to monitor and report the location and/or positional status of a plurality of objects at and/or proximate to one or a plurality of pre-defined locations. According to preferred implementations, the plurality of objects (N objects) are generally located at the same location, being for example a health and safety board provided at a work, manufacturing, industrial, construction or other commercial site where personnel are operative.
Figure 1 illustrates the global architecture of the present system in which a local electronic station 11 is located adjacent or opposed to the objects to be monitored and provided in data communication with at least one remote hub (implemented on at least one server 12 to provide a cloud-based processing and data gathering resource).
Local electronic station 11 comprises an analogue or digital camera 13; a proximity sensor 14 and a GPS sensor 15. Referring to figures 1 and 7, the station 11 comprises a Raspberry Pi 3B+ 39 implemented with a 1.4GHz 64-bit quad-core processor, dual-band wireless LAN, Bluetooth 4.2/BLE, Ethernet and power-over-Ethernet support components and functions. Station 11 further comprises a voltage converter 18 coupled to the GPS sensor 14 via a proximity sensor unit 76. Station 11 further comprises a 4G connection modem 75, an interface (incorporating a voltage converter 18 for proximity sensor 14), and a Cam/Prox integrator 20 coupled to a local agent API 49. Station 11 is also provided with an object tracker module 16 having a computer vision utility 17. According to the specific implementation, Cam/Prox integrator 20 is coupled to the local agent API 49 for communication with hub 12 via advanced message queueing protocols (AMQPs) 22. Additionally, the Cam/Prox integrator 20 provides communication integration with the GPS sensor 15, proximity sensor 14 and object tracker module 16 via inter-process communication sockets 19. Camera 13 is further coupled to the object tracker module 16 via suitable electronics pathways. All the electronics components of station 11 are incorporated within an IP-65 enclosure 74 assembled and sealed to be weather and water resistant for mounting indoors or outdoors proximate to a set of objects for location and/or position monitoring.
Hub 12, according to the specific implementation, is remote from local station 11 and comprises a plurality of process modules implemented as software and/or algorithms hosted on a server computer. In particular, hub 12 comprises an object monitoring module 23; a user interface module 24 and an alert module 25 with all modules 23 to 25 provided in fast data communication with one another and with electronics station 11 and independently with a user interface 36. As indicated, communication between station 11 and hub 12 is provided by AMQPs with communication between hub 12 and user interface 36 being via WebSocket and REST 35.
Object monitoring module 23 is configured to issue to the user interface module 24 data streams/packets including a proximity feed 26 (based on distances between an object to be monitored and proximity sensor 14); objects feed 27 (including data based on 2D images of the objects to be monitored obtained from camera 13); and alerts feed 28 (representing alerts to be issued to a user providing a location and/or position status of the objects - such as an object being located at the desired and pre-defined location or being displaced from this location). The user interface module 24 is adapted to issue feeds to the object monitoring module 23 to allow a user and/or an initial set-up operator to interface with the present location and/or position monitoring system 10 via user interface 36. Such data transfer between modules 24 and 23 includes object control data 29; current data control 30 and alerts control 31. Such data sets enable a user to identify objects (i.e. select and mark objects to be monitored), and to set a frequency, destination for and type of alerts to be issued. Object monitoring module 23 is further configured to issue feeds to alert module 25 including alert feed 32; object feed 33 and proximity feed 34 based on data generated by camera 13, proximity sensor 14 and GPS sensor 15. Alert module 25 is further configured to issue alerts to user 38 optionally via user interface 36 and/or other communication protocols such as email, SMS 37 as desired.
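By way of illustration only, such data streams/packets may be represented as simple structured messages. The field names and values below are assumptions made for the purpose of this sketch and are not prescribed by the present system:

# Illustrative only: all field names are assumptions, not the schema of the present system.
objects_feed_message = {
    "station_id": "station-01",          # hypothetical identifier for local station 11
    "timestamp": "2021-02-10T09:15:00Z",
    "objects": [
        {"label": "eyewash", "status": "present", "bbox": [120, 80, 60, 90]},
        {"label": "bandage", "status": "missing", "bbox": None},
    ],
}
proximity_feed_message = {"station_id": "station-01", "distance_mm": 3420}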
The present system 10 according to the system architecture of figures 1 and 7 is adapted to provide a cloud-based utility to identify missing and misplaced objects and to notify end-users when abnormal events occur. Processes according to the present system 10 may be divided into two categories, allowing for distribution of responsibilities/processing, with the categories including processes run locally and processes run in-cloud (at server 12). The cloud-based service, along with a dedicated user interface 36, allows for the connection of multiple local electronics stations 11 that each monitor the status of targeted objects independently and output the appearance, position and/or location of the objects to the server hub 12 via computer vision module 17 and camera 13. The GPS sensor 15 and proximity sensor 14 provide additional information to hub 12 to enhance the reliability of decision-making. Hub 12 comprises a Digital Twin representation such that communication between hub 12 and station 11 is secured. As indicated, via user interface 36, an operator/user is capable of defining targeted objects, managing all aspects of camera identification, sending notifications for identified abnormal events and controlling other aspects and functions of the present system 10 remotely via a secure data transfer.
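As a non-limiting sketch of how a station might publish such a message to the hub over AMQP, the following uses the pika client library; the broker host name, queue name and message contents are illustrative assumptions rather than the actual configuration of the present system:

import json
import pika

message = {"station_id": "station-01", "objects": [{"label": "eyewash", "status": "missing"}]}

# Connect to an assumed AMQP broker reachable from station 11 and publish one status message.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="hub.example.com"))
channel = connection.channel()
channel.queue_declare(queue="objects_feed", durable=True)
channel.basic_publish(exchange="", routing_key="objects_feed", body=json.dumps(message))
connection.close()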
The present system 10 is adapted for multiple sub-processes including in particular:
• a data acquisition process in which all necessary data/information is collected from the system’s sensors 13, 14, 15;
• a learning process that requires manual user input and the identification and marking of objects of interest that are to be monitored;
• a monitoring process adapted to track the status of the identified objects;
• a recovery process providing a monitoring function supplementary to the monitoring process;
• a status reporting process to report the status of objects of interest including appearance, location and/or position;
• an alerting process adapted for sending appropriate alerts according to the pre-set configuration (by the user/operator) applied by the learning process.
Figure 2 illustrates the high-level data transfer between the local electronic station 11 and cloud-based hub 12. A plurality of objects 47a, 47b, 47c, 47d and 47e are mounted on a suitable support structure 48 being a health and safety board located at a construction site. Objects 47a to 47e may be health and safety equipment including for example an eyewash, bandages, medicines, health and safety procedures, instructions etc. The location and/or position of the objects 47a to 47e at support 48 is monitored by camera 13. The system comprises a data acquisition process 40 feeding the learning process 41 including a location, label and status of the objects. The monitoring process 42 is fed by the learning process 41, with the monitoring process 42 configured to communicate with the recovery process 43 if any one of the objects 47a to 47e is identified as displaced, missing or removed from the pre-defined location at support 48.
The learning process 41, data acquisition 40, status monitoring 42 and recovery process 43 are implemented locally at station 11. Learning process 41 and status monitoring processes 42 are coupled for data communication with hub 12 and the associated object monitoring module 23, user interface module 24 and alert module 25. Accordingly, the system 10 is adapted to be self-learning by the learning process 41 and object monitoring module 23 so as to optimise status monitoring according to any one of changes in object size, appearance, position, location, illumination intensity level etc. A user interface framework 46 allows a user via a user interface 36 to tailor the system settings and the way in which objects 47a to 47e are monitored and object status is alerted 44, 45 via server 12.
Figure 3 illustrates aspects of the user interface framework 46 implemented with the user interface module 24 and user interface platform 36. Frame 50 provides a means to configure system 10 for the monitoring of multiple stations 11, each having multiple objects 47a to 47e. Accordingly, a user is enabled to identify objects 47a to 47e at each separate and remote location and to specifically configure the system 10 as required. According to frame 51, each separate station (electronics board) 11 is capable of being set up in which the objects are individually identified and marked. Referring to figure 5, frame 77 of user interface 36 (implemented on hub 12) provides the learning process 41 that allows the operator to configure the necessary parameters. This includes the marking of objects of interest. In particular, using 2D image data obtained by camera 13 at station 11, object 47a is marked by drawing a rectangle 63 around it. Further objects of interest are identified and marked via separate identification rectangles 64. Object labels 65 may then be added, and actions and alerts configured 66. Further object notification, status and information labelling may be configured via tile 67, including configuration of alert messages (e.g. ‘eyewash is missing, please investigate’), an alert type (i.e. repeated) and an alert interval (the delay between object anomaly detection and alert issue).
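The configuration captured by the learning process for a single marked object may, purely by way of example, be recorded in a structure such as the following; all field names and values are illustrative assumptions rather than the schema used by the present system:

# Illustrative record for one marked object; every field name below is an assumption.
object_config = {
    "label": "eyewash",                   # label 65 entered by the operator
    "bbox": [412, 138, 96, 210],          # rectangle 63: x, y, width, height in pixels
    "alert": {
        "message": "eyewash is missing, please investigate",
        "type": "repeated",               # alert type configured via tile 67
        "interval_s": 600,                # delay between anomaly detection and alert issue
    },
}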
A data transfer pathway map is illustrated in figure 4. Via the RPi computer board 39 implementing camera 13, proximity sensor 14 and GPS sensor 15, signal data is generated 52. Such signal data 52 is divided into image data 53 (generated by camera 13); GPS positional coordinate data 54 (generated by GPS sensor 15) and proximity/distance data 55 (generated by proximity sensor 14). The various data sets 53, 54, 55 are relayed to the implementing modules of system 10 to feed the various processes including the learning process 41, the object status/reporting process 57, the object monitoring/tracking process 42 and the recovery process 43. The present system 10 is adapted to provide: a communication pathway 59 between GPS coordinates data 54 and the status process 57; a data transfer pathway 60 between the proximity distance data 55 and the recovery process 43; a data pathway 62 between image data 53 and tracking/monitoring process 42; and a data pathway 61 between the status and recovery processes 57, 43. In particular, the system 10 collects all data from the various sensors 13, 14, 15 and feeds them into the hub 12 for processing. As indicated, local processing may occur initially prior to issuing to hub 12. Accordingly, the image data 53 acquired by camera 13 is used for tracking, recovery and status reporting; the GPS coordinates data 54 is sent directly to the status reporting process 57 and the proximity distance data 55 is collected through proximity sensor 14 and is used to improve the efficiency and accuracy of both the decision-making process and the recovery process 43.
Referring to figure 5 and as indicated, the learning process 41 enables an operator to configure the various operative parameters via web-based user interface 36. This learning process 41, via example frames 50, 51, 77 and the associated fields, menu selections, labels and tiles 65, 66, 67, includes functions to mark objects of interest by annotating/editing images generated by camera 13 at local station 11, i.e. by drawing a rectangle 63, 64 around such objects on the images captured by camera 13. An identified targeted image may then be labelled via label 65. The customisation of alerts including type, delivery destination, frequency etc. may then be configured via tiles 66, 67, for example enabling a user to configure multiple or single alerts to be sent depending upon how long the object of interest is missing or displaced from a pre-defined location at board 48.
Referring to figures 1, 2, 4 and 7, the object monitoring process 42 using object monitoring module 23 provides that data is returned from the learning process 41 to the local electronic station 11 and is used to initialise the monitoring process 42. This functionality is implemented using a computer vision-based tracking method, tracker process module 16 and computer vision module 17. In particular, the vision-based tracking method utilises the Kernelized Correlation Filters (KCF) algorithm available from OpenCV libraries.
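A minimal sketch of such KCF-based tracking using the OpenCV Python bindings is set out below; the capture source and bounding box values are illustrative assumptions and not the configuration of the present system:

import cv2

capture = cv2.VideoCapture(0)                 # camera 13; the device index is an assumption
ok, frame = capture.read()

bbox = (412, 138, 96, 210)                    # rectangle drawn around the object of interest
tracker = cv2.TrackerKCF_create()             # cv2.legacy.TrackerKCF_create in some OpenCV builds
tracker.init(frame, bbox)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)       # found is False once the object leaves the scene
    if not found:
        print("object lost - candidate for the recovery process 43")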
The recovery process 43 utilises the oriented FAST and rotated BRIEF (ORB) algorithm to locate an object and report a new position when displaced from the original pre-defined location. Accordingly, the supplementary recovery process 43 allows for reinitialisation of a moved or lost object and continued monitoring. The recovery process is triggered whenever an object of interest (as determined by a user) is missing for longer than a pre-defined time period. For example, an object may have been removed from board 48 for a time period and may be placed back on the board 48 at a new location. Accordingly, the recovery process (using image data generated by camera 13) provides reinitialisation of the object's pre-defined location and continued monitoring.
The status process 57 involves the sending of all appropriate data from station 11 to server 12. This data contains current image and status data for all tracked objects 47a to 47e, with the status being updated whenever the system detects anomalies associated with each individually tracked object. The alerting process via alert module 25 is a server-based configuration of the system 10 responsible for sending appropriate warnings, notifications and alerts in direct response to a change in position, location and/or appearance of an object at the pre-defined location. Such a location could be the original position of mounting of an object at board 48 or a new position (for example if an object has been removed from board 48, used and then returned but mounted in a different position). If an object of interest is reported lost by the status reporting module for long enough, the alerting process will take appropriate action according to its configuration (as provided and set by an operator during the learning process) to issue a notification or alert.
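Purely as an illustrative sketch, the alerting decision for one tracked object could be expressed as follows; the scheduling of ‘repeated’ alerts and all parameter names are assumptions, the specification only stating that alerts are issued according to the operator's configuration:

import time

def maybe_alert(lost_since, last_alert, settings, send_alert):
    # lost_since: time the object was first reported lost (None while present)
    # settings: per-object alert configuration captured during the learning process
    now = time.time()
    if lost_since is None:
        return None                                   # object present, no alert pending
    if now - lost_since < settings["interval_s"]:
        return last_alert                             # not missing for long enough yet
    if last_alert is None or (settings["type"] == "repeated"
                              and now - last_alert >= settings["interval_s"]):
        send_alert(settings["message"])               # e.g. via email or SMS 37
        return now
    return last_alert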
The present system 10 is configured for initial automatic detection of board 48 using an analysis utility in data communication with station 11 and in particular the images generated by camera 13. When system 10 is booted and the camera 13 is initialized, a first frame is captured and analysed by the analysis utility to detect board 48. The image is then converted from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
The HSV representation allows for much easier colour separation and detection. The next step is to find the largest green object in the frame after binarising the result, extracting the board position. This step involves creating a mask of the safety board which can be used to crop future images to contain only a pre-defined area of board 48. This is advantageous to reduce computational demand by eliminating from the processing area regions that do not contain objects of interest. If the system 10 is unable to find a large green object, the default setting is to ignore the masking procedure.
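An illustrative sketch of this board detection step using OpenCV is given below; the green HSV bounds are assumed values that would in practice be tuned for the board in question:

import cv2
import numpy as np

frame = cv2.imread("first_frame.jpg")                 # first frame captured at boot (assumed file)
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)          # convert to HSV for easier colour separation

# The green range below is an assumed threshold, not a value taken from the specification.
mask = cv2.inRange(hsv, np.array([40, 60, 60]), np.array([90, 255, 255]))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    board = max(contours, key=cv2.contourArea)        # largest green object taken as board 48
    board_mask = np.zeros(mask.shape, dtype=np.uint8)
    cv2.drawContours(board_mask, [board], -1, 255, thickness=-1)
    cropped = cv2.bitwise_and(frame, frame, mask=board_mask)
else:
    cropped = frame                                   # no large green object: skip the masking step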
The object tracking process will now be described referring to figure 6. As indicated, tracking process 42 utilises KCF to monitor objects of interest by ‘trackers’ using tracking-by-detection methods. An object 47a is marked by a user via the earlier learning process 41 (involving user interface module 24), object monitoring module 23 and selected components of the station (electronics board) 11. The identified objects are fed into the KCF and used as positive samples to train a ‘classifier’. Multiple samples from the rest of the image are fed as negative samples (background). By calculating the correlations using pixels in the direct vicinity of the tracked object (in the next frame) the present system 10 (via the object tracking process 42) can identify a displacement direction of object 47a. A score assigned to each location into which an object is moved helps determine the movement direction of the tracked object. The frame with movement is used as a positive sample to further train the classifier. KCF is advantageous due to its characteristic of being ‘strict’ when processing object movement. Tracking process 42 is utilised to report whether an object of interest is in a scene or not. Then, ‘containers’ are built containing a list of trackers that track each object 47a to 47e individually, with this being created from data received from the cloud server 12 after initialisation and running of the learning process 41 described with reference to figure 2. The next stage is for each tracker to update its status on the provided frame. If the tracker is unable to find an object at a pre-defined location during a pre-defined time period, the process triggers the recovery process 43 to locate the object at board 48 (for example with the object being returned by personnel to a different location at board 48). Referring to figure 6, the tracking process involves initial identification of objects of interest at stage 70 and the creation of a container having multiple trackers at stage 71. This data is generated based on the 2D image data collected by camera 13 involving data acquisition step 40. The tracking process is implemented in direct response to the image data acquisition at stage 72 to then trigger 73 the recovery process 43, as appropriate.
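By way of example only, the container of per-object trackers and the hand-over to the recovery process might be sketched as follows; the class structure, method names and the 30-second timeout are assumptions for illustration, not the patented implementation:

import time
import cv2

class TrackerContainer:
    # A sketch of the 'container' of per-object trackers described above.
    def __init__(self, frame, object_configs, lost_timeout_s=30):
        self.lost_timeout_s = lost_timeout_s
        self.trackers = {}
        self.lost_since = {}
        for cfg in object_configs:                    # one KCF tracker per marked object
            tracker = cv2.TrackerKCF_create()
            tracker.init(frame, tuple(cfg["bbox"]))
            self.trackers[cfg["label"]] = tracker
            self.lost_since[cfg["label"]] = None

    def update(self, frame):
        needs_recovery = []
        for label, tracker in self.trackers.items():
            found, _ = tracker.update(frame)
            if found:
                self.lost_since[label] = None
            elif self.lost_since[label] is None:
                self.lost_since[label] = time.time()
            elif time.time() - self.lost_since[label] > self.lost_timeout_s:
                needs_recovery.append(label)          # trigger 73: hand over to recovery process 43
        return needs_recovery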
The recovery process will now be described referring to figures 1, 2 and 4. As indicated, the recovery process 43 may be considered supplementary to the primary monitoring process 42 utilising object monitoring module 23. As indicated, the recovery process 43 is triggered by the monitoring process 42 once an object has been categorised as ‘misplaced’ or ‘lost’ beyond a pre-defined time period. After execution of the learning process 41, data containing the positions of objects of interest is used to initiate the process of saving a template image for every object individually. Every template image contains features unique to a particular object. Finding the lost object is a matter of finding its features in the query image (the current frame seen by the system). The recovery process is based on ORB, which consists of the Oriented FAST and Rotated BRIEF algorithms. The first step is to detect features on the image (so-called keypoints). These can be any changes in pixel intensity when changing directions, the strongest being edges and corners. To perform this task the Features Accelerated Segment Test (FAST) algorithm is used. This applies the Harris Corner Measure to find the best points on the provided image. It finds the difference in intensity for a given displacement, in all directions. It also uses a pyramid technique to construct multi-scale features. This ensures the recovery process is compatible with scaling issues (the object of interest might be closer or further in the scene). Finally, it computes the intensity weighted centroid of the patch with the located corner at the centre. The Binary Robust Independent Elementary Features (BRIEF) descriptor is then used to describe the keypoints from the template and scene images. It describes them using a feature vector and is steered by the ORB algorithm according to orientation. Finally, the Fast Library for Approximate Nearest Neighbours (FLANN) algorithm is used to compare feature vectors from the query image and template. FLANN is a collection of algorithms optimized for fast nearest neighbour search in the dataset. It is a more efficient approach compared to Brute Force methods.
The recovery process also makes sure that only unique features are being used. Every feature from the template images is compared to at least two closest features on the query image. If the correlations between a template keypoint and the two closest features from the query image are too similar, then that particular feature is disregarded as not unique enough.
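An illustrative sketch of this ORB/FLANN matching with the uniqueness check, using the OpenCV Python bindings, is given below; the template/query file names, the LSH index parameters and the 0.7 ratio are assumptions rather than values prescribed by the present system:

import cv2

orb = cv2.ORB_create(nfeatures=500)

template = cv2.imread("template_eyewash.png", cv2.IMREAD_GRAYSCALE)   # saved per-object template
query = cv2.imread("current_frame.png", cv2.IMREAD_GRAYSCALE)         # current frame seen by the system

kp_t, des_t = orb.detectAndCompute(template, None)    # Oriented FAST keypoints + BRIEF descriptors
kp_q, des_q = orb.detectAndCompute(query, None)

# FLANN with LSH indexing is the usual pairing for binary ORB descriptors.
index_params = dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1)
flann = cv2.FlannBasedMatcher(index_params, dict(checks=50))
matches = flann.knnMatch(des_t, des_q, k=2)           # two closest query features per template keypoint

good = []
for pair in matches:
    if len(pair) == 2:
        best, second = pair
        if best.distance < 0.7 * second.distance:     # discard features whose two closest matches are too similar
            good.append(best)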
As indicated, proximity sensor 14 is utilised to generate object distance data corresponding to a distance measured between sensor 14 (at station 11) and an object 47a to 47e at board 48. This distance data is utilised as part of the status process 57 and recovery process 43 to identify if an object has moved from the initial pre-defined location at board 48 or is obstructed from detection by the proximity sensor 14, for example by a vehicle or person obstructing the direct ‘line of sight’ path from sensor 14. The proximity sensor distance data is utilised in two different ways. Firstly, it boosts decision-making performance by introducing another measure in the form of fluctuations of the distance signal. For instance, if the majority of tracked objects 47a to 47e are lost, but the distance from the system to the board 48 has changed dramatically, the system can deduce that the objects of interest are occluded (for example by a large vehicle parked in front of the board 48). The second implementation is quite similar to the recovery system. After the learning procedure, a series of distance signals are stored. Next, these values are processed and a threshold is calculated, representing a reference point for the system. Before issuing a recovery procedure for a lost object, the current distance value is checked and compared to the reference value. If it exceeds the reference threshold, the system does not trigger the recovery procedure due to the high probability of the scene being obstructed. This not only helps avoid running a computationally expensive algorithm whenever there are signs it would not be able to perform its task, but also reduces the number of false positives.
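Purely for illustration, the reference threshold calculation and the gating of the recovery procedure could be sketched as follows; the use of the mean plus a 10% margin is an assumption (the description only states that a threshold is calculated from the stored distance signals), and the comparison follows the wording above whereby recovery is only triggered while the current reading remains below the threshold:

import statistics

def reference_threshold(logged_distances_mm, margin=0.1):
    # Assumed calculation: mean of the distances stored after the learning procedure plus a 10% margin.
    return statistics.mean(logged_distances_mm) * (1.0 + margin)

def allow_recovery(current_distance_mm, threshold_mm):
    # Following the description: a reading exceeding the reference threshold suggests the scene is
    # obstructed, so the computationally expensive recovery procedure is not triggered.
    return current_distance_mm < threshold_mm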
The present system 10 further comprises energy saving modules, utilities and functions via an automatic start-up module connected to a power supply. In particular, the system 10 automatically starts after connecting to a suitable power supply. No further steps are required by an operator. Station 11 detects and connects to server 12 automatically. Once powered, if the illumination environment at board 48 is insufficient, system 10 is configured to go into a ‘sleep mode’ automatically by calculating the average intensity of the images obtained by camera 13. Once the average intensity exceeds a pre-set threshold, system 10 will awake automatically and continue execution of tracking process 42. This functionality allows system 10 to work continuously and without manual start-up.
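An illustrative sketch of this sleep/wake behaviour, assuming OpenCV and NumPy, is given below; the intensity threshold and polling interval are assumed values rather than the pre-set values of the present system:

import time
import cv2
import numpy as np

INTENSITY_THRESHOLD = 40          # assumed pre-set threshold on a 0-255 greyscale

def average_intensity(frame):
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(np.mean(grey))

def wait_for_sufficient_light(capture, poll_interval_s=60):
    # Stay in 'sleep mode' while the scene at board 48 is too dark to monitor reliably.
    while True:
        ok, frame = capture.read()
        if ok and average_intensity(frame) >= INTENSITY_THRESHOLD:
            return                # wake and continue execution of tracking process 42
        time.sleep(poll_interval_s)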
According to the specific implementation, the main computation unit is a Raspberry Pi 3B+ working under the Raspbian system. Camera 13 is preferably a Pi Camera V2 to capture images and is advantageous for efficient performance and low cost. The GPS sensor may comprise an Adafruit Ultimate GPS sensor model providing high accuracy and stability. Proximity sensor 14 may be implemented as a weatherproof JSN-SR04T-2.0 proximity sensor. Such a sensor may be used in both wet and dry working conditions. Preferably, internet connection is reliably provided by built-in Wi-Fi or a suitable 4G modem (as will be appreciated, other modems are suitable). Both voltage conversion and IP-65 rated enclosures may be customised by appropriate providers to meet set standards and requirements, with one specific implementation of the hardware components at station 11 detailed in figure 7, with reference to figure 1.
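By way of example only, a distance reading from an ultrasonic proximity sensor of this type might be obtained on the Raspberry Pi as sketched below; the GPIO pin numbers are assumptions and the timing loop is the conventional trigger/echo pattern rather than the specific driver used by the present system:

import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24                      # assumed BCM pin numbers for the JSN-SR04T

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    GPIO.output(TRIG, True)              # 10 microsecond trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    while GPIO.input(ECHO) == 0:         # wait for the echo pulse to start
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:         # measure the echo pulse width
        pulse_end = time.time()
    return (pulse_end - pulse_start) * 34300 / 2   # speed of sound ~343 m/s, out and back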
As will be appreciated, the present system 10, comprising local electronic station 11 and server 12, utilises various electronic components, architectures and functionality including at least one processor; at least one data storage; at least one reference data library; appropriate network communication modules including transceivers, transmitters and/or receivers; and electronics boards such as printed circuit boards and/or programmable logic devices. Such components may be provided specifically at local station 11 and/or implemented within a cloud server (hub 12). Data reference libraries may be utilised by the various modules including the object monitoring module 23, user interface module 24, alert module 25 and/or tracking process module 16. Such data libraries may be acquired from third parties and/or built locally as required. Such local building may be a product of the data generated by the sensors including the camera 13, proximity sensor 14 and/or GPS sensor 15.

Claims

Claims
1. Apparatus to monitor and report a location and/or position status of at least one object at and/or proximate to a pre-defmed location comprising: a local electronic station positionable opposed to the object to be monitored at the pre-defmed location, the station having a camera device to generate image data of the object at and/or proximate to the pre-defmed location; a user interface module to enable a user to manage at least one of generation, processing and output of data based on the image data generated by the camera device; an object monitoring module to process the image data generated by the camera device; and an alerts module to generate and/or issue an alert notification of a location and/or position of the object relative to the pre-defmed location based on the image data processed by the object monitoring module; the user interface module, object monitoring module and/or alerts module provided at the station and/or at least one server located remote from the station.
2. The apparatus as claimed in claim 1 further comprising a learning module that utilises the user interface module, the learning module having an identification utility and/or a marking utility to allow a user to identify and/or mark the at least one object at the pre-defmed location within the image data.
3. The apparatus as claimed in claim 2 wherein the learning module further comprises an alert configure utility to allow a user to configure a type, frequency, format and/or delivery destination of an alert notification of a location and/or position status of the object.
4. The apparatus as claimed in any preceding claim wherein the object monitoring module is provided with an object tracking utility to process the image data generated by the camera device and to monitor and/or track a location and/or a position of the object relative to the pre-defmed location.
5. The apparatus as claimed in claim 4 wherein the object tracking utility comprises a Kernelized Correlation Filters algorithm to monitor and/or track a location and/or a position of the object relative to the pre-defmed location.
6. The apparatus as claimed in claim 5 wherein the object tracking utility is operable with a first data set comprising at least one image of the object at the pre-defmed location and at least a second data set comprising at least one image of the object moved from the pre-defmed location to determine a displacement direction of the object from the pre- defmed location.
7. The apparatus as claimed in claim 6 wherein the object tracking utility further comprises containers containing a plurality of trackers associated with a location and/or a position of the object, each tracker configured to identify the object within an image generated by the camera device.
8. The apparatus as claimed in claim 7 wherein the apparatus further comprises a pixel-based feature detection utility, the object tracking utility operable with the pixel-based feature detection utility to determine features and/or edges of an image based on the image data and configured to locate and/or confirm a position of the object relative to the pre-defined location.
9. The apparatus as claimed in any preceding claim further comprising a recovery module provided with or operable with a pixel-based feature detection utility to determine features and/or edges of an image based on the image data and configured to locate and/or confirm a position of the object relative to the pre-defmed location.
10. The apparatus as claimed in claim 9 wherein the recovery module further comprises edge and/or corner measurement and/or detection utilities to identify a difference in intensity of pixels of at least a first image of the object at a first location and/or position and at least a second image of the object moved from the first location and/or position to a second location and/or position.
11. The apparatus as claimed in claim 10 wherein the pixel -based feature detection utility comprises an Orientated Features Accelerated Segment Test (FAST) algorithm and/or a rotated Binary Robust Independent Elementary Features (BRIEF) algorithm.
12. The apparatus as claimed in any one of claims 9 to 11 wherein the recovery module further comprises a Fast Library for Approximate Nearest Neighbours (FLANN) algorithm to compare feature vectors from an image of the object generated by the camera device with a library/template image of the object at the pre-defmed location based on image data generated by the camera device.
13. The apparatus as claimed in any one of claims 4 to 8 in combination with any one of claims 9 to 12 wherein the object tracking utility is provided in data communication with the recovery module to locate a position and/or location of the object relative to the pre-defmed location.
14. The apparatus as claimed in any preceding claim wherein the local electronic station further comprises:
• a proximity sensor to generate object distance data based on a distance measured between the sensor and the object; and/or
• a GPS sensor to generate GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location.
15. The apparatus as claimed in claim 14 further comprising a data acquisition module to collect data from any one or a combination of the camera device, the proximity sensor and the GPS sensor.
16. The apparatus as claimed in claim 14 or 15 further comprising an object distance utility to process the object distance data and compare measured distances to identify if the object has moved from the pre-defmed location and/or is obstructed from detection by the proximity sensor.
17. The apparatus as claimed in claim 16 wherein the object distance utility is further configured to process the object distance data and calculate a reference threshold distance value.
18. The apparatus as claimed in claim 17 when dependent on any one of claims 9 to 12 wherein the object distance utility is provided in data communication with the recovery module and operative to prompt triggering of the recovery module if the object distance is measured by the proximity sensor and is less than the reference threshold distance value.
19. The apparatus as claimed in any preceding claim further comprising an analysis utility in data communication with the camera device to convert an image of the object and/or support structure on which the object is mounted at the pre-defmed location from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
20. The apparatus as claimed in any preceding claim further comprising an automatic start-up module connected to a power supply and configured to monitor an illumination intensity at or proximate to the object and to prompt the apparatus to enter a sleep mode if the illumination intensity is below a pre-set illumination intensity threshold.
21. The apparatus as claimed in any preceding claim wherein the modules and utilities of the preceding claims are located at the station and/or the at least one server.
22. The apparatus as claimed in claim 21 further comprising at least one or a combination of the following electronic components:
• at least one processor
• at least one data storage
• at least one reference data library
• at least one network communication module, including a transceiver, transmitter and/or receiver
• an electronics board, a printed circuit board and/or a programmable logic device; wherein said electronic components are provided at the local electronic station and/or the server.
23. A method of monitoring and reporting a location and/or position status of at least one object at and/or proximate to a pre-defmed location comprising: generating image data of the object using a camera device provided at a local electronic station positioned or positionable opposed to the object to be monitored at the pre-defmed location; managing at least one of generation, processing and output of data associated with the location and/or position of the object based on the image data generated by the camera device using a user interface module; processing the image data generated by the camera device using an object monitoring module to monitor the location and/or position of the object at and/or proximate to the pre-defmed location; and generating and/or issuing an alert notification based on the location and/or position of the object relative to the pre-defmed location using an alerts module based on the data processed by the object monitoring module.
24. The method as claimed in claim 23 wherein the user interface module, object monitoring module and/or alerts module are provided at the station and/or at least one server located remote from the station.
25. The method as claimed in claim 23 or 24 further comprising identifying and/or marking the object within the image data generated by the camera device using an identification utility and/or a marking utility being part of a learning module that forms part of or is operable with the user interface module.
26. The method as claimed in claim 25 further comprising enabling a user to configure a type, frequency, format and/or delivery destination of an alert notification of a location and/or position status of the object via an alert configure utility.
27. The method as claimed in any one of claims 23 to 26 comprising processing the image data generated by the camera device to monitor and/or track a location and/or a position of the object relative to the pre-defined location using an object tracking utility.
28. The method as claimed in claim 27 wherein the object tracking utility comprises a Kernelized Correlation Filters algorithm to monitor and/or track a location and/or a position of the object relative to the pre-defmed location.
29. The method as claimed in claim 28 wherein the object tracking utility is operable with a first data set comprising at least one image of the object at the pre-defmed location and at least a second data set comprising at least one image of the object moved from the pre-defmed location to determine a displacement direction of the object from the pre- defmed location.
30. The method as claimed in claim 29 wherein the object tracking utility further comprises containers containing a plurality of trackers associated with a location and/or a position of the object, each tracker configured to identify the object within an image generated by the camera device.
31. The method as claimed in claim 30 further comprising determining features and/or edges of an image based on the image data using a pixel-based feature detection utility to locate and/or confirm a location and/or position of the object relative to the pre-defined location.
32. The method as claimed in claim 31 further comprising identifying a difference in intensity of pixels of at least a first image of the object at a first location and/or position and at least a second image of the object moved from the first location and/or position to a second location and/or position using edge and/or corner measurement and/or detection utilities.
33. The method as claimed in claim 32 wherein the pixel-based feature detection utility comprises an Orientated Features Accelerated Segment Test (FAST) algorithm and/or a rotated Binary Robust Independent Elementary Features (BRIEF) algorithm.
34. The method as claimed in claim 32 or 33 wherein the step of locating and/or confirming a position of the object relative to the pre-defmed location further comprises comparing feature vectors from an image of the object generated by the camera device with a library/template image of the object at the pre-defmed location based on image data generated by the camera device using a Fast Library for Approximate Nearest Neighbours (FLANN) algorithm.
35. The method as claimed in one of claims 23 to 34 further comprising:
• generating object distance data based on a distance measured between the object and a proximity sensor provided at the local electronic station; and/or
• generating GPS coordinate-based positional data of the object at and/or proximate to the pre-defined location using a GPS sensor provided at the local electronic station.
36. The method as claimed in claim 35 wherein a data acquisition module collects data from any one or a combination of the camera device, the proximity sensor and the GPS sensor.
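As an illustrative assumption of how a data acquisition module of claim 36 might gather readings from the three sources (the read functions are hypothetical placeholders for real device drivers):

```python
import time

def acquire_sample(read_camera_frame, read_proximity_distance, read_gps_position):
    """Collect one time-stamped sample from the camera device, proximity sensor and GPS sensor.
    Each read_* argument is a hypothetical callable standing in for a real driver."""
    return {
        "timestamp": time.time(),
        "frame": read_camera_frame(),              # image data from the camera device
        "distance": read_proximity_distance(),     # object distance data, e.g. in metres
        "gps": read_gps_position(),                # (latitude, longitude) tuple
    }
```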
37. The method as claimed in claim 35 or 36 wherein an object distance utility processes the object distance data and compares measured distances to identify if the object has moved from the pre-defined location or is obstructed from detection by the proximity sensor.
38. The method as claimed in claim 37 wherein the object distance utility processes the object distance data and calculates a reference threshold distance value.
39. The method as claimed in claim 38 wherein the object distance utility is operative to prompt triggering of a recovery module to locate and/or confirm a position of the object relative to the pre-defined location if the object distance is detected to be less than the reference threshold distance value.
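A minimal, purely illustrative sketch of the threshold logic of claims 37 to 39; read_proximity_sensor() and trigger_recovery() are hypothetical stand-ins for the actual sensor read-out and recovery module:

```python
def check_object_distance(read_proximity_sensor, reference_threshold, trigger_recovery):
    """Compare the latest measured object distance with the reference threshold and
    prompt the recovery module when the reading falls below that threshold."""
    distance = read_proximity_sensor()   # hypothetical sensor read-out (same units as threshold)
    if distance < reference_threshold:
        # A short reading may mean the object has moved or the sensor is obstructed,
        # so the recovery module is asked to locate/confirm the object's position.
        trigger_recovery()
    return distance
```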
40. The method as claimed in any one of claims 23 to 39 further comprising converting an image of the object and/or a support structure on which the object is mounted at the pre-defined location from a red-green-blue (RGB) image to a hue-saturation-value (HSV) image.
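For illustration only (assuming OpenCV, which by default loads images in BGR channel order), the colour-space conversion of claim 40 can be expressed as a single call:

```python
import cv2

def to_hsv(image_bgr):
    """Convert a camera frame to hue-saturation-value; use cv2.COLOR_RGB2HSV instead
    if the source image really is in RGB channel order."""
    return cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
```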
41. The method as claimed in any one of claims 23 to 40 further comprising monitoring an illumination intensity at the object at and/or proximate to the pre-defined location and prompting a system implementing the method to enter a sleep mode if the illumination intensity is below a pre-set illumination intensity threshold.
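One possible (assumed, not claimed) realisation of the illumination check of claim 41 is to treat the mean of the value (V) channel of an HSV frame as the illumination intensity and compare it with a pre-set threshold:

```python
import cv2

def should_sleep(frame_bgr, intensity_threshold=40):
    """Return True when the scene is too dark to monitor reliably (0-255 scale)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mean_intensity = hsv[:, :, 2].mean()   # mean of the value (brightness) channel
    return mean_intensity < intensity_threshold
```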
42. The method as claimed in any one of claims 23 to 41 further comprising at least one or a combination of the following:
• processing the data using at least one processor
• storing the data in at least one data storage
• retrieving data from at least one reference data library
• transmitting and/or receiving data via at least one network communication module.
PCT/GB2021/050299 2020-02-13 2021-02-10 Object location status monitoring apparatus and method WO2021161008A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2001968.3A GB2592035A (en) 2020-02-13 2020-02-13 Object location status monitoring apparatus and method
GB2001968.3 2020-02-13

Publications (1)

Publication Number Publication Date
WO2021161008A1 true WO2021161008A1 (en) 2021-08-19

Family

ID=69956456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2021/050299 WO2021161008A1 (en) 2020-02-13 2021-02-10 Object location status monitoring apparatus and method

Country Status (2)

Country Link
GB (1) GB2592035A (en)
WO (1) WO2021161008A1 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2932465B1 (en) * 2012-12-17 2018-02-07 Brainlab AG Removing image distortions based on movement of an imaging device
US9672710B2 (en) * 2015-02-26 2017-06-06 International Business Machines Corporation Item movement tracking with three-dimensional (3D) proximity exclusions
CN106412501B (en) * 2016-09-20 2019-07-23 华中科技大学 A kind of the construction safety behavior intelligent monitor system and its monitoring method of video
SG11201907834UA (en) * 2017-03-31 2019-09-27 Nec Corp Video image processing device, video image analysis system, method, and program
CN108093228A (en) * 2018-02-08 2018-05-29 飞巡(上海)航空科技发展有限公司 A kind of construction site recruitment monitoring service system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6970183B1 (en) * 2000-06-14 2005-11-29 E-Watch, Inc. Multimedia surveillance and monitoring system including network configuration
US20060203101A1 (en) * 2005-03-14 2006-09-14 Silsby Christopher D Motion detecting camera system
US20070217780A1 (en) * 2006-03-17 2007-09-20 Shinichiro Hirooka Object detection apparatus
US20190188492A1 (en) * 2016-08-19 2019-06-20 Osram Gmbh Detection of the presence of static objects
US20180220104A1 (en) * 2017-01-30 2018-08-02 David R. Hall Apparatus for Protecting a Delivered Parcel

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115988413A (en) * 2022-12-21 2023-04-18 北京工业职业技术学院 Train operation supervision platform based on sensing network
CN115988413B (en) * 2022-12-21 2024-05-07 北京工业职业技术学院 Train operation supervision platform based on sensor network

Also Published As

Publication number Publication date
GB2592035A (en) 2021-08-18
GB202001968D0 (en) 2020-04-01

Similar Documents

Publication Publication Date Title
US10627829B2 (en) Location-based control method and apparatus, movable machine and robot
US10078812B2 (en) Data center infrastructure management system having real time enhanced reality tablet
US20220077820A1 (en) Method and system for soar photovoltaic power station monitoring
CN111710055A (en) Portable power inspection equipment, power inspection method and power inspection system
CN103856762A (en) Multi-camera intelligent selection and video priority judgment system and selection method
CN110118973A (en) Warehouse Intellisense recognition methods, device and electronic equipment
CN105336074A (en) Alarm method and device
US11528452B2 (en) Indoor positioning system using beacons and video analytics
CA2851950A1 (en) System for managing locations of items
CN111274934A (en) Implementation method and system for intelligently monitoring forklift operation track in warehousing management
Liang et al. Image-based positioning of mobile devices in indoor environments
CN104981820A (en) Method, system and processor for instantly recognizing and positioning object
CN115649501B (en) Unmanned aerial vehicle night lighting system and method
CN111770450B (en) Workshop production monitoring server, mobile terminal and application
CN108257244B (en) Power inspection method, device, storage medium and computer equipment
WO2021161008A1 (en) Object location status monitoring apparatus and method
CN113965733A (en) Binocular video monitoring method, system, computer equipment and storage medium
CN109168173A (en) Base station operation management method, apparatus and electronic equipment
CN107610260B (en) Intelligent attendance system and attendance method based on machine vision
EP3929804A1 (en) Method and device for identifying face, computer program, and computer-readable storage medium
CN110633639B (en) Receiving and transporting supervision method and receiving and transporting supervision system
CN113516122A (en) Robot vision system and method for intelligent energy conservation operation of power distribution room
CN112766138A (en) Positioning method, device and equipment based on image recognition and storage medium
CN112433260B (en) Security check imaging system
CN115797894A (en) Data processing method, data processing device, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21706666

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21706666

Country of ref document: EP

Kind code of ref document: A1