WO2021015672A1 - Surveillance system, object tracking system and associated operating method - Google Patents


Info

Publication number
WO2021015672A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking system
object tracking
targeted
alert
image capture
Prior art date
Application number
PCT/SG2020/050418
Other languages
English (en)
Inventor
Bondan Setiawan
Yoriko Kazama
Original Assignee
Hitachi, Ltd.
Application filed by Hitachi, Ltd. filed Critical Hitachi, Ltd.
Publication of WO2021015672A1


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00: Burglar, theft or intruder alarms
    • G08B 13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189: Actuation using passive radiation detection systems
    • G08B 13/194: Actuation using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196: Actuation using image scanning and comparing systems using television cameras
    • G08B 13/19602: Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19608: Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and/or velocity to predict its new position
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00: Burglar, theft or intruder alarms
    • G08B 13/22: Electrical actuation
    • G08B 13/24: Electrical actuation by interference with electromagnetic field distribution
    • G08B 13/2402: Electronic Article Surveillance [EAS], i.e. systems using tags for detecting removal of a tagged item from a secure area, e.g. tags for detecting shoplifting
    • G08B 13/2405: EAS characterised by the tag technology used
    • G08B 13/2414: EAS characterised by the tag technology used, using inductive tags
    • G08B 13/2417: EAS characterised by the tag technology used, using inductive tags having a radio frequency identification chip

Definitions

  • Various aspects of this disclosure relate to a surveillance system. Various aspects of this disclosure relate to a method of operating a surveillance system. Various aspects of this disclosure relate to a method of forming a surveillance system. Various aspects of this disclosure relate to an object tracking system. Various aspects of this disclosure relate to a method of forming and/or operating the object tracking system.
  • Various embodiments may provide a surveillance system.
  • the surveillance system may include one or more image capture devices.
  • the surveillance system may also include an object tracking system connecting the one or more image capture devices through a network.
  • the object tracking system may be configured to provide an alert upon the object tracking system determining that the object tracking system is unable to identify a targeted object in one or more images captured by the one or more image capture devices when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • the object tracking system may include a processor configured to identify a targeted object in one or more images.
  • the processor may be configured to provide an alert upon the processor determining that the processor is unable to identify a targeted object in the one or more images when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • Various embodiments may relate to a method of operating a surveillance system.
  • the method may include capturing one or more images using one or more image capture devices of the surveillance system via a network of the surveillance system.
  • the object tracking system may be configured to provide an alert upon the object tracking system determining that the object tracking system is unable to identify a targeted object in the one or more images captured by the one or more image capture devices when a time period reaches a predefined duration and upon a re-identification confidence level dropping below a predetermined threshold.
  • FIG. 1 is a system architecture diagram of a surveillance system according to various embodiments.
  • FIG. 2A is a conceptual diagram illustrating identification of the targeted object according to various embodiments.
  • FIG. 2B is another conceptual diagram illustrating identification of the targeted object according to various embodiments.
  • FIG. 2C is another conceptual diagram illustrating a scenario in which an image capture device 111 is unable to capture an image showing the bag according to various embodiments.
  • FIG. 3 is a schematic showing the object tracking system according to various embodiments.
  • FIG. 4 is a conceptual flow chart illustrating the operation of the object tracking system according to various embodiments.
  • FIG. 5 shows an example of an object table according to various embodiments.
  • FIG. 6 shows an example of a tracked object table according to various embodiments.
  • FIG. 7 is a conceptual flow chart illustrating triggering of an alert upon the object tracking system determining that the object tracking system is unable to identify the targeted object according to various embodiments.
  • FIG. 8 shows an exemplary screen of a Graphical User Interface (GUI) of the surveillance system 100 when the object tracking system is able to identify the targeted object according to various embodiments.
  • FIG. 9 shows another exemplary screen of the Graphical User Interface (GUI) of the surveillance system when the object tracking system is unable to identify the targeted object according to various embodiments.
  • FIG. 10 shows yet another exemplary screen of the Graphical User Interface (GUI) of the surveillance system when multiple objects similar to the targeted object have been identified according to various embodiments.
  • FIG. 11 is a system architecture diagram of a surveillance system according to various embodiments.
  • FIG. 12A is a conceptual diagram illustrating identification of the targeted object at a surveillance area, such as an airport or a bus terminal, when the airport or the bus terminal is not so crowded according to various embodiments.
  • FIG. 12B is another conceptual diagram illustrating the scenario in which the surveillance area is crowded according to various embodiments.
  • FIG. 13 is a schematic showing a surveillance system according to various embodiments.
  • FIG. 14 is a schematic showing an object tracking system according to various embodiments.
  • FIG. 15 is a schematic illustrating a method of operating a surveillance system according to various embodiments.
  • Embodiments described in the context of one of the methods or surveillance systems are analogously valid for the other methods or surveillance systems. Similarly, embodiments described in the context of a method are analogously valid for a surveillance system, and vice versa.
  • the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements.
  • the term “about” or “approximately” as applied to a numeric value encompasses the exact value and a reasonable variance.
  • an object tracking system as described herein may include a memory which is for example used in the processing carried out in the object tracking system.
  • a memory used in the embodiments may be a volatile memory, for example a DRAM (Dynamic Random Access Memory) or a non-volatile memory, for example a PROM (Programmable Read Only Memory), an EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), or a flash memory, e.g., a floating gate memory, a charge trapping memory, an MRAM (Magnetoresistive Random Access Memory) or a PCRAM (Phase Change Random Access Memory).
  • Coupled may be understood as electrically coupled or as mechanically coupled, for example attached or fixed, or just in contact without any fixation, and it will be understood that both direct coupling or indirect coupling (in other words: coupling without direct contact) may be provided.
  • FIG. 1 is a system architecture diagram of a surveillance system 100 according to various embodiments.
  • the surveillance system 100 may include image capture devices 111, 112, 113 such as cameras, video cameras, or video management systems (VMS).
  • the surveillance system 100 may also include an object tracking system 200 for tracking objects.
  • the object tracking system 200 may be a computation-capable system that performs image data acquisition from the image capture devices 111, 112, 113, etc., and performs detection and analysis on the images from the image capture devices 111, 112, 113 to track an object-of-interest, i.e. the targeted object.
  • the surveillance system 100 may also include a network 140 connecting the image capture devices 111, 112, 113 etc. and the object tracking system 200.
  • the network 140 may be a connection which enables communication between devices within the surveillance system.
  • the network 140 may be a wired network, and may include a local area network (LAN), a serial communication network, and/or other suitable wired connections.
  • the network may be a wireless network, and may include a Wi-Fi network, a Bluetooth network, and/or other suitable wireless connections. Wireless transmissions may be carried out over a streaming protocol such as the Real Time Streaming Protocol (RTSP).
  • the network may include both wired and wireless connections.
  • the network 140 may include one or more direct connections between the various components of the system 100, e.g. a direct connection between each of the image capture devices 111, 112, 113 etc. and the object tracking system 200.
  • the surveillance system 100 may also include a command center terminal 130, which may be a user terminal, connected to the network 140.
  • the command center terminal 130 may be configured to display images captured by the image capture devices 111, 112, 113 etc. and the results of object tracking.
  • the command center terminal 130 may have multiple displays showing multiple camera streams at the same time to enable a user, e.g. a security officer, to monitor multiple places at the same time.
  • the command center terminal 130 may provide the user an interface or input device to perform available functions or control operations of the surveillance system 100.
  • the surveillance system 100 may also include a client terminal 121 connected to the network 140.
  • the client terminal 121 may be another user terminal or may be a portable processing device such as a mobile device.
  • the client terminal 121 may be positioned near an exit, and may typically be accessed by a ground or field officer or guard stationed at the exit.
  • the client terminal 121 may display results of object tracking.
  • FIG. 2A is a conceptual diagram illustrating identification of the targeted object according to various embodiments.
  • FIG. 2B is another conceptual diagram illustrating identification of the targeted object according to various embodiments.
  • Surveillance area 10 represents an area in which the surveillance system 100 is being deployed.
  • the area may be a park, bus terminal area, train station, airport, shopping mall, etc.
  • the surveillance area 10 may have boundaries, such as walls or fences, that are impossible or difficult for an object such as the targeted object to pass through. Exit 20 may be the only way for the targeted object to exit from the surveillance area 10.
  • the surveillance center 30 may be a building or a closed area where the object tracking system 200 and the command center terminal 130 reside.
  • the image capture device 111 may be placed near the exit point 20.
  • Image capture devices 112, 113 and other image capture devices may be placed at various locations inside the surveillance area 10.
  • the image capture devices 111, 112, 113 etc. may capture images which include objects such as people, carts, baggage, bike, etc.
  • the images captured by the image capture devices 111, 112, 113 etc. may be used by the object tracking system 200 to track the targeted object.
  • the guard station 40 may be a small post near the exit point 20 in which a guard may be stationed.
  • the client terminal 121 may be deployed at the guard station 40, so that the guard may monitor the images from image capture devices 111, 112, 113 etc., and access the tracking results of the targeted object.
  • As shown in FIGS. 2A-B, there may be a situation in which an object, e.g. bag 11, is reported stolen within the area 10 at time t0 by its owner.
  • the bag 11 may have been moved within area 10 by the criminal who has taken the bag.
  • the bag 11 is labelled in FIG. 2A as bag 11a at time t1 (a time after t0), bag 11b at time t2 (a time after t1), bag 11c at time t3 (a time after t2), and bag 11d at time t4 (a time after t3).
  • the owner may be able to identify the bag 11 from the image provided by the image capture device 113 at t0.
  • the officer at the command center 130 may set the bag 11 as a targeted object to be tracked in object tracking system 200.
  • the object tracking system 200 may be configured to determine whether the targeted object is present in one or more images captured by the one or more image capture devices 111,112, 113 etc.
  • the object tracking system 200 may be configured to extract features of the various objects (from the images provided by the capture devices 111, 112, 113 etc.), and may be configured to store the extracted features of the various objects.
  • the object tracking system 200 may be configured such that the owner or the officer may be able to provide, for instance via an input device, a key image of the targeted object and/or other relevant information.
  • the object tracking system 200 may be configured to extract features of the key image of the targeted object.
  • the object tracking system 200 may then be configured to determine whether the extracted features of the various objects correspond to the extracted features of the key image of the targeted object.
  • the object tracking system 200 may determine or calculate a vector distance between the extracted features of various objects and the extracted features of the key image of the targeted object, and may determine that the object is indeed the targeted object if the vector distance falls below a predefined threshold.
  • a low vector distance may represent high similarity between an object and the targeted object.
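The vector-distance test described above can be made concrete with a short sketch. The function names and the threshold value are illustrative assumptions, not values from the patent:

```python
from math import sqrt

def feature_distance(a, b):
    """Euclidean distance between two feature vectors (image signatures)."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_targeted_object(candidate_sig, target_sig, threshold=0.5):
    """A candidate is treated as the targeted object when the vector
    distance between its signature and the target's signature falls
    below the predefined threshold: low distance means high similarity."""
    return feature_distance(candidate_sig, target_sig) < threshold
```

Any metric over the signature space would do; Euclidean distance is used here only because it is the simplest instance of the "vector distance" named in the text.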
  • the targeted object may be tracked.
  • the guard in the guard station 40 may be informed that bag 11d has been stolen and is still within surveillance area 10.
  • the camera 111 may capture one or more images showing the bag 11d.
  • the object tracking system 200 may identify the bag 11d and may trigger an identification alert on the client terminal 121, so the guard can confirm that someone is about to bring the bag 11 out from the surveillance area 10.
  • FIG. 2C is another conceptual diagram illustrating a scenario in which an image capture device 111 is unable to capture an image showing the bag 11d according to various embodiments.
  • the bag 11d may be approaching the exit 20, but the one or more images captured at time t4 by the image capture device 111 do not show bag 11d.
  • the object tracking system 200 may have actually estimated that bag 11d should reach the exit point 20 area at t4, but is unable to identify the bag 11d based on the one or more images due to the crowded condition near the exit 20.
  • the object tracking system 200 may be configured to identify the bag 11d by extracting features of objects within the one or more images captured by the image capture device 111 and comparing the extracted features with the stored features of the targeted object. However, the object tracking system 200 may determine that the system 200 is unable to identify the bag 11d in one or more images captured by the image capture devices 111, 112, 113 etc. Upon a time period reaching a predefined duration, such as a value from 30 seconds to 3 minutes (e.g. 2 or 3 minutes), and upon a re-identification confidence level dropping below a predetermined threshold, the object tracking system 200 may be configured to provide an alert. The alert may indicate an “unable to identify” situation or a “lost track” condition.
  • the predefined duration may be calculated from when the object tracking system 200 determines the bag 11d should be near the exit 20 and should be captured by the image capture device 111.
  • the alert may trigger the guard to perform manual check on all people and/or all objects leaving exit 20.
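The timing half of this two-part alert condition can be sketched as follows. The class name, the threshold defaults, and the use of a monotonic clock are illustrative assumptions, not details from the patent:

```python
import time

class LostTrackMonitor:
    """Fires a 'lost track' alert when the targeted object has not been
    identified for at least `max_lost_seconds` AND the re-identification
    confidence has dropped below `min_confidence`, mirroring the two-part
    test described in the text (thresholds are illustrative)."""

    def __init__(self, max_lost_seconds=120.0, min_confidence=0.7):
        self.max_lost_seconds = max_lost_seconds
        self.min_confidence = min_confidence
        self.last_seen = time.monotonic()

    def update(self, identified, confidence, now=None):
        """Feed one identification attempt; returns True when the alert fires."""
        now = time.monotonic() if now is None else now
        if identified:
            self.last_seen = now  # target re-acquired, restart the clock
            return False
        lost_for = now - self.last_seen
        return lost_for >= self.max_lost_seconds and confidence < self.min_confidence
```

Note that a high confidence suppresses the alert even after a long gap, and a re-acquisition resets the timer, so brief occlusions do not trigger spurious manual checks.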
  • the object tracking system may be configured to provide the alert upon the object tracking system determining that the object tracking system is unable to identify the targeted object in one or more images captured by the image capture devices 111, 112, 113 etc., and upon a re-identification confidence level dropping below a predetermined threshold.
  • the re-identification confidence level dropping below a predetermined threshold may be due to a crowdedness level exceeding a predetermined threshold.
  • the object tracking system 200 may be configured to determine or calculate the crowdedness level by determining or calculating a number of people or objects in one image of the one or more images. Upon the crowdedness level exceeding a predetermined value, such as any number selected from 10 to 50 (e.g. 20 people or objects), and upon the object tracking system being unable to identify the targeted object in the one or more images captured by the image capture devices 111, 112, 113 etc. during the predefined duration, the object tracking system 200 may be configured to provide the alert to indicate the “unable to identify” situation or the “lost track” condition.
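The crowdedness check described above reduces to counting detections in a frame and combining the count with the identification result. A minimal sketch, with illustrative function names and a threshold of 20 taken from the example range (10 to 50) in the text:

```python
def crowdedness_level(frame_detections):
    """Crowdedness of one image, taken here as the number of detected
    people/objects in that frame."""
    return len(frame_detections)

def crowded_lost_track_alert(frame_detections, target_identified, crowd_threshold=20):
    """Flag the 'unable to identify' situation: the frame is too crowded
    for reliable re-identification AND the targeted object was not
    identified in it."""
    return crowdedness_level(frame_detections) > crowd_threshold and not target_identified
```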
  • the predefined duration for the time period and/or the predetermined threshold for the crowdedness level may be set by a user, such as a security guard.
  • the re-identification confidence level dropping below a predetermined threshold may additionally or alternatively be due to the vector distance between extracted features of multiple objects and the extracted features of the targeted object falling below the predefined threshold, thereby indicating that multiple similar objects are found by the object tracking system.
  • the object tracking system 200 may be configured to provide an indication that the targeted object is identified only when one object (i.e. a single object) is found to be similar to the targeted object during the predefined duration.
  • the re-identification confidence level may drop below the predetermined threshold, and the object tracking system 200 may be configured to provide the alert to indicate the “unable to identify” situation or the “lost track” condition.
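The ambiguity rule described above (confidence is only high when exactly one candidate matches) can be sketched as a count over per-candidate distances. Function name and threshold are illustrative:

```python
def reidentification_confident(distances, match_threshold=0.5):
    """Re-identification is treated as confident only when exactly one
    candidate's signature distance falls below the match threshold.
    Zero matches (lost track) or several matches (ambiguous, e.g. many
    visually similar bags in a crowd) both drop the confidence and
    should lead to the alert."""
    matches = [d for d in distances if d < match_threshold]
    return len(matches) == 1
```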
  • the alert may, for instance, be a visual alert (e.g. a change of colour and/or display of text), an audio alert, a physical alert (e.g. vibrations), or any combination thereof.
  • the visual alert may for instance be provided through a monitor display of the client terminal 121, and/or one or more physical tower lamps which are configured to switch between red, yellow and green.
  • the one or more physical tower lamps may be configured such that the alert is provided when the one or more physical tower lamps turn red.
  • the client terminal 121 may be a mobile device, and the alert may be provided through vibrations of the mobile device.
  • FIG. 3 is a schematic showing the object tracking system 200 according to various embodiments.
  • the object tracking system 200 may include a computer server 201.
  • the computer server 201 or object tracking system 200 may be externally connected to image capture devices 111, 112, 113 etc.
  • the computer server 201 or object tracking system 200 may be externally connected to input device 250 and display 260.
  • the input device 250 and/or the display 260 may be part of the command center terminal 130 and/or the client terminal 121, and may be connected to the computer server 201 through the network 140.
  • the input device 250 and/or the display 260 may be directly connected to the server 201.
  • the computer server 201 may include a Central Processing Unit (CPU) 210, a network device 215, an input/output (I/O) interface 220 and a storage device 230.
  • the CPU 210 may be configured to perform image and video analysis.
  • the various components of the server 201, i.e. the CPU 210, the network device 215, the I/O interface 220, and the storage device 230 may be connected to one another.
  • the CPU 210 may alternatively be a Graphics Processing Unit (GPU).
  • the storage device 230 may be a logical unit capable of storing program 231 and database 230.
  • the CPU 210 may execute or run the program 231 stored in the database 230.
  • the storage device 230 may also temporarily store processing data when the CPU is running the program 231.
  • the storage device 230 may be an internal memory such as a Random Access Memory (RAM), a Solid State Drive (SSD), or a Hard Disk Drive (HDD).
  • the storage device 230 may alternatively be a partially separated physical storage system such as a Network Attached Storage (NAS), or a Storage Array Network (SAN).
  • NAS Network Attached Storage
  • SAN Storage Array Network
  • the program 231 may execute steps of the method for tracking objects.
  • the database 230 may store information, such as user input, location, description, and time relating to the targeted object, the key image or features extracted from the targeted object, as well as the key images or features of objects extracted from the images captured by the image capture devices 111, 112, 113 etc.
  • the network device 215 may connect the server 201 to the surveillance system 100.
  • the network device 215 may be a connected Ethernet device, or a wireless network connected device, etc., connecting the object tracking system 200 / server 201 to the network 140.
  • the I/O interface 220 may perform data send and receive with the input device 250 and the display 260. Data may be transmitted from the input device 250 to the server 201 through the I/O interface 220, and data may be transmitted from the server 201 to the display 260 through the I/O interface 220. In various embodiments, different I/O interfaces 220 may connect to the input device 250 and to the display 260.
  • the I/O interface may be a serial or parallel data interface such as Universal Serial Bus (USB), or High Definition Multimedia Interface (HDMI).
  • USB Universal Serial Bus
  • HDMI High Definition Multimedia Interface
  • the I/O interface may also be a wireless connection such as Bluetooth, or Wireless LAN.
  • the display 260 may be a Liquid Crystal Display (LCD), a Plasma Display, a Cathode Ray Tube (CRT) Display, or a projector display etc.
  • the input device 250 may be a keyboard, a mouse, or a touch screen.
  • the display 260 and the input device 250 may also be implemented in a separate device such as a browser in a computer, or an application on a tablet, that is connected to the surveillance system 100 or the object tracking system 200.
  • FIG. 4 is a conceptual flow chart illustrating the operation of the object tracking system 200 according to various embodiments.
  • the one or more images, e.g. surveillance videos, may be received from image capture devices 111, 112, 113 etc. at a corresponding plurality of locations within area 10.
  • the one or more images may be sent from image capture devices 111, 112, 113 etc. through the network 140, which may include wired connections, wireless connections, or a combination of both wired and wireless connections.
  • the one or more images may be transmitted through one or more direct connections between the image capture devices 111, 112, 113 etc. and the object tracking system 200.
  • the object tracking system 200 may begin image analysis on the one or more images.
  • the system 200 may extract every image frame from the received surveillance videos and may perform image analysis on the image frames.
  • the image analysis process may involve processing the image frames (i.e. the one or more images) to detect objects, including the targeted object.
  • the image analysis may include extraction of data relating to the objects and registering or storing the extracted data in an object table in the database 230.
  • the system 200 may be configured to extract the features of various objects from the image frames, and may be configured to store the extracted features as an object table in the database 230.
  • the extracted data or features may include the signatures of the key images of the objects, which may be used to identify any unique object.
  • Image analysis process may also include utilizing an object detection neural network model trained to localize object positions within an image frame and to classify the targeted object.
  • the process of localizing and classifying the targeted object may include evaluating the image signature or features extracted from each image frame.
  • the image signature may be represented as an array of numbers.
  • the image signature of the detected objects may be registered or stored in the database 230.
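The signature step above can be made concrete with a toy stand-in. A deployed system would extract signatures with a trained detection/re-identification network; here the "signature" is just a normalised 8-bin intensity histogram (purely illustrative) so that the array-of-numbers representation mentioned in the text is concrete:

```python
def image_signature(pixels, bins=8):
    """Toy stand-in for a learned image signature: an intensity histogram
    over an object crop, normalised so crops of different sizes remain
    comparable. `pixels` is a flat list of grey values in 0..255."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = float(len(pixels)) if pixels else 1.0
    return [c / total for c in counts]  # an array of numbers, ready to store
```

The resulting list can be registered in the database exactly like the network-produced signature would be; only its discriminative power differs.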
  • the object tracking system 200 may receive a user operation input for registering an object to be tracked via the input device 250.
  • the user operation input may be received from input device 250 in the command center 130.
  • the user operation input may include selecting a key image of the targeted object for the search or identification, where the key image portrays the targeted object to be searched.
  • the object tracking system 200 may present a Graphical User Interface (GUI) to the user to facilitate defining the key image of the targeted object.
  • GUI Graphical User Interface
  • the GUI may provide an interface for the user to enter further information for the search, e.g. search time duration, camera selection etc.
  • the object tracking system may generate a query on the extracted features of various objects stored in the database 230 based on the input provided by the user, e.g. the key image of the targeted object, and/or other information.
  • the object tracking system 200 may initiate searching or tracking of an object that resembles the key image of the targeted object.
  • the object tracking system 200 may extract features of the key image of the targeted object.
  • the query may include information such as a combination of search time duration, camera selection and extracted features of the key image of the targeted object.
  • the query may also include comparing the features of the key image of the targeted object with the extracted features of various objects that are stored in the object table, and determining whether the extracted features of an object stored in the object table correspond to the extracted features of the key image of the targeted object, thereby determining whether one of the objects as represented in the object table is indeed the targeted object.
  • the query may also include comparing the extracted features of the key image of the targeted object to extracted features of various objects present in images captured by image capture devices 111, 112, 113 etc.
  • the matching process can be performed by calculating the similarity of object features or signatures, for example by calculating the vector distance between signatures or features of the key image and signatures or features of objects in the one or more images captured by the image capture devices.
  • upon a match, the object tracking system may determine that the object is the targeted object, and the location of the object may be estimated based on the location of the camera that captured the matching images.
  • the matching process may further include determining the respective accuracies of the estimated locations and selecting the estimated locations with the highest determined accuracy. Determining the accuracies of the estimated locations may include comparing the vector distance or similarity level of the signatures.
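Selecting the most accurate estimated location, as described above, amounts to keeping the camera whose match has the smallest signature distance. A sketch, where the per-camera match dictionary and camera names are hypothetical:

```python
def estimate_location(matches, camera_locations):
    """`matches` maps camera_id -> signature distance for cameras that
    reported a candidate match. The camera with the smallest distance
    (highest similarity) gives the estimated location of the target."""
    if not matches:
        return None  # lost track: no camera produced a match
    best_camera = min(matches, key=matches.get)
    return camera_locations[best_camera]

cams = {"cam111": "exit 20", "cam112": "north gate", "cam113": "platform"}
best = estimate_location({"cam112": 0.42, "cam113": 0.18}, cams)  # "platform"
```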
  • the results of the tracking and identification process may be presented through the display 260.
  • FIG. 5 shows an example of an object table 400 according to various embodiments.
  • the object table 400 may be generated in step 302 shown in FIG. 4.
  • the object table 400 may be stored in the database 230 of the object tracking system 200.
  • the column labelled as“Object ID” 401 may store object identifiers (ID).
  • the object IDs may be unique identification codes for every detected object.
  • the column labelled as“Image data” 402 may store the image data.
  • the image data may be the numerical representation of the image, for example, an array of binary code or hexadecimal code.
  • the“Image Data” column 402 may store a pointer or a web link to an archived image file.
  • the column labelled as“Image Type” 403 may store information on the type, i.e. category or classification that the detected object belongs to.
  • the detected object may be a person, a bicycle, a bag, an animal or any other type of object.
  • the column labelled as “Image Signature” 405 may contain the signature of the object, in other words, the extracted features of the object.
  • the signature may include a matrix or a vector that encodes the extracted features of the object.
  • the column labelled as “Camera ID” 406 may store a unique camera identifier that indicates the image capture device that captured the image containing the object.
  • the column indicated by “Frame ID” 407 may store the frame identifier that indicates a frame or image within the video captured by the image capture device that contains the object.
  • the frame ID may represent time information, since the sequence of the frame corresponds to the time that the image frame was captured.
  • the frame ID may be an encoding of time and date, for example in the epoch format.
  • the column labelled as “Bounding Box” 408 may store information on the bounding box of the object.
  • the bounding box may contain the x, y coordinates of its position within the frame, as well as its width and height over the frame.
  • the distance between objects within an image frame may be calculated from the bounding box of each object.
  • the column “Track ID” 409 may store an identifier of the track that each object belongs to. Each track ID value may be a unique identification number of a unique object with sequential position movement across frames. The same object captured at different times by the same image capture device may have different values under “Object ID” 401 but may have the same value under “Track ID” 409.
  • the tracking of an object across frames can be carried out using a BOOSTING tracker, a MIL tracker, Kalman filtering or any other method. Tracking among different image capture devices may also be done by estimating the arrival time of an object at a certain camera location based on its last location, movement trajectory direction, camera distance, etc.
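  • A row of object table 400 might be represented as below, together with the bounding-box distance calculation mentioned above; the field values, the dictionary representation and the use of bounding-box centres are illustrative assumptions:

```python
import math

# One row of object table 400, keyed by the column names in FIG. 5.
# All values are hypothetical placeholders.
row = {
    "object_id": "OBJ-0001",              # unique ID for every detected object
    "image_data": "s3://archive/f1.jpg",  # or a binary / hexadecimal array
    "image_type": "person",               # detected category
    "image_signature": [0.12, 0.88, 0.31],  # extracted feature vector
    "camera_id": "111",
    "frame_id": 1595923200,               # epoch-style time encoding
    "bounding_box": (40, 60, 32, 96),     # x, y, width, height within the frame
    "track_id": "TRK-17",                 # shared across frames for one moving object
}

def bbox_center(box):
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def object_distance(box_a, box_b):
    """In-frame distance between two objects, from their bounding-box centres."""
    (ax, ay), (bx, by) = bbox_center(box_a), bbox_center(box_b)
    return math.hypot(ax - bx, ay - by)

print(object_distance((0, 0, 10, 10), (30, 40, 10, 10)))  # centres are 50 apart
```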
  • FIG. 6 shows an example of a tracked object table 500 according to various embodiments.
  • This table 500 may hold status information on the tracking operation for targeted objects.
  • the column labelled as “Object Query ID” 501 is a unique identifier (ID) that differentiates one query object, i.e. targeted object, from another. It may correspond to “Object ID” 401 in object table 400.
  • the columns “Image Data” 502, “Image Type” 503, “Image Signature” 504, “Camera ID” 505, “Frame ID” 506, and “Bounding Box” 507 may serve similar functions as the respective columns 402-408 of table 400 shown in FIG. 5.
  • the column labelled as “Tracking State” 508 may indicate information regarding the state of a targeted object, i.e. whether the targeted object has been found or not.
  • the value may be a text representation. For example, “found” may be used to show that the targeted object has already been found, so the object tracking system will no longer perform tracking of this targeted object.
  • the state may be displayed as “ongoing”, which means that the object tracking system is performing search, tracking and identification of the targeted object.
  • the column labelled as “Track ID” 509 may be similar to the column 409 in table 400.
  • the object tracking system may try to search for the same object in all of the image capture devices. If there is an object in an image captured by an image capture device which is considered to be similar to the query object, the track ID of the object may be added to this list. Different track IDs corresponding to different image capture devices may be used to indicate the same object.
  • a track ID may correspond to the query object at different locations at different points in time. As such, multiple track IDs for the same query object in column 509 may provide a moving trace of the query object.
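  • The moving trace described above might be derived as in the following sketch, where each track ID ties the query object to one camera's view and the frame ID supplies the time ordering; the camera-to-location mapping and all values are hypothetical placeholders:

```python
# Hypothetical mapping from camera IDs to physical locations.
CAMERA_LOCATION = {"111": "exit 20", "112": "hall", "113": "entrance 21"}

# Sightings of one query object: (frame_id as epoch time, camera_id, track_id),
# i.e. the multiple track IDs held in column 509 for that object.
sightings = [
    (1595923100, "113", "TRK-03"),
    (1595923160, "112", "TRK-11"),
    (1595923220, "111", "TRK-17"),
]

def moving_trace(sightings):
    """Return the query object's locations in time order."""
    return [CAMERA_LOCATION[cam] for _, cam, _ in sorted(sightings)]

print(moving_trace(sightings))  # entrance -> hall -> exit
```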
  • the targeted object may be estimated to appear at exit point 20 at a certain time.
  • the system 100 may perform identification of the object based on images captured by image capture device 111, located near the exit, during a predetermined time period. When the object is identified, an indication may be raised to inform the guard at guard station 40, so that the guard may stop the corresponding person carrying the object.
  • FIG. 7 is a conceptual flow chart illustrating the triggering of an alert upon the object tracking system 200 determining that it is unable to identify the targeted object, according to various embodiments.
  • the object tracking system 200 may display an alert indicating that the system is unable to identify the targeted object (i.e. a “Catch All” alert) in step S602.
  • the alert is different from the indication provided when the system 200 is able to identify the targeted object, and may trigger the guard to manually check all objects within the vicinity (in step S603).
  • the alert may include, for instance, the display of text such as “Catch All” or “Please Perform Manual Check On All People In Vicinity” or “Suspected Item May Be Within Crowd”.
  • the text may be shown on a monitor display of the client terminal.
  • the alert may additionally or alternatively be another visual alert such as a change of colour, an audio alert, a physical alert (e.g. vibrations), or any combination thereof.
  • the visual alert may, alternatively or in addition to being provided through the monitor display of the client terminal 121, be provided via one or more physical tower lamps which are configured to switch between red, yellow and green.
  • the one or more physical tower lamps may be configured such that the alert is provided when the one or more physical tower lamps turn red.
  • the client terminal may be a mobile device, and the alert may be provided through vibrations of the mobile device.
  • the guard noticing the alert may instruct all the people approaching the exit to line up, and may then manually check each person for the targeted object one at a time.
  • where the targeted object could not be identified due to a crowded condition in which the targeted object was covered by other objects, i.e. occluded, there may be a higher chance that the targeted object becomes easier to identify after entering the “Catch All” condition, where people are lining up.
  • the system 200 may continue to try to identify the object and display the indication or alert accordingly upon its identification.
  • the officer may perform a graphical user interface (GUI) operation to set that the targeted object has been found, which may trigger the system to remove the “Catch All” alert and stop the tracking of the targeted object.
  • the system 200 may perform additional identification for non-targeted objects and accordingly display “non-targeted” labels or notifications for these objects after the system 200 has evaluated or determined that these objects have very low similarity to the targeted object.
  • the system may also trigger the sending of a backup request message over the network to the command center to request the dispatching of more officers to help with the manual check for the targeted object.
  • the backup request message may indicate a request for backup after the “Catch All” alert is provided, and may be based on existing or pre-defined communication protocols.
  • the system 200 may trigger the sending of one or more backup request messages over the network 140 to the command center 130 to request dispatching of more security officers to help with the manual checking of the targeted object.
  • a state where an object is expected to be found in a certain camera image but cannot be identified may arise when, in an image frame or in a sequence of image frames, the targeted object is occluded by other objects, resulting in the determination that an object having high similarity to the targeted object does not exist in the frame or frames.
  • a threshold value may be set to define a state indicating that the targeted object cannot be identified (e.g. an “Object Cannot Be Identified” or “Lost Track” state), and to initiate the “Catch All” steps as indicated in S601 - S602.
  • The “Object Cannot Be Identified” state may also arise as a result of an estimation of the crowdedness in a predefined area, e.g. a predefined area near the exit 11.
  • the crowdedness level may be determined by counting the number of objects being detected in an image captured by the image capture device. If the number of objects exceeds a certain predetermined number, combined with the above condition where the system cannot find any object with similarity above the threshold value, the system may trigger the “Object Cannot be Identified” state in step S601, followed by step S602.
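  • A possible sketch of this trigger condition, assuming each detection in the frame has already been scored by vector distance to the targeted object; the threshold and crowdedness values are illustrative, not taken from the embodiments:

```python
# Illustrative values: a detection whose distance to the target falls below
# the similarity threshold counts as a match; crowdedness is the detection count.
SIMILARITY_DISTANCE_THRESHOLD = 0.3
CROWDEDNESS_LIMIT = 20

def should_trigger_catch_all(distances_in_frame):
    """Enter the "Object Cannot be Identified" state (step S601) when no
    detection is similar enough to the target AND the frame is crowded."""
    no_match = all(d > SIMILARITY_DISTANCE_THRESHOLD for d in distances_in_frame)
    crowded = len(distances_in_frame) > CROWDEDNESS_LIMIT
    return no_match and crowded

# 25 detections, none similar to the target -> trigger the "Catch All" steps.
print(should_trigger_catch_all([0.8] * 25))           # True
print(should_trigger_catch_all([0.1] + [0.8] * 24))   # False: a match exists
```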
  • the object tracking system 200 may also be in the “Object Cannot be Identified” state upon manual input by the user.
  • the guard or security officer may decide that he needs to check all the objects in the vicinity, and/or trigger any additional steps further to providing the alert.
  • the object tracking system 200 may be triggered to be in the “Object Cannot be Identified” state when multiple objects which are reasonably similar are identified within the same image frame, thus raising the possibility that the system may provide a false indication. When this situation happens, the system 200 may produce a different alert or an additional alert.
  • the system 200 may be configured to display text such as “Multiple Similar Objects Found” or “Targeted Object Among Candidates” to inform the guard or security officer. At the same time, the system 200 may display bounding boxes overlaying each of the similar objects, until all these similar objects have been cleared, i.e. manually checked.
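  • The multiple-similar-objects condition might be detected as in the following sketch; the object IDs and the distance threshold are illustrative assumptions:

```python
SIMILARITY_DISTANCE_THRESHOLD = 0.3  # illustrative value

def similar_candidates(detections):
    """Return the object IDs whose vector distance to the target falls below
    the threshold; two or more candidates means the system cannot tell which
    one is the targeted object."""
    return [obj_id for obj_id, dist in detections
            if dist < SIMILARITY_DISTANCE_THRESHOLD]

# Hypothetical scored detections in one frame: two candidates are too similar.
dets = [("OBJ-1", 0.12), ("OBJ-2", 0.75), ("OBJ-3", 0.28)]
candidates = similar_candidates(dets)
if len(candidates) > 1:
    print("Targeted Object Among Candidates:", candidates)
```

A bounding box would then be drawn over each returned candidate until all are manually cleared.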
  • FIG. 8 shows an exemplary screen of a Graphical User Interface (GUI) of the surveillance system 100 when the object tracking system 200 is able to identify the targeted object according to various embodiments.
  • FIG. 9 shows another exemplary screen of the Graphical User Interface (GUI) of the surveillance system 100 when the object tracking system 200 is unable to identify the targeted object according to various embodiments.
  • the GUI may, for instance, be a display at the client terminal 121.
  • the display may for instance include a query list 710 indicating the targeted object, as well as images 720, e.g. a video feed, captured by the image capture device.
  • a bounding box corresponding to or surrounding the actual detected position of the targeted object is shown.
  • a message 730, for example “Suspected Item Detected”, may be shown when the suspected item, i.e. the targeted object, is identified.
  • a different message 731, for example, “Suspected Item May Be Within Crowd”, may be shown.
  • the GUI may also display a large bounding box covering a group of objects in which the object may exist.
  • FIG. 10 shows yet another exemplary screen of the Graphical User Interface (GUI) of the surveillance system 100 when multiple objects similar to the targeted object have been identified according to various embodiments.
  • the display, e.g. a display at client terminal 121, may show multiple bounding boxes surrounding or at the positions of the various objects that are similar to the targeted object.
  • the screen may also display a message 733, for example “Targeted Object Among Candidates”.
  • the guard or security officer may manually check all objects until either the system 200 is able to identify which object is the targeted object, or the guard or security officer manually sets the object’s tracking state to “found”.
  • FIG. 11 is a system architecture diagram of a surveillance system 800 according to various embodiments. Similar to the surveillance system 100, the surveillance system 800 may include one or more image capture devices 111, 112, 113 etc., an object tracking system 270, and a network 140 connecting the one or more image capture devices 111, 112, 113 etc. to the object tracking system 270. The surveillance system 800 may also include a command center terminal 130 and a client terminal 121 connected to the network 140.
  • the object tracking system 270 may have similar hardware components as the object tracking system 200.
  • the object tracking system 270 may be configured to activate other components within the system 800, e.g. sending a signal to additional standby image capture devices 811, 812 and 813 etc. to activate these additional standby image capture devices 811, 812 and 813 etc.
  • additional standby image capture devices 811, 812 and 813 etc. may be surveillance cameras, and may be connected to the object tracking system 270 via the network 140.
  • These additional standby image capture devices 811, 812 and 813 etc. may be located near the exit of the surveillance area.
  • the standby image capture devices 811, 812 and 813 etc. may, by default, be turned off to lower energy consumption of the surveillance system 800, but may be activated by the object tracking system 270.
  • the standby image capture devices 811, 812 and 813 etc. may be activated by a trigger, which may be, for instance, a Wake on LAN (Local Area Network) mechanism, or a message being sent to the command center so that the security officer at the command center can turn on the additional standby image capture devices 811, 812 and 813 etc.
  • the surveillance system 800 may have access to more capture angles, thus increasing the possibility to identify the targeted object.
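  • The Wake on LAN activation mentioned above could be sketched as follows; the MAC address is a placeholder, and a real standby camera would need to support Wake on LAN. The magic packet format itself (six 0xFF bytes followed by the target's MAC address repeated sixteen times, broadcast over UDP) is standard:

```python
import socket

def build_magic_packet(mac):
    """Wake-on-LAN payload: 6 x 0xFF, then the MAC address repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake_standby_camera(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet over UDP so the standby camera powers on."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))

# The packet is 6 + 16 * 6 = 102 bytes for any MAC address.
print(len(build_magic_packet("aa:bb:cc:dd:ee:ff")))
```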
  • the system 800 may have additional tracking devices, such as body worn cameras worn by security officers, mobile phone cameras held by the security officers, and/or smart glasses with cameras connected to the object tracking system 270 via the network 140. These additional tracking devices may also be activated when the object tracking system 270 is in the “Object Cannot be Identified” state.
  • the tracking system 270 may be configured to generate Augmented Reality (AR) images on the displays of these devices to show where the targeted object might be hidden. For example, arrows, texts or bounding boxes may be shown in the displays in real time. Additionally, during the “Catch All” state, the tracking system 270 may perform a video analysis process, for example, to identify the pose and/or posture of people that may indicate they are holding the targeted object. The tracking system 270 may additionally or alternately perform a video analysis process on a group of objects (e.g. carts containing multiple items etc.) to determine whether the targeted object is within the group of objects.
  • the surveillance system 800 may include another mode of identification, such as Radio Frequency Identifier (RFID) gates or gantries 820, X-ray scanner gates 830, and/or any other suitable devices.
  • the other mode of identification may be connected to the object tracking system 270 via the network 140.
  • the system 270 may also activate the RFID gates or gantry 820, which are able to detect whether a certain RFID tag is passing through.
  • the targeted object may be known to have a particular RFID tag attached to the object.
  • the system 270 may activate the x-ray scanner gate to scan various objects to generate x-ray image profiles for determining if any of the various objects is the targeted object.
  • the targeted object may be known to have a certain x-ray image profile.
  • the system 270 may return to the normal identification state.
  • FIG. 12A is a conceptual diagram illustrating identification of the targeted object at a surveillance area 10’, such as an airport or a bus terminal, when the airport or the bus terminal is not so crowded according to various embodiments.
  • FIG. 12B is another conceptual diagram illustrating the scenario where the surveillance area 10’ is crowded according to various embodiments.
  • the surveillance area 10’ in FIGS. 12A-B may be monitored by a surveillance system which includes image capture devices 111’, 112’, 113’ etc., an object tracking system, and a network connecting the image capture devices 111’, 112’, 113’ etc. with the object tracking system.
  • the surveillance system may include one or more x-ray scanner gates 830’.
  • FIG. 13 is a schematic showing a surveillance system 900 according to various embodiments.
  • the surveillance system 900 may include one or more image capture devices 910.
  • the surveillance system 900 may also include an object tracking system 920 connected to the one or more image capture devices 910 through a network.
  • the object tracking system 920 may be configured to provide an alert upon the object tracking system 920 determining that the object tracking system 920 is unable to identify a targeted object in one or more images captured by the one or more image capture devices 910 when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • the alert may, for instance, be a visual alert (e.g. a change in colour and/or display of text), an audio alert, a physical alert (e.g. vibrations), or any combination thereof.
  • the alert may, for instance, include text such as “Catch All” or “Please perform manual check to all people in vicinity” or “Suspected item may be within the crowd”.
  • the visual alert may additionally or alternatively be sent using one or more physical tower lamps which are configured to switch between red, yellow and green.
  • the one or more physical tower lamps may be configured such that the alert is provided when the one or more physical tower lamps turn red.
  • the client terminal 121 may be a mobile device, and the alert may be provided through vibrations of the mobile device.
  • the surveillance system may, for instance, be a surveillance system 100 as shown in FIG. 1, a surveillance system 800 shown in FIG. 11, or a surveillance system as illustrated in FIGS. 12A-B.
  • the one or more image capture devices 910 may refer to the one or more image capture devices 111, 112, 113 etc. in FIG. 1 or FIG. 11, or devices 111’, 112’, 113’ etc. as shown in FIGS. 12A-B.
  • the object tracking system 920 may refer to the object tracking system 200 shown in FIG. 1, the object tracking system 270 shown in FIG. 11, or the object tracking system as described by FIGS. 12 A-B.
  • the network may refer to the network 140 as shown in FIG. 1 or FIG. 11, or the network as described by FIGS. 12 A-B.
  • various embodiments may relate to a surveillance system which includes one or more capture devices 910 and an object tracking system 920 connected to the one or more capture devices 910.
  • the object tracking system 920 may provide an alert if two conditions are met: (1) the system 920 is unable to identify a targeted object in images provided by the one or more capture devices during a predefined time duration; and (2) a re-identification confidence level drops below a predetermined threshold.
  • the time period may be calculated from a time in which the system 920 determines the targeted object should be in a predefined area, e.g. an area near the exit.
  • the predefined time duration may be any time value selected from 5 seconds to 3 minutes, e.g. 2 or 3 minutes.
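  • The time-based condition might be checked as in the following sketch, assuming the system records when the targeted object was expected to reach the predefined area; the 120-second duration is one illustrative value from the stated range of 5 seconds to 3 minutes:

```python
import time

PREDEFINED_DURATION = 120  # seconds; any value from 5 s to 3 min may be used

def alert_due(expected_at_exit, identified, now=None):
    """Raise the alert once the object should have reached the predefined
    area but has remained unidentified for the predefined duration."""
    now = time.time() if now is None else now
    return (not identified) and (now - expected_at_exit >= PREDEFINED_DURATION)

t0 = 1_595_923_200  # hypothetical epoch time the object should reach the exit
print(alert_due(t0, identified=False, now=t0 + 60))    # False: still waiting
print(alert_due(t0, identified=False, now=t0 + 180))   # True: 3 min elapsed
print(alert_due(t0, identified=True, now=t0 + 180))    # False: already found
```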
  • the re-identification confidence level dropping below a predetermined threshold may be due to a crowdedness level exceeding a predetermined value.
  • the crowdedness level may be determined based on a number of objects or people in an image captured by the one or more image capture devices 111, 112, 113 etc.
  • the predetermined value may, for instance, be any number selected from 10 to 50 people or objects, e.g. 20 people or objects.
  • the surveillance system 900 may further include a command center terminal connected to the network. In various embodiments, the surveillance system 900 may also include a client terminal connected to the network. In various embodiments, the surveillance system may include a plurality of client terminals connected to the network.
  • the object tracking system may be configured to identify the targeted object by determining a vector distance between extracted features of various objects contained in the one or more images, and extracted features of a key image of the targeted object provided to the object tracking system 920.
  • a low vector distance may indicate that an object is similar to the targeted object.
  • the object tracking system 920 may be configured to provide an indication that the targeted object is identified (e.g. “Suspected Item Detected”) upon the object tracking system 920 determining that the vector distance between the extracted features of one object of the various objects and the extracted features of the targeted object falls below a predefined threshold.
  • the re-identification confidence level dropping below the predetermined threshold may be due to the vector distance between extracted features of multiple objects of the various objects and the extracted features of the targeted object falling below the predefined threshold, thereby indicating that multiple similar objects are found by the object tracking system 920.
  • the object tracking system 920 may be configured to provide the alert upon the object tracking system 920 determining during the predefined duration that the vector distance between extracted features of multiple objects of the various objects and the extracted features of the targeted object falls below the predefined threshold.
  • the re-identification confidence level dropping below the predetermined threshold may also be due to none of the various objects having extracted features in which the vector distance between these features and the features of the targeted object falls below the predetermined threshold, thereby indicating that no similar object is found by the object tracking system 920.
  • the object tracking system 920 may be further configured to identify one or more non-targeted objects in the one or more images captured by the one or more image capture devices 910 after providing the alert. Identifying the one or more non-targeted objects may help to filter out objects that are of low similarity to the targeted object, and may help facilitate manual checking at the exit and shorten the queue.
  • the object tracking system may be further configured to transmit a backup request message over the network after providing the alert.
  • the backup request message may indicate a request for backup, and may be based on existing or pre-defined protocols.
  • the backup request message may, for instance, request additional manpower to be dispatched to the exit to help with manual checking.
  • the surveillance system 900 may also include one or more standby image capture devices connected to the network.
  • the object tracking system 920 may be further configured to activate the one or more standby image capture devices after providing the alert.
  • the one or more standby image capture devices may be deactivated under normal circumstances to save power, but may be activated upon the object tracking system 920 determining that the object tracking system 920 is unable to identify the targeted object in the one or more images captured by the one or more image capture devices 910.
  • By activating the one or more standby image capture devices, a greater number of viewing angles of the exit can be captured, thus facilitating the identification of the targeted object.
  • the object tracking system 920 may continue identifying the targeted object in images provided by the one or more image capture devices 910, as well as any activated standby image capture device after providing the alert.
  • the surveillance system 900 may also include one or more tracking devices connected to the network.
  • the object tracking system 920 may be further configured to activate the one or more tracking devices after providing the alert.
  • the one or more tracking devices may include one or more body cameras, one or more mobile phone cameras, and/or one or more smart glasses with cameras. The one or more tracking devices may also help to facilitate identification of the targeted object.
  • the surveillance system 900 may additionally include one or more radio frequency identifier (RFID) gates connected to the network.
  • the object tracking system 920 may be further configured to activate the one or more radio frequency identifier (RFID) gates after providing the alert.
  • the surveillance system 900 may further include one or more x-ray scanner gates connected to the network.
  • the object tracking system 920 may be further configured to activate the one or more x-ray scanner gates after providing the alert.
  • the activation of the RFID gates and/or the x-ray scanner gates may provide more means to identify the targeted object, i.e. through an RFID tag known to be attached to the targeted object and/or through a known x-ray profile of the targeted object.
  • the object tracking system 920 may additionally include a processor configured to identify the targeted object in the one or more images captured by the one or more image capture devices 910.
  • the processor may, e.g. by executing a program, be configured to generate the alert upon the processor determining that the processor is unable to identify the targeted object in the one or more images captured by the one or more image capture devices, and upon the time period reaching the predefined duration or the crowdedness level exceeding the predetermined threshold.
  • the object tracking system 920 may include a storage device connected to the processor.
  • the storage device may include a database configured to store features of objects extracted from the one or more images captured by the one or more image capture devices 910.
  • the storage device may also store the program for the processor to execute to identify the targeted object.
  • the object tracking system 920 may include the network. In various embodiments, the object tracking system 920 may include a network device connected to the network.
  • FIG. 14 is a schematic showing an object tracking system 1020 according to various embodiments.
  • the object tracking system 1020 may include a processor 1010 configured to identify a targeted object in one or more images.
  • the processor 1010 may be configured to generate an alert upon the processor determining that the processor is unable to identify the targeted object in the one or more images when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • the processor 1010 may be configured to identify the targeted object by determining a vector distance between extracted features of various objects contained in the one or more images, and extracted features of a key image of the targeted object provided to the object tracking system 1020.
  • the object tracking system 1020 or the processor 1010 may include an image processing module configured to extract the features of the various objects contained in the one or more images, and/or the features of the targeted object contained in the key image.
  • the object tracking system 1020 may include a database to store the extracted features of the various objects and/or the extracted features of the targeted object.
  • the processor 1010 or the object tracking system 1020 may include a clock module to keep track of the time period.
  • the re-identification confidence level dropping below the predetermined threshold may be due to the vector distance between extracted features of multiple objects of the various objects and the extracted features of the targeted object falling below the predefined threshold, thereby indicating that multiple similar objects are found by the object tracking system. Additionally or alternatively, the re-identification confidence level dropping below the predetermined threshold may be due to a crowdedness level detected by the processor 1010 or the object tracking system 1020 exceeding a predetermined value.
  • the re-identification confidence level dropping below the predetermined threshold may also be due to none of the various objects having extracted features in which the vector distance between these features and the features of the targeted object falls below the predetermined threshold, thereby indicating that no similar object is found by the object tracking system 1020 or the processor 1010.
  • the processor 1010 or the object tracking system 1020 may be further configured to identify one or more non-targeted objects in the one or more images after providing the alert.
  • the processor 1010 or the object tracking system 1020 may be further configured to transmit a backup request message after providing the alert.
  • FIG. 15 is a schematic illustrating a method of operating a surveillance system according to various embodiments.
  • the method may include, in S1101, capturing one or more images using one or more image capture devices of the surveillance system via a network of the surveillance system.
  • An object tracking system of the surveillance system, the object tracking system connected to the network, may be configured to provide an alert upon the object tracking system determining that the object tracking system is unable to identify a targeted object in the one or more images captured by the one or more image capture devices when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • the object tracking system may be configured to identify the targeted object by determining a vector distance between extracted features of various objects contained in the one or more images, and extracted features of a key image of the targeted object provided to the object tracking system.
  • the object tracking system may be configured to provide an indication that the targeted object is identified upon the object tracking system determining that the vector distance between the extracted features of one object of the various objects and the extracted features of the targeted object falls below a predefined threshold.
  • the re-identification confidence level dropping below the predetermined threshold may be due to the vector distance between extracted features of multiple objects of the various objects and the extracted features of the targeted object falling below the predefined threshold, thereby indicating that multiple similar objects are found by the object tracking system.
  • the re-identification confidence level dropping below the predetermined threshold may be due to a crowdedness level detected by the processor or the object tracking system exceeding a predetermined value.
  • the re-identification confidence level dropping below the predetermined threshold may also be due to none of the various objects having extracted features in which the vector distance between these features and the features of the targeted object falls below the predetermined threshold, thereby indicating that no similar object is found by the object tracking system or the processor.
  • the object tracking system may be further configured to transmit a backup request message over the network after providing the alert.
  • the object tracking system may be further configured to identify one or more non-targeted objects from the one or more images captured by the one or more image capture devices after providing the alert.
  • the object tracking system may be further configured to activate one or more standby image capture devices of the surveillance system, the one or more standby image capture devices connected to the network, after providing the alert.
  • the object tracking system may be further configured to activate one or more tracking devices of the surveillance system, the one or more tracking devices connected to the network, after providing the alert.
  • the one or more tracking devices may include one or more body cameras, one or more mobile phone cameras, and/or one or more smart glasses with cameras.
  • the object tracking system may be further configured to activate one or more radio frequency identifier (RFID) gates, the one or more radio frequency identifier (RFID) gates connected to the network, after providing the alert.
  • the object tracking system may be further configured to activate one or more x-ray scanner gates, the one or more x-ray scanner gates connected to the network, after providing the alert.
  • the object tracking system may include a processor configured to identify the targeted object in the one or more images captured by the one or more image capture devices.
  • the processor may be configured to generate the alert upon the processor determining that the processor is unable to identify the targeted object in the one or more images captured by the one or more image capture devices, and upon the time period reaching the predefined duration or the crowdedness level exceeding the predetermined threshold.
  • the object tracking system may include a storage device connected to the processor.
  • the storage device may include a database configured to store features of objects extracted from the one or more images captured by the one or more image capture devices.
  • the object tracking system may include a network device connected to the network.
  • Various embodiments may relate to a method of operating an object tracking system.
  • the method may include providing one or more images to a processor of the object tracking system.
  • the processor may be configured to identify a targeted object in the one or more images.
  • the processor may be configured to generate an alert upon the processor determining that the processor is unable to identify the targeted object in the one or more images when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • Various embodiments may provide a method of forming a surveillance system.
  • the method may include connecting one or more image capture devices and an object tracking system via a network.
  • the object tracking system may be configured to provide an alert upon the object tracking system determining that the object tracking system is unable to identify a targeted object in one or more images captured by the one or more image capture devices when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
  • the method may also include connecting a command center terminal to the network.
  • the method may further include connecting one or more client terminals to the network.
  • Various embodiments may provide a method of forming an object tracking system.
  • the method may include providing a processor in the object tracking system, the processor configured to receive one or more images and further configured to identify a targeted object in the one or more images.
  • the processor may be configured to generate an alert upon the processor determining that the processor is unable to identify the targeted object in the one or more images when a time period reaches a predefined duration, and upon a re-identification confidence level dropping below a predetermined threshold.
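The identification and alert logic summarized in the bullets above can be sketched as follows. This is a minimal illustration only: the Euclidean distance metric, the threshold values, and all names (`reid_decision`, `dist_threshold`, `conf_threshold`, `elapsed_s`) are assumptions made for the sketch, not details taken from the application.

```python
import numpy as np

def reid_decision(gallery_feats, key_feat, dist_threshold=0.5,
                  conf_threshold=0.6, elapsed_s=0.0, max_elapsed_s=30.0):
    """Return (matching indices, confidence, alert flag) for one frame batch.

    gallery_feats: (N, D) array of features extracted from detected objects.
    key_feat: (D,) feature vector of the key image of the targeted object.
    """
    # Vector distance between each detected object and the targeted object.
    dists = np.linalg.norm(gallery_feats - key_feat, axis=1)
    matches = np.flatnonzero(dists < dist_threshold)
    if len(matches) == 1:
        # Unique match: confidence grows as the distance shrinks.
        confidence = float(1.0 - dists[matches[0]] / dist_threshold)
    else:
        # No match found, or multiple similar objects found: low confidence.
        confidence = 0.0
    # Alert only when confidence is below the threshold AND the target has
    # stayed unidentified for the predefined duration.
    alert = bool(confidence < conf_threshold and elapsed_s >= max_elapsed_s)
    return matches, confidence, alert
```

Note that the confidence collapses both when no object is close enough to the key image and when several objects are, which mirrors the two failure cases described in the bullets (target lost, and multiple similar objects found).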

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

According to various embodiments, the present invention relates to a surveillance system. The surveillance system may include one or more image capture devices. The surveillance system may also include an object tracking system connecting the one or more image capture devices via a network. The object tracking system may be configured to provide an alert when the object tracking system determines that the object tracking system is unable to identify a targeted object in one or more images captured by the one or more image capture devices, and when a re-identification confidence level falls below a predetermined threshold.
PCT/SG2020/050418 2019-07-24 2020-07-17 Surveillance system, object tracking system and method of operating the same WO2021015672A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201906847U 2019-07-24
SG10201906847U 2019-07-24

Publications (1)

Publication Number Publication Date
WO2021015672A1 true WO2021015672A1 (fr) 2021-01-28

Family

ID=74194313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2020/050418 WO2021015672A1 2019-07-24 2020-07-17 Surveillance system, object tracking system and method of operating the same

Country Status (1)

Country Link
WO (1) WO2021015672A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022201582A1 * 2021-03-24 2022-09-29 Hitachi, Ltd. Object tracking system and method
CN115834453A * 2023-02-14 2023-03-21 浙江德塔森特数据技术有限公司 Protocol detection method for a handheld protocol detection terminal, and handheld protocol detection terminal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999059116A1 * 1998-05-08 1999-11-18 Primary Image Limited Method and apparatus for motion detection in a surveillance area
US20040161133A1 * 2002-02-06 2004-08-19 Avishai Elazar System and method for video content analysis-based detection, surveillance and alarm management
US20100157064A1 * 2008-12-18 2010-06-24 Industrial Technology Research Institute Object tracking system, method and smart node using active camera handoff
CN104168444A * 2013-05-17 2014-11-26 浙江大华技术股份有限公司 Target tracking method for a tracking dome camera, and tracking dome camera
CN105759839A * 2016-03-01 2016-07-13 深圳市大疆创新科技有限公司 Visual tracking method and apparatus for an unmanned aerial vehicle, and unmanned aerial vehicle
CN106570478A * 2016-11-04 2017-04-19 北京智能管家科技有限公司 Method and apparatus for determining target loss in visual tracking
CN106937077A * 2015-12-29 2017-07-07 天津市亚安科技有限公司 Automatic tracking panoramic pan-tilt monitoring system and method

Similar Documents

Publication Publication Date Title
US9881216B2 (en) Object tracking and alerts
US10255793B2 (en) System and method for crime investigation
US9472072B2 (en) System and method of post event/alarm analysis in CCTV and integrated security systems
US9754630B2 (en) System to distinguish between visually identical objects
US11710397B2 (en) Theft prediction and tracking system
US11270562B2 (en) Video surveillance system and video surveillance method
US20140267738A1 (en) Visual monitoring of queues using auxillary devices
EP3099061A1 Image search system and method
JP4925419B2 Information collection system and mobile terminal
US11450197B2 (en) Apparatus and method of controlling a security system
US20240161592A1 (en) Proactive loss prevention system
EP2942759A1 System and method of dynamic subject tracking and multi-tagging in an access control system
WO2021015672A1 Surveillance system, object tracking system and method of operating the same
CN105580058B (zh) 移动终端安全系统
US11450186B2 (en) Person monitoring system and person monitoring method
Denman et al. Automatic surveillance in transportation hubs: No longer just about catching the bad guy
US10319204B1 (en) Systems and methods for retracing shrink events
WO2020145883A1 Object tracking systems and methods of tracking an object
WO2018210039A1 Data processing method, data processing device, computer device, and storage medium
WO2020145882A1 Object tracking systems and methods of tracking a target object
EP3806053A1 Cross-reference to related applications
KR101194177B1 Intelligent surveillance system comprising heterogeneous sensors operating asynchronously
CN114549580A Method and apparatus for implementing indoor monitoring, server, and client

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20844268

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20844268

Country of ref document: EP

Kind code of ref document: A1