EP2805188A1 - Apparatus for detecting humans on conveyor belts using one or more imaging devices - Google Patents

Apparatus for detecting humans on conveyor belts using one or more imaging devices

Info

Publication number
EP2805188A1
EP2805188A1
Authority
EP
European Patent Office
Prior art keywords
objects
images
image
sequence
specified class
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13738858.3A
Other languages
German (de)
English (en)
Other versions
EP2805188A4 (fr)
Inventor
Mohamed Shehata
Tamer Mohamed
Wael Badawy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intelliview Technologies Inc
Original Assignee
Intelliview Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/351,103 external-priority patent/US9208554B2/en
Priority claimed from CA2764192A external-priority patent/CA2764192C/fr
Application filed by Intelliview Technologies Inc filed Critical Intelliview Technologies Inc
Publication of EP2805188A1 publication Critical patent/EP2805188A1/fr
Publication of EP2805188A4 publication Critical patent/EP2805188A4/fr
Current legal status: Withdrawn

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64F GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F1/00 Ground or aircraft-carrier-deck installations
    • B64F1/36 Other airport installations
    • B64F1/368 Arrangements or installations for routing, distributing or loading baggage
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06 Recognition of objects for industrial automation

Definitions

  • North American airports are employing new methods for baggage check-in and shipping.
  • a passenger is typically able to buy the ticket and check in online.
  • the only remaining step is to handle the passenger baggage.
  • passengers are now responsible for taking the baggage to a conveyor belt. The consequence of this is that a part of the conveyor belt is accessible to the public, which causes many issues, including safety and security issues.
  • a method and system for responding to the presence in a scene of a member of a specified class of objects, the method comprising acquiring an image of the scene using a sensor, identifying in a computer system one or more objects in the image, each object having a radiation intensity distribution, determining a variance of the radiation intensity distribution for each object, and classifying each object according to the variance of the radiation intensity distribution determined for the respective object, and for each object, taking an action if the respective object is classified as one of the specified class of objects.
  • the sensor may be a thermal imaging device, the image may be a thermal image and the radiation intensity distribution may be a heat intensity distribution.
  • Stationary heat sources may be subtracted from the thermal image before identifying objects in the thermal image.
  • a color image of the scene may further be acquired using a color imaging device, and in the computer system, the one or more objects identified by the computer system in the thermal image may be identified in the color image, a color histogram of each object may be analyzed, and each object classified according to a quantization of colors determined from the color histogram of the respective object.
  • the specified class of objects may be humans. Each object may be classified as human if the variance of the image intensity distribution for the respective object falls within a predetermined range.
  • the scene may be a view of a conveyor belt.
  • the action may comprise stopping the conveyor belt.
  • the action may comprise alerting operating personnel.
  • An edge filter may be applied to each object and each object classified according to a number of edge-like features of the object detected by the edge filter.
  • a method for responding to the entry of a member of a specified class of objects into an area comprising acquiring a first sequence of images of a scene with a first imaging device oriented in a manner suitable for detecting members of the specified class of objects, acquiring a second sequence of images of the scene with a second imaging device oriented in a manner suitable to detect whether blobs detected in the second sequence of images are within the area, detecting in a computer system members of the specified class of objects in the first sequence of images, for each member of the specified class of objects detected in the first sequence of images detecting in the computer system a corresponding blob in the second sequence of images, and detecting in the computer system, for each blob corresponding to a member of the specified class of objects, when the respective blob enters the area, and taking an action when a blob corresponding to a member of the specified class of objects is detected to enter the area.
  • the specified class of objects may be humans.
  • the humans may be detected based on an upright human body classifier.
  • the upright human body classifier may be based on a histogram of oriented gradients.
  • the upright human body classifier may be based on optical flow patterns.
  • the upright human body classifier may be based on covariance features.
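As an illustration only (the patent names a histogram of oriented gradients as one possible basis but prescribes no implementation), such an upright human body classifier could be built on OpenCV's stock HOG pedestrian detector; the detector parameters below are assumptions:

```python
import cv2

# Sketch of an upright human body classifier using OpenCV's built-in
# HOG + linear SVM pedestrian detector (an assumption; the patent only
# names HOG as one possible basis for the classifier).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_upright_humans(frame_bgr):
    """Return bounding boxes (x, y, w, h) of upright humans in a color frame."""
    rects, _weights = hog.detectMultiScale(
        frame_bgr,
        winStride=(8, 8),  # step of the sliding detection window
        padding=(8, 8),
        scale=1.05,        # multi-scale pyramid factor
    )
    return [tuple(r) for r in rects]
```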
  • the first imaging device may be a color camera and the first sequence of images may be a sequence of color images.
  • the second imaging device may be a thermal camera and the second sequence of images may be a sequence of thermal images.
  • the second imaging device may be a color camera and the second sequence of images may be a sequence of color images.
  • Each blob may be tracked using a Kalman filter.
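A minimal sketch of such tracking, assuming a constant-velocity model over the blob centroid; the noise covariances are illustrative, not values from the patent:

```python
import numpy as np
import cv2

def make_blob_tracker():
    """Constant-velocity Kalman filter over a blob centroid (x, y, vx, vy)."""
    kf = cv2.KalmanFilter(4, 2)  # 4 state variables, 2 measured
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2      # assumed
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1  # assumed
    return kf

def track_step(kf, centroid):
    """Predict, then correct with the observed centroid; return the estimate."""
    kf.predict()
    est = kf.correct(np.array(centroid, np.float32).reshape(2, 1))
    return float(est[0, 0]), float(est[1, 0])
```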
  • the area may be an area above a conveyor belt.
  • the conveyor belt may be a baggage handling conveyor belt.
  • the action may comprise stopping the conveyor belt.
  • the action may comprise alerting operating personnel.
  • FIG. 1 is a block diagram showing the physical layer of an embodiment of an image analysis system;
  • FIG. 2 is an illustration of an embodiment of the analytics system using a single thermal camera;
  • FIG. 3 is a block diagram showing the steps of the analytics system using a single thermal camera;
  • FIG. 4 is a block diagram showing the steps of the heat signature analysis stage of the analytics system;
  • FIG. 5 illustrates an embodiment of the invention using one thermal camera and a color video camera;
  • FIG. 6 is a block diagram showing the steps of the analytics system when using both a thermal camera and a video camera;
  • FIG. 7 is an illustration of an embodiment of the analytics system using a thermal camera and a fish-eye video camera;
  • FIG. 8 is a block diagram showing the steps of the analytics system when using both a thermal camera and a fish-eye video camera;
  • FIG. 9 is an illustration of an embodiment of the analytics system using a video camera and a fish-eye video camera;
  • FIG. 10 is a block diagram showing the steps of the analytics system when several of the described systems are combined using a weighted majority system; and
  • FIG. 11 shows a block diagram of the steps of the analytics system using image fusion.
  • A system is provided for detecting a class of objects at a location, for example humans on a conveyor belt.
  • Any combination of the following systems can be used to increase the detection rate and further reduce the error rate by combining information from the systems, for example using majority voting.
  • FIG. 1 is a block diagram showing the physical layer of an embodiment of an image analysis system.
  • An imaging and analysis subsystem 100 comprises an imaging system 102, in this case consisting of one thermal camera 112, and a computer system 104 configured with software which is responsible for the detection of humans.
  • the result from the software setup is passed to actuator system 106.
  • the actuator system 106, which is responsive to the computer system 104, stops the conveyor belt if a human is detected.
  • the computer system may also trigger alarm system 108 which informs the operating personnel about the incident.
  • the imaging system, computer system and actuator may use any conventional communication channels for communications passed between them.
  • the computer system may provide control signals to the imaging system and to the actuator.
  • FIG. 2 is an illustration of an embodiment of the analytics system using a single thermal camera.
  • Thermal camera 112 views the area above conveyor belt 110, in this case from above, in order to detect humans on the conveyor belt.
  • Fig. 3 is a block diagram showing the steps of the analytics system using a single thermal camera.
  • the analytics system 104 receives data from the single thermal camera 112.
  • the software setup 104 conducts background compensation, in this case by subtracting stationary heat sources, such as light sources and conveyor belt rollers, and the semi-stationary heat source, which is the conveyor belt itself.
  • the software setup obtains object meshes from the thermal image to detect an object.
  • the heat signature is analyzed to classify, in step 128, whether it is a human signature. If a human is detected, the alarm is triggered in step 130.
  • the first stage of the image analysis system subtracts the stationary heat sources, which are the light sources and the conveyor belt rollers, and the semi-stationary heat source, which is the conveyor belt itself.
  • the second stage image analysis system detects the presence of a foreground hot object in the area of the conveyor belt itself.
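A hedged sketch of these two stages under the assumption of an 8-bit thermal stream: a running-average background model absorbs the stationary and semi-stationary heat sources, and the residual is thresholded to find foreground hot objects. The learning rate, threshold, and minimum blob area are placeholders:

```python
import cv2
import numpy as np

bg = None  # running-average model of stationary/semi-stationary heat sources

def detect_hot_foreground(thermal_frame, alpha=0.01, thresh=30, min_area=100):
    """Return contours of foreground hot objects in an 8-bit thermal frame."""
    global bg
    f = thermal_frame.astype(np.float32)
    if bg is None:
        bg = f.copy()
    cv2.accumulateWeighted(f, bg, alpha)        # slowly track lights, rollers, belt
    diff = cv2.absdiff(f, bg).astype(np.uint8)  # what is not background
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]
```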
  • the system may use the technique for object detection and tracking described in US Patent 7,616,782 B2.
  • the result of the algorithm is a mesh of anchor points which describes the detected object.
  • the third stage of the detection system rejects hot objects that are not human to prevent false alarms and passes the result to the actuator, which sounds an alarm and activates an emergency shutdown of the conveyor belt.
  • Fig. 4 is a block diagram showing the steps of the heat signature analysis stage of the analytics system.
  • In step 140, data is received on what objects have been detected in step 124.
  • In step 142, it is determined whether all detected objects have undergone heat signature analysis. If so, the heat signature analysis stage waits to receive more object detection data. If not, the heat signature analysis stage proceeds in step 144 to receive image data about a detected object.
  • In step 146, the heat signature analysis stage calculates a local histogram of thermal brightness from the image data about the object.
  • In step 148, the variance of the histogram is calculated.
  • In step 150, it is determined whether the variance is between preset thresholds. If the variance is between the preset thresholds, an alarm is triggered in step 130; otherwise the system proceeds to analyze the next object.
  • The system calculates the histogram of the heat intensity distribution of each detected object.
  • The variance is a measure of the spread of the values relative to the mean.
  • The system calculates the variance of the local histogram, which can be written as $\sigma^2 = \sum_i p(i)\,(i - \mu)^2$, where $p(i)$ is the normalized frequency of thermal intensity $i$ in the local histogram and $\mu = \sum_i i\, p(i)$ is its mean.
  • If the variance of the object is smaller than the variance of objects made of fabric, plastic or cloth, and greater than the variance of objects made of metal, then the object is classified as a human.
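A minimal sketch of this classification rule, assuming 8-bit thermal intensities; var_metal and var_fabric are placeholder thresholds, since the patent states only that the human range lies between the typical variances of metal and of fabric/plastic objects:

```python
import numpy as np

def classify_by_heat_variance(object_pixels, var_metal=50.0, var_fabric=400.0):
    """True if the heat-histogram variance falls in the 'human' range.

    object_pixels: 1-D array of the object's thermal intensities (0..255).
    var_metal / var_fabric: placeholder thresholds bracketing the human range.
    """
    hist, edges = np.histogram(object_pixels, bins=256, range=(0, 256))
    p = hist / hist.sum()                     # normalized local histogram p(i)
    centers = (edges[:-1] + edges[1:]) / 2
    mu = (p * centers).sum()                  # mean intensity
    var = (p * (centers - mu) ** 2).sum()     # variance of the histogram
    return var_metal < var < var_fabric
```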
  • The system is supplied with the latitude and longitude of the airport location, from which it can calculate sunrise and sunset times. Based on this information, the system increases the bias towards identifying hot objects as humans when the time of the detection falls between one hour after sunset and one hour after sunrise. Outside this time range, the bias is shifted towards rejecting hot objects as false positives. The system also rejects more hot-object occurrences during the months of the summer.
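A sketch of how such a bias schedule might look, assuming sunrise and sunset datetimes have already been computed from the airport's latitude and longitude (for example, with an ephemeris library); the window test follows the description, but the numeric bias factors are invented placeholders:

```python
from datetime import timedelta

def human_bias(now, sunrise, sunset, summer_months=(6, 7, 8)):
    """Multiplicative bias on the hot-object-as-human score (factors assumed).

    now, sunrise, sunset: datetimes for the current day at the airport.
    """
    night_window = (now >= sunset + timedelta(hours=1)
                    or now <= sunrise + timedelta(hours=1))
    bias = 1.2 if night_window else 0.8  # favor humans at night, reject by day
    if now.month in summer_months:
        bias *= 0.9                      # reject more hot objects in summer
    return bias
```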
  • the system can be manually configured to run only during night hours when traffic is slow and the incident is more likely to happen. This further reduces the chance for false alarms without sacrificing the sensitivity of the system and risking false negatives.
  • FIG. 5 illustrates an embodiment of the invention using one thermal camera and a color video camera.
  • both the video camera and thermal camera look at the conveyor belt from above.
  • Fig. 6 is a block diagram showing the steps of an embodiment of the analytics system using both a thermal camera and a video camera.
  • image data is received from the cameras. Background compensation may be performed on the thermal data as in Fig. 3 but this is not shown in Fig. 6.
  • objects are detected in the thermal data.
  • In step 162, the heat distributions of detected objects are analyzed. The same technique may be used as in step 126 of Fig. 3, shown in more detail in Fig. 4.
  • In step 164, the number of edge-like features in the thermal data is detected.
  • Geometric correction is performed on the objects detected in the thermal data to identify those objects in the color image data.
  • In step 168, the color distribution of objects identified in the color image data is analyzed.
  • The information from steps 162, 164 and 168 is combined to make a determination whether a detected object is human. If the object is determined to be human, the alarm is triggered in step 130.
  • the image stream from the infrared camera is used to detect moving hot objects.
  • the image analysis system detects the presence of a foreground hot object in the area of the conveyor belt itself.
  • a technique for object detection and tracking is described in US 7,616,782, B2.
  • the result of the algorithm is a mesh which describes the detected object.
  • the detected objects are meshes which consist of several anchor points.
  • the position of every anchor point is geometrically transformed to find the corresponding point in the color image.
  • This transformation can be represented as a linear transformation in 3D space.
  • A homography H is a matrix that maps points from the plane of one camera to the plane of another. The matrix is computed based on four reference points which have to be entered manually; d is the distance between the two cameras, and each anchor point in the thermal image is translated to the corresponding point with respect to the color camera's viewpoint.
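A sketch of this mapping with OpenCV, assuming the four manually entered reference point pairs are known; the coordinates shown are placeholders:

```python
import numpy as np
import cv2

# Four manually entered reference point pairs (thermal -> color), as the
# description requires; the coordinates are placeholders.
pts_thermal = np.float32([[50, 40], [260, 45], [255, 200], [55, 195]])
pts_color = np.float32([[120, 90], [520, 95], [515, 400], [125, 390]])
H = cv2.getPerspectiveTransform(pts_thermal, pts_color)

def map_anchor_points(mesh_points, H):
    """Map mesh anchor points from the thermal plane to the color plane."""
    pts = np.float32(mesh_points).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```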
  • An edge filter is applied to the object and the amount of edge-like features is counted.
  • The amount has to be smaller than a threshold t_edge, because baggage pieces are more likely to have edge-like features.
  • Typical edge filters or corner filters are, for example, Sobel operator, Laplace operator or SUSAN.
  • Color histogram analysis is also applied to the object: the colors are quantized, and the frequency of the quantized colors is measured and has to exceed a threshold t_color.
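A combined sketch of the edge-count and color-quantization tests just described; the choice of the Sobel operator, the quantization to four levels per channel, and the thresholds t_edge and t_color are all illustrative assumptions:

```python
import cv2
import numpy as np

def passes_human_appearance_tests(color_patch, t_edge=0.05, t_color=0.6):
    """Edge-count and color-quantization tests on an object's color patch.

    Rejects baggage-like objects: many edge-like features, or no dominant
    quantized color. Filter, quantization and thresholds are assumptions.
    """
    gray = cv2.cvtColor(color_patch, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    edge_density = np.mean(gx ** 2 + gy ** 2 > 100.0 ** 2)  # edge pixel fraction
    if edge_density >= t_edge:
        return False            # too many edge-like features: likely baggage

    quant = (color_patch // 64).astype(np.int32)   # 4 levels per channel
    codes = quant[..., 0] * 16 + quant[..., 1] * 4 + quant[..., 2]
    counts = np.bincount(codes.ravel(), minlength=64)
    return counts.max() / counts.sum() > t_color   # dominant color frequent enough
```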
  • the system can be manually configured to run only during night hours when traffic is slow and the incident is more likely to happen. This further reduces the chance for false alarms without sacrificing the sensitivity of the system and risking false negatives.
  • FIG. 7 is an illustration of an embodiment of the analytics system using a thermal camera 112 and a fish-eye video camera 116.
  • the thermal camera looks at the conveyor belt 110 from above with a field of view extending into a neighbouring area from which humans may interact with the conveyor belt.
  • Fish-eye video camera 116 has a field of view which also extends from the belt area into the neighbouring area.
  • Fig. 8 is a block diagram showing the steps of the analytics system in an embodiment using both a thermal camera and a fish-eye video camera.
  • the system receives data from the cameras.
  • the system detects upright humans in the data from the fish-eye video camera.
  • the system applies geometric correction to the detected upright humans to detect heat blobs corresponding to the upright humans in the data from the thermal camera.
  • the system tracks the heat blobs detected as corresponding to upright humans, for example using a Kalman filter.
  • the system detects if a blob detected as corresponding to an upright human coincides with the belt area. If so, in step 130 the system triggers the alarm. If not, the system continues to perform steps 180 to 186.
  • The first stage in the detection is to analyze the image of the color camera and detect silhouettes of human beings based on a multi-scale upright human body classifier.
  • The classification of an upright human can be based on, for example, a histogram of oriented gradients, optical flow patterns, or covariance features.
  • The output of the classifier is geometrically corrected to find the corresponding heat blobs in the view of the thermal camera.
  • The heat blob identified as human is marked and tracked by means of a Kalman filter in the view of the thermal camera. The system activates the alarm if the marked blob track starts to coincide with the belt area.
  • FIG. 9 is an illustration of an embodiment of the analytics system using a video camera and a fish-eye video camera.
  • The analysis for this setup may be the same as the analysis shown in Fig. 8 for the thermal camera and fish-eye camera setup shown in Fig. 7, with the video camera taking the place of the thermal camera.
  • the analysis system corrects for viewpoint and geometry and identifies the same human objects in the scene of the second color camera.
  • This transformation can be represented as a linear transformation in 3D space (J. Han and B. Bhanu).
  • A homography H is a matrix that maps points from the plane of one camera to the plane of another. It is precomputed based on four reference points; d is the distance between the two cameras.
  • the system activates the alarm if the marked blob track starts to coincide with the belt area.
  • the area of the conveyor belt is defined by a bounding box. As soon as the heat blob enters the bounding box the alarm will be triggered.
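A minimal sketch of this test, assuming the belt area is configured as an axis-aligned bounding box in the tracking camera's image coordinates:

```python
def blob_in_belt_area(blob_centroid, belt_box):
    """True once a tracked blob's centroid enters the belt bounding box.

    belt_box = (x_min, y_min, x_max, y_max), configured for the camera view.
    """
    x, y = blob_centroid
    x_min, y_min, x_max, y_max = belt_box
    return x_min <= x <= x_max and y_min <= y <= y_max
```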
  • FIG. 10 is a block diagram showing the steps of an embodiment of the system in which several of the described imaging and analytic systems are combined.
  • the outputs of the imaging and analysis subsystems are combined in step 190, for example using majority voting, to produce an overall decision. If the overall decision is that there is a human on the conveyor belt, actuator system 106 stops the conveyor belt, and alarm system 108 informs the operating personnel about the incident.
  • each subsystem may produce a likelihood of a human on the belt given the observed data, which may include factors such as time of day or outside air temperature, and the likelihoods produced by the subsystems may be combined to produce an overall likelihood (or combined along with a prior to produce an overall probability) which may be compared to a threshold to produce a binary human-on-belt / no-human-on-belt decision.
  • the combination of the likelihoods may assume independence or take into account the non-independence of the systems.
  • each subsystem produces a likelihood for each of a number of locations and the likelihoods produced at each location are combined to produce an overall decision as to whether there is a human at that location.
  • blobs detected by each subsystem are correlated and each subsystem produces a likelihood for each blob, and the likelihoods produced for each blob are combined to produce an overall decision as to whether the blob is human and on the belt.
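A hedged sketch of such a combination, assuming each subsystem reports a likelihood ratio P(observations given human) / P(observations given no human) and that the subsystems are conditionally independent (the patent notes non-independence could also be modeled); the weights and threshold are placeholders:

```python
import numpy as np

def fuse_decision(likelihood_ratios, weights=None, threshold=1.0):
    """Combine per-subsystem likelihood ratios into a binary decision.

    Weights let less reliable subsystems count for less, giving the
    weighted-majority flavor described for FIG. 10.
    """
    lr = np.asarray(likelihood_ratios, dtype=float)
    w = np.ones_like(lr) if weights is None else np.asarray(weights, dtype=float)
    combined = float(np.prod(lr ** w))  # weighted product of likelihood ratios
    return combined > threshold        # True => report a human on the belt

# Example: thermal subsystem fairly confident, color subsystem unsure.
print(fuse_decision([4.0, 1.2], weights=[1.0, 0.5]))  # True
```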
  • FIG. 11 shows a block diagram of the steps of the analytics system using image fusion. Images are received from multiple cameras or imaging systems 102.
  • In step 192, feature detection is performed on the received images.
  • In step 194, the information from the images from the different sources is combined.
  • In step 196, an analytics system processes the combined images to produce a determination as to whether there is a human on the conveyor belt. If there is, actuator system 106 stops the conveyor belt, and alarm system 108 informs the operating personnel about the incident.
  • the process of image fusion combines the information of multiple image sources before the image is analysed. This can result in better performance. Images can also be fused after the process of feature detection such as edge detection.
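A minimal sketch of pixel-level fusion, assuming the thermal image is 8-bit single-channel and that a homography H registering it to the color view has been precomputed (as in the geometric correction above); the blending weight is an assumption:

```python
import cv2

def fuse_images(color_frame, thermal_frame, H, w_color=0.5):
    """Warp the thermal image into the color view with homography H, then blend."""
    h, w = color_frame.shape[:2]
    thermal_in_color_view = cv2.warpPerspective(thermal_frame, H, (w, h))
    gray = cv2.cvtColor(color_frame, cv2.COLOR_BGR2GRAY)
    return cv2.addWeighted(gray, w_color, thermal_in_color_view, 1.0 - w_color, 0)
```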
  • the computer used for the analysis system may be any computing device now known or later developed that is configured to carry out the processes described here.
  • the computing devices may, for example, be personal computers programmed to carry out the described processes, or may be application-specific devices that are hard-wired to carry out the described processes.
  • Communications between the various apparatus may use any suitable communication links, such as wired or wireless links that supply a sufficient data rate.
  • The communication links and general-purpose computing devices required for implementing the method steps described here, after suitable programming, are already known and do not need to be described further.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A system for detecting a class of objects at a location, for example humans on a conveyor belt. A thermal camera may be used to detect objects, and the variance of the objects' thermal distribution may be used to classify them. Objects detected in an image from one camera may be identified in an image from another camera using geometric correction. A color camera may be used to detect the number of edges and the number of colors of an object in order to classify it. A color camera may be used with an upright human body classifier to detect humans in an area, and blobs corresponding to the detected humans may be tracked in a thermal or color image to detect whether a human enters an adjacent forbidden area, such as a conveyor belt.
EP13738858.3A 2012-01-16 2013-01-16 Apparatus for detecting humans on conveyor belts using one or more imaging devices Withdrawn EP2805188A4 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/351,103 US9208554B2 (en) 2012-01-16 2012-01-16 Apparatus for detecting humans on conveyor belts using one or more imaging devices
CA2764192A CA2764192C (fr) 2012-01-16 2012-01-16 Apparatus for detecting humans on conveyor belts using one or more imaging devices
PCT/CA2013/050025 WO2013106928A1 (fr) 2012-01-16 2013-01-16 Apparatus for detecting humans on conveyor belts using one or more imaging devices

Publications (2)

Publication Number Publication Date
EP2805188A1 true EP2805188A1 (fr) 2014-11-26
EP2805188A4 EP2805188A4 (fr) 2016-02-24

Family

ID=48798453

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13738858.3A Withdrawn EP2805188A4 (fr) 2012-01-16 2013-01-16 Appareil pour détecter des êtres humains sur des bandes transporteuses à l'aide d'un ou de plusieurs dispositifs d'imagerie

Country Status (2)

Country Link
EP (1) EP2805188A4 (fr)
WO (1) WO2013106928A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019060110A1 (fr) * 2017-08-29 2019-03-28 Thyssenkrupp Ag System for monitoring and controlling elevator traffic

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003088157A1 (fr) * 2002-04-08 2003-10-23 Newton Security Inc. Tailgating and reverse entry detection, alarm, recording and prevention using machine vision
NL1025759C2 (nl) * 2004-03-18 2005-09-20 Vanderlande Ind Nederland Check-in system
US7639840B2 (en) * 2004-07-28 2009-12-29 Sarnoff Corporation Method and apparatus for improved video surveillance through classification of detected objects
US7961906B2 (en) * 2007-01-03 2011-06-14 Science Applications International Corporation Human detection with imaging sensors

Also Published As

Publication number Publication date
WO2013106928A1 (fr) 2013-07-25
EP2805188A4 (fr) 2016-02-24

Similar Documents

Publication Publication Date Title
US9208554B2 (en) Apparatus for detecting humans on conveyor belts using one or more imaging devices
CN108256459B (zh) Security-gate face recognition and automatic face database construction algorithm based on multi-camera fusion
US9143843B2 (en) Automated monitoring and control of safety in a production area
US10347062B2 (en) Personal identification for multi-stage inspections of persons
DK2734447T3 (en) Transport system for pieces of luggage, check-in system including such a transport system and procedure for using such a transport system
Lim et al. iSurveillance: Intelligent framework for multiple events detection in surveillance videos
EP2553313B1 (fr) Automated monitoring and triggering of a safety control in a production area
US20230262312A1 (en) Movable body
EP2653772A1 (fr) Image recognition for personal protective equipment compliance in work areas
CN107662871B (zh) 用于乘客运输装置的移动扶手监测系统、乘客运输装置及其监测方法
WO2011101856A2 (fr) Procédé et système de détection et de suivi utilisant une imagerie multispectrale à plusieurs vues
US20140146998A1 (en) Systems and methods to classify moving airplanes in airports
JP6881898B2 (ja) ゲート装置
US20220189264A1 (en) Monitoring device, suspicious object detecting method, and recording medium
Katsamenis et al. Man overboard event detection from RGB and thermal imagery: Possibilities and limitations
Avgerinakis et al. Smoke detection using temporal HOGHOF descriptors and energy colour statistics from video
Santad et al. Application of YOLO deep learning model for real time abandoned baggage detection
CA2764192C (fr) Apparatus for detecting humans on conveyor belts using one or more imaging devices
EP2805188A1 (fr) Apparatus for detecting humans on conveyor belts using one or more imaging devices
US10656303B2 (en) System and method for screening objects
Nam Loitering detection using an associating pedestrian tracker in crowded scenes
Czyzewski et al. Moving object detection and tracking for the purpose of multimodal surveillance system in urban areas
Al Maashri et al. A novel drone-based system for accurate human temperature measurement and disease symptoms detection using thermography and AI
EP4379673A1 (fr) Systems and methods for passenger bag tampering and theft identification
Siddiqua et al. Real Time Face Mask Detection and Monitoring System (RFMDM)

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140812

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20160121

RIC1 Information provided on ipc code assigned before grant

Ipc: B07C 5/34 20060101ALI20160115BHEP

Ipc: B65G 43/06 20060101ALI20160115BHEP

Ipc: G01V 8/10 20060101ALI20160115BHEP

Ipc: G01V 99/00 20090101AFI20160115BHEP

Ipc: B64F 1/36 20060101ALI20160115BHEP

Ipc: G06K 9/62 20060101ALI20160115BHEP

Ipc: G08B 21/02 20060101ALI20160115BHEP

Ipc: G06K 9/00 20060101ALI20160115BHEP

Ipc: B65G 43/00 20060101ALI20160115BHEP

R17P Request for examination filed (corrected)

Effective date: 20140812

17Q First examination report despatched

Effective date: 20171208

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180419