EP2255330B1 - Method for tracking individuals in the field of view of a camera - Google Patents

Method for tracking individuals in the field of view of a camera

Info

Publication number
EP2255330B1
EP2255330B1 (application EP09710232.1A)
Authority
EP
European Patent Office
Prior art keywords
pixels
distance
image
individual
positions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP09710232.1A
Other languages
English (en)
French (fr)
Other versions
EP2255330A2 (de)
Inventor
Alexandre Zeller
Alexandre Revue
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CLIRIS
Original Assignee
CLIRIS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CLIRIS filed Critical CLIRIS
Publication of EP2255330A2 publication Critical patent/EP2255330A2/de
Application granted granted Critical
Publication of EP2255330B1 publication Critical patent/EP2255330B1/de
Not-in-force legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30236Traffic on road, railway or crossing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory

Definitions

  • The present invention relates to a method of tracking individuals moving in an observation area. It finds a particularly interesting but non-limiting application in the field of behavioral analysis by artificial vision. More specifically, the present invention can be applied to systems in which at least one camera is mounted at a height in a strategic location of a store, from which the movements of people to and from the department shelves can be viewed. The video stream of the camera is processed in particular to perform behavioral analyses.
  • The document Thao Zhao et al., "Segmentation and Tracking of Multiple Humans in Crowded Environments," IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1198-1211, is known; it describes a complex modeling of the problem that seeks to solve simultaneously the problems of background learning, detection/counting of people, and matching/tracking, through continuous estimation of a global likelihood function.
  • In the approach of that document, new targets are generated from three criteria: two based on head detection and one on the residue of moving zones.
  • An object of the present invention is to establish, for example, the flows between the different access points of a crossroads.
  • The present invention also aims to provide a new behavioral analysis tool based on the trajectories of people in a store.
  • the trajectory of an individual is formed by a set of positions determined through successive images and paired together according to the criterion of global minimization.
  • In that document, the detected blobs are directly matched to the previously tracked targets.
  • In the present invention, the detected blobs are used to generate position hypotheses in 3D space, and it is these hypotheses that are matched to the 3D targets tracked over time. Because of this difference of approach, the matching in the method according to document WO 2006/097680 is performed using 2D criteria in the image (proximity in the image, color distributions, contours, etc.). The matching in the method according to the present invention uses, in addition to these criteria, the three-dimensional coherence of the targets and of the hypotheses generated at detection.
  • The minimization of the overall distance consists in carrying out the following steps: for each determined individual position, the distance to the other individual positions is evaluated, thus defining a distance matrix; this matrix is converted into a similarity matrix, and the eigenvectors of this similarity matrix are then normalized. In other words, a method based on singular value decomposition is carried out. Normalizing the eigenvectors of the similarity matrix makes it possible to identify the best overall pairing of all the positions.
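  • By way of illustration, the following Python sketch shows one way such an eigenvector-normalization pairing could be implemented with NumPy; the Gaussian similarity kernel, the sigma parameter and the row/column dominance test are choices of the sketch, not taken from the patent:

```python
import numpy as np

def global_pairing(prev_positions, curr_positions, sigma=1.0):
    """Pair individual positions from two consecutive images by normalising
    the eigen-structure (SVD) of a similarity matrix.  The Gaussian kernel,
    sigma and the row/column dominance test are choices of this sketch."""
    P = np.asarray(prev_positions, dtype=float)   # positions at time t,   shape (m, d)
    C = np.asarray(curr_positions, dtype=float)   # positions at time t+1, shape (n, d)

    # Distance matrix between every old/new pair of positions.
    D = np.linalg.norm(P[:, None, :] - C[None, :, :], axis=2)

    # Convert distances into similarities (closer means more similar).
    G = np.exp(-(D ** 2) / (2.0 * sigma ** 2))

    # Set all singular values to 1: this normalisation of the eigen-structure
    # amplifies the globally consistent assignment.
    U, _, Vt = np.linalg.svd(G, full_matrices=False)
    Q = U @ Vt

    # Accept a pair (i, j) when Q[i, j] dominates both its row and its column.
    pairs = []
    for i in range(Q.shape[0]):
        j = int(np.argmax(Q[i]))
        if i == int(np.argmax(Q[:, j])):
            pairs.append((i, j))
    return pairs
```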
  • Each blob generates several detection hypotheses, including a certain number of false detections. These hypotheses are then filtered and optimized to best explain the size, shape and properties of the blob. This makes it possible, among other things, not to depend on the quality of past detections, and to separate a group of people without having to assume that these people were separately detectable in previous images.
  • In the prior art, the management of the distance to the multiple detected objects is carried out after detection.
  • In the present invention, this distance is estimated from the hypothesis generation stage, thus even before a target is validated.
  • The invention is notably remarkable in that a global pairing is performed over all the detected individuals, using rich notions of distance between individuals that can be based on geometric position, dimensions, appearance signature, temporal coherence, etc.
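  • As an illustration of such a composite distance, the sketch below combines 3D position, dimensions, an appearance signature and temporal coherence into a single weighted value; the dictionary fields and the weights are hypothetical, the patent only listing the notions that may be combined:

```python
import numpy as np

def target_distance(target, hypothesis, w_pos=1.0, w_size=0.5, w_sig=0.5, w_time=0.5):
    """Composite distance between a tracked target and a detection hypothesis.
    The fields ('xyz', 'height', 'signature', 't') and the weights are
    assumptions of this sketch."""
    d_pos = np.linalg.norm(np.subtract(target["xyz"], hypothesis["xyz"]))              # 3D geometric distance
    d_size = abs(target["height"] - hypothesis["height"])                              # difference of dimensions
    d_sig = np.linalg.norm(np.subtract(target["signature"], hypothesis["signature"]))  # appearance signature
    d_time = abs(target["t"] - hypothesis["t"])                                        # temporal coherence
    return w_pos * d_pos + w_size * d_size + w_sig * d_sig + w_time * d_time
```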
  • In the prior art, occlusion management is based on color signature and contour density criteria. On the contrary, according to the present invention, this management is performed globally on all targets, considering their collective coherence to explain the distribution of blobs and contours in the scene; color signatures are not used.
  • the step b1) of detecting an individual head is carried out by contour analysis.
  • the counting consists in simultaneously identifying the number of individuals present and their respective position.
  • segmentation of groups is performed.
  • The images come from a camera that sometimes has an oblique viewing angle with respect to the ground, causing many occlusion situations between individuals; it is then difficult to search for each individual separately.
  • The invention thus proposes a solution that seeks to detect and count a group (simultaneous identification of the number of people present and their respective positions), rather than detecting each individual separately.
  • The identification of the individuals is carried out from an appearance model with three ellipses modeling an individual.
  • the convergence of the iterative algorithm is distributed over successive images.
  • the convergence of the iterative algorithm to an optimal count of individuals can take a considerable time.
  • the characteristics of the video camera include the field of view, the positioning and the distortion.
  • the extraction of the fixed background of the image is obtained by averaging the values of the pixels of the image over time.
  • the value of each background pixel can be modeled by a distribution of probabilities.
  • This distribution can be represented by a Gaussian, or more simply by an average value and two min and max values.
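  • A minimal Python sketch of such a per-pixel background model is given below, here a running mean with min/max bounds (the Gaussian per pixel would be the alternative); the learning rate and threshold are illustrative assumptions and frames are assumed to be grayscale arrays:

```python
import numpy as np

class BackgroundModel:
    """Per-pixel background model kept as a running mean plus min/max bounds,
    i.e. the simple representation mentioned above.  The learning rate and
    threshold are illustrative values."""
    def __init__(self, first_frame, alpha=0.01):
        f = first_frame.astype(np.float32)
        self.mean = f.copy()
        self.min = f.copy()
        self.max = f.copy()
        self.alpha = alpha

    def update(self, frame):
        f = frame.astype(np.float32)
        self.mean = (1.0 - self.alpha) * self.mean + self.alpha * f   # temporal average
        self.min = np.minimum(self.min, f)
        self.max = np.maximum(self.max, f)

    def foreground_mask(self, frame, threshold=25.0):
        # A pixel is foreground when it deviates strongly from the modelled mean.
        return np.abs(frame.astype(np.float32) - self.mean) > threshold
```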
  • In figure 1, we see a shelf 1 of a store equipped with a surveillance camera 2 arranged at a height.
  • This surveillance camera is fixed to the ceiling and is arranged obliquely so as to film an object of interest 3 in a strategic location such as a crossroads of several aisles in the store.
  • An observation zone 4 is defined which is a surface around the object of interest 3 and included in the field of view of the camera 2.
  • The object of the present invention is to determine the trajectories of the individuals passing through this observation zone.
  • the surveillance camera 2 is connected to a remote server 5 intended to implement the method according to the invention.
  • Figure 4 illustrates the matching problem.
  • At b21-b25 we distinguish positions of individuals in an image at time t+1, and at a21-a24 positions of individuals in an image at time t.
  • the overall distance is minimized.
  • the pairing is optimal by minimizing distances globally: a22 is matched to b21, and a24 is matched to b23.
  • Flow analysis can be extended to an entire store viewed by multiple cameras. This can be done simply by concatenating the local data at each crossroads, or better by tracking individuals across the different camera fields.
  • the targets are often lost for several seconds.
  • the individuals are tracked locally on each camera, and each local trajectory is recorded in a database in the server 5 for example; this record contains the positions and dates of appearance and disappearance of individuals, and possibly a condensed representation of their signature.
  • the pairing between the local trajectories is then performed by moving a sliding time window over all the recordings, trying to match the disappearance of a local trajectory with the appearance of another.
  • This global matching can be done in several passes in order to obtain a better overall optimization, always using the same normalization of the correspondence matrix.
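  • The sliding-window matching of local trajectories described above could be sketched as follows; the record fields (id, appearance and disappearance timestamps) and the purely temporal matching criterion are assumptions of the sketch, the patent also allowing signature comparison:

```python
def stitch_trajectories(records, max_gap=5.0):
    """Pair local trajectories across cameras with a sliding time window:
    the disappearance of one trajectory is matched with the appearance of
    another within 'max_gap' seconds.  'records' is assumed to be a list of
    dicts with 'id', 'appear' and 'disappear' timestamps."""
    links = []
    for ended in sorted(records, key=lambda r: r["disappear"]):
        # Trajectories appearing shortly after this one disappeared.
        candidates = [s for s in records
                      if s is not ended
                      and 0.0 <= s["appear"] - ended["disappear"] <= max_gap]
        if candidates:
            # Deliberately simple criterion: closest appearance in time.
            best = min(candidates, key=lambda s: s["appear"] - ended["disappear"])
            links.append((ended["id"], best["id"]))
    return links
```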
  • The statistical analysis of the trajectories over the whole store can lead to the establishment of one or more standard trajectories (the trajectories most frequently followed by the customers of the store).
  • Flow data can be visually represented in different ways, depending on the information you want to extract.
  • They can be represented graphically in the form of a pie chart by indicating the trajectory rate towards this or that output. They can also be represented graphically in the form of a histogram illustrating the evolution of the distribution of flows over time.
  • the elements R1 to R7 represent departments of a store such as for example gardening, bookstore, multimedia, stationery ...
  • the flows are represented by arrows indicating their origin, destination and intensity by their width.
  • In figure 9, a flowchart of a method according to the invention implemented in the remote server 5 is shown. This flowchart has as inputs an image acquired by the surveillance camera 2 and the observation zone 4.
  • the background of the acquired image is extracted so as to obtain a fixed background model.
  • The background extraction algorithm may be of an iterative type, operating in particular on several images or on a video image stream.
  • The simplest approach to extracting the background is to average the pixel values over time. The contribution of a moving object to the average is all the lower as its movement is fast relative to the frame rate.
  • The background extraction according to the invention leads to a modeling of gray levels, of gradients and orientations of average gradients, and of their standard deviations.
  • an image acquired by the surveillance camera 2 is seen.
  • Step a1 has been carried out, so that the moving elements standing out from the fixed background are symbolized by blobs 12.
  • zone detection is performed by comparing the image and the background, and by extracting pixel areas, called “blobs", which do not belong to the background.
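  • One possible sketch of this blob detection step, comparing the current image with the background model and grouping foreground pixels into connected components; both inputs are assumed to be 8-bit grayscale images and the threshold and minimum area are illustrative values:

```python
import cv2
import numpy as np

def extract_blobs(frame_gray, background_gray, threshold=30, min_area=200):
    """Step a) sketch: pixels that differ from the background model are
    thresholded into a binary mask and grouped into connected components
    ("blobs")."""
    diff = cv2.absdiff(frame_gray, background_gray)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    blobs = []
    for i in range(1, n):                       # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            blobs.append(labels == i)           # boolean mask of one blob
    return blobs
```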
  • The output of step b1 is a set of hypotheses of head position or head/foot matching.
  • In step b2, a count of the individuals is made. More precisely, the number of individuals in each blob is identified and their position is estimated. In effect, false hypotheses are eliminated in order to identify the real positions.
  • We now consider in more detail a head detection algorithm of step b1.
  • the first step is to apply a Canny filter to extract the contours of the individual heads on the acquired image so as to obtain a contour image in the area of the blobs.
  • Then, a distance transform is performed: a distance map to the contours is computed. This allows a fast and stable calculation of the correlation of the head template with the contours. The output of this step is a distance map.
  • Convolution includes a calculation of the correlation of the head template with the contours.
  • A watershed algorithm is then used to locate and quantify the correlation maxima, and to determine the maxima of probability of presence of heads in the acquired image. An output giving the position of the heads is obtained.
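  • The chain Canny filter, distance transform, template convolution and maxima search could be sketched as follows; the local-maxima step below uses a simple maximum filter as a stand-in for the watershed algorithm, 'blob_mask' is assumed to be a boolean mask of the detected blobs, and all parameter values are illustrative:

```python
import cv2
import numpy as np
from scipy.ndimage import maximum_filter

def detect_heads(frame_gray, blob_mask, head_template, score_thresh=0.5):
    """Head detection sketch: Canny contours, distance transform, convolution
    with a head template, then local maxima (simplified stand-in for the
    watershed step)."""
    edges = cv2.Canny(frame_gray, 50, 150)
    edges[~blob_mask] = 0                                    # keep contours inside blobs only

    # Distance map: distance of every pixel to the nearest contour.
    dist = cv2.distanceTransform(255 - edges, cv2.DIST_L2, 5)

    # Correlating the head template with the distance map gives, for each
    # position, the mean distance between the template and the image contours.
    response = cv2.filter2D(dist, -1, head_template / head_template.sum())
    score = 1.0 / (1.0 + response)                           # low distance -> high score

    # Local maxima of the score are head-position hypotheses.
    local_max = (score == maximum_filter(score, size=15)) & (score > score_thresh)
    ys, xs = np.nonzero(local_max)
    return list(zip(xs, ys))
```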
  • We now consider in a little more detail a head/foot coherence search algorithm of step b1.
  • The blobs stand out from the background, and the calibration of the camera makes it possible to establish a correspondence between the pixels of the image and the points of space corresponding to them.
  • the pixels of a blob correspond to a visible point of a person to be detected, and potentially to his feet or his head.
  • The projection of these pixels onto the ground plane and onto planes placed at standard head height therefore delimits areas where people could be found.
  • The conjunction of these projections, represented in figure 17, makes it possible to reduce these areas to localized spaces, corresponding mainly to the actual candidate positions of people in the scene.
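  • A sketch of this head/foot coherence test is given below; it assumes that the camera calibration is summarized by two 3x3 homographies mapping image pixels onto a discretized plan view, one for the ground plane and one for a plane at standard head height:

```python
import cv2
import numpy as np

def presence_map(blob_mask, H_ground, H_head, grid_shape):
    """Head/foot coherence sketch: the pixels of a blob are projected onto the
    ground plane and onto a plane at standard head height, and the two
    projections are intersected.  'H_ground' and 'H_head' are assumed 3x3
    homographies; 'grid_shape' = (rows, cols) of the plan-view grid."""
    ys, xs = np.nonzero(blob_mask)
    pix = np.stack([xs, ys], axis=1).astype(np.float32).reshape(-1, 1, 2)

    feet = cv2.perspectiveTransform(pix, H_ground).reshape(-1, 2)
    heads = cv2.perspectiveTransform(pix, H_head).reshape(-1, 2)

    def occupancy(points):
        grid = np.zeros(grid_shape, dtype=bool)
        cols = np.clip(points[:, 0].astype(int), 0, grid_shape[1] - 1)
        rows = np.clip(points[:, 1].astype(int), 0, grid_shape[0] - 1)
        grid[rows, cols] = True
        return grid

    # Only the cells reachable both as "feet" and as "head" of the same blob
    # remain as candidate positions of a person.
    return occupancy(feet) & occupancy(heads)
```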
  • Detection according to the invention has the particularity of being carried out in the three-dimensional reference of the observed surface, the cameras being calibrated with precision in this environment; the calibration consists of accurately estimating the position and orientation of the camera, as well as its intrinsic geometric properties such as focal length, field, distortions ...
  • The position of the detected persons can thus be matched with the plane of the analyzed surface, and only the locations corresponding to their real position (the zone located at their feet) are counted, and not all the ground positions corresponding to the pixels of the blobs.
  • Detected people (whose outlines are highlighted) then generate a measurement only at the location corresponding to the position of their feet. This measurement can thus be viewed in an absolute reference of the plane.
  • This variant thus makes it possible to manage the perspective of the scene, as well as the defects of the camera such as the distortion, this parameter being taken into account in the calculation of the projection of parallelepipeds.
  • This variant also makes it possible to take masking by objects into account: if it is known that a zone corresponds to an object behind which people are likely to be partially masked, the estimate of the probability of presence is corrected as a function of the number of pixels of the masking object located in the search window.
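  • A minimal sketch of such a correction; the exact rule used here (scaling the score by the visible fraction of the search window) is an assumption of the sketch, the text above only stating that the number of occluding pixels in the window is taken into account:

```python
def corrected_presence(raw_score, window_mask, occluder_mask):
    """Occlusion correction sketch: when part of the search window is covered
    by a known masking object, the presence score is scaled by the visible
    fraction of the window (assumed correction rule)."""
    total = window_mask.sum()
    if total == 0:
        return raw_score
    occluded = (window_mask & occluder_mask).sum()
    visible_fraction = max(1e-3, (total - occluded) / float(total))
    return raw_score / visible_fraction
```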
  • The detections obtained in the previous step are sorted and considered in ascending order of their distance from the surveillance camera; detections are validated from the nearest to the farthest.
  • Each validated detection masks the pixels contained in its parallelepiped, in order to avoid multiple detections.
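  • The nearest-to-farthest validation with pixel masking could be sketched as follows; the hypothesis fields (distance, footprint) and the minimum-support criterion are assumptions of the sketch:

```python
import numpy as np

def validate_detections(hypotheses, blob_mask, min_support=100):
    """Final validation sketch: hypotheses are sorted by their distance to the
    camera and validated from the nearest to the farthest; each validated
    detection masks the pixels of its projected parallelepiped so that
    farther hypotheses cannot reuse them.  'hypotheses' is assumed to be a
    list of dicts with a 'distance' value and a boolean 'footprint' mask."""
    remaining = blob_mask.copy()
    validated = []
    for hyp in sorted(hypotheses, key=lambda h: h["distance"]):
        support = np.count_nonzero(remaining & hyp["footprint"])
        if support >= min_support:                  # enough unexplained foreground pixels left
            validated.append(hyp)
            remaining &= ~hyp["footprint"]          # mask pixels explained by this detection
    return validated
```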
  • the present invention also relates to a software application or computer program comprising instructions for performing the steps defined in a method according to the invention.
  • The invention also relates to a computer storage means such as a CD-ROM, a USB key, a flash memory, etc., storing an application program code which, when executed by a digital processor, provides functionalities as defined in any method according to the present invention, as defined by the claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • Accounting & Taxation (AREA)
  • Multimedia (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Claims (18)

  1. Method for tracking individuals moving in an observation area (4) traversed by individuals, in which method a plurality of images of the observation area are acquired by means of a video camera (2), wherein for each acquired image the following steps are carried out:
    - extracting a fixed background from the image (a1);
    - a step a) of detecting "blob" pixel areas which stand out from the fixed background of the image,
    - a step b) of detecting the individuals, wherein
    b1) for each blob, several hypotheses of possible positions within a three-dimensional model of the observation area are generated, using characteristics of the camera and a standard size of an individual;
    and for each possible position, a head of the individual is detected by applying a three-dimensional individual model to the corresponding blob,
    b2) counting the individuals for each blob,
    c) for each detected individual, determining the position occupied by that individual, and
    d) for all the determined positions, minimizing the overall distance between the positions thus determined in the current image and the positions determined in a previous image,
    in which method step b1) of detecting the head of the individual is carried out by determining coherences between the head and foot positions, wherein:
    - a correspondence between pixels of the image and points of the corresponding space is established,
    - the pixels of a blob are considered to correspond to a visible point of an individual and potentially to its feet or its head, and
    - these pixels are projected onto the ground planes and onto planes located at standard head height, the common projection area of feet and head being identified as a possible position.
  2. Method according to claim 1, wherein the minimization of the overall distance consists in carrying out the following steps: for each determined position of the detected individual, evaluating the distance to the other positions of individuals, thereby defining a distance matrix which is converted into a similarity matrix, then normalizing the eigenvectors of this similarity matrix.
  3. Method according to claim 1 or 2, wherein the distance between two positions is defined taking into account a geometric 3D distance.
  4. Method according to any one of the preceding claims, wherein the distance between two positions is defined taking into account a geometric 2D distance.
  5. Method according to any one of the preceding claims, wherein the distance between two positions is defined taking into account a colorimetric distance.
  6. Method according to any one of the preceding claims, wherein the distance between two positions is defined taking into account a temporal coherence.
  7. Method in which several cameras are used in different observation areas, in which method the steps according to any one of the preceding claims are carried out for each camera, and in which the following further steps are performed:
    - determining a local movement trajectory for each individual and for each observation area, this local movement trajectory corresponding to the successive positions occupied by that individual,
    - recording this local movement trajectory together with the times of appearance and disappearance,
    - matching different local movement trajectories across different observation fields by moving a sliding time window over all the recordings, attempting to match the disappearance of one local movement trajectory with the appearance of another.
  8. Method according to any one of the preceding claims, wherein step b1) of detecting the head of an individual is carried out by analyzing contours.
  9. Method according to claim 8, wherein the step of detecting heads comprises the following steps:
    - applying a Canny filter to the image with respect to the pixel areas, so as to produce a contour image of the pixel areas,
    - applying a distance transform, so as to obtain a distance map,
    - performing a convolution between the distance map and a head model (template), so as to obtain a convolution map, and
    - applying a watershed algorithm, so that the heads of the individuals are detected.
  10. Method according to any one of the preceding claims, wherein the counting consists in simultaneously identifying the number of individuals present and their respective position.
  11. Method according to claim 10, wherein the identification of the individuals is carried out on the basis of an appearance model with three ellipses which models an individual.
  12. Method according to claim 11, wherein the appearance model is optimized according to the following criteria:
    - minimization of the number of individuals,
    - maximization of the number of pixels which differ from the background and belong to silhouettes,
    - minimization of the number of pixels which differ from the background and do not belong to any ellipse,
    - minimization of the number of pixels which belong to the background and to the ellipses, and
    - minimization of the distance of the contours of a 2D/3D model.
  13. Method according to claim 12, wherein the optimization is carried out by means of an iterative gradient method.
  14. Method according to claim 13, wherein the convergence of the iterative algorithm is distributed over several successive images.
  15. Method according to any one of the preceding claims, wherein the characteristics of the video camera comprise the field of view, the positioning and the distortion.
  16. Method according to any one of the preceding claims, wherein the extraction of the fixed background from the image (a1) is obtained by averaging the values of the pixels of the image over time.
  17. Method according to claim 16, wherein the value of each background pixel is modeled by a probability distribution.
  18. Method according to claim 16 or 17, wherein, before the values of the pixels of the image are averaged, an instantaneous detection of motion is carried out by subtracting successive images, and a threshold is applied in order to obtain a mask corresponding to the pixels that are not averaged.
EP09710232.1A 2008-02-12 2009-02-12 Method for tracking individuals in the field of view of a camera Not-in-force EP2255330B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0850873A FR2927443B1 (fr) 2008-02-12 2008-02-12 Method for tracking individuals in the field of view of a camera
PCT/FR2009/050226 WO2009101366A2 (fr) 2008-02-12 2009-02-12 Method for tracking individuals in the field of view of a camera

Publications (2)

Publication Number Publication Date
EP2255330A2 EP2255330A2 (de) 2010-12-01
EP2255330B1 true EP2255330B1 (de) 2018-10-17

Family

ID=40001365

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09710232.1A Not-in-force EP2255330B1 (de) Method for tracking individuals in the field of view of a camera

Country Status (3)

Country Link
EP (1) EP2255330B1 (de)
FR (1) FR2927443B1 (de)
WO (1) WO2009101366A2 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3000266B1 (fr) 2012-12-26 2016-04-29 Thales Sa Anti-fraud method, and corresponding system
CN112800828B (zh) * 2020-12-18 2024-07-26 零八一电子集团有限公司 Ground grid occupancy probability target trajectory method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070003141A1 (en) * 2005-06-30 2007-01-04 Jens Rittscher System and method for automatic person counting and detection of specific events

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007018523A2 (en) * 2004-07-28 2007-02-15 Sarnoff Corporation Method and apparatus for stereo, multi-camera tracking and rf and video track fusion
CN101142593B (zh) * 2005-03-17 2010-12-15 英国电讯有限公司 Method of tracking objects in a video sequence
US9158975B2 (en) * 2005-05-31 2015-10-13 Avigilon Fortress Corporation Video analytics for retail business process monitoring

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070003141A1 (en) * 2005-06-30 2007-01-04 Jens Rittscher System and method for automatic person counting and detection of specific events

Also Published As

Publication number Publication date
EP2255330A2 (de) 2010-12-01
WO2009101366A3 (fr) 2009-10-08
FR2927443A1 (fr) 2009-08-14
WO2009101366A2 (fr) 2009-08-20
FR2927443B1 (fr) 2013-06-14

Similar Documents

Publication Publication Date Title
CN111415461B Article identification method and system, and electronic device
Sun et al. Benchmark data and method for real-time people counting in cluttered scenes using depth sensors
CN105745687B Context-aware moving target detection
US10115209B2 (en) Image target tracking method and system thereof
US9892316B2 (en) Method and apparatus for pattern tracking
US9405974B2 (en) System and method for using apparent size and orientation of an object to improve video-based tracking in regularized environments
US10552687B2 (en) Visual monitoring of queues using auxillary devices
Ryan et al. Scene invariant multi camera crowd counting
US20170161591A1 (en) System and method for deep-learning based object tracking
Benedek 3D people surveillance on range data sequences of a rotating Lidar
EP2257924B1 Method for generating a density image of an observation area
US20130243343A1 (en) Method and device for people group detection
Liciotti et al. People detection and tracking from an RGB-D camera in top-view configuration: review of challenges and applications
EP3707676A1 Method for estimating the installation of a camera in the reference frame of a three-dimensional scene, device, augmented reality system and associated computer program
EP3271869B1 Method for processing an asynchronous signal
CN112633255B Target detection method, apparatus and device
EP2255330B1 Method for tracking individuals in the field of view of a camera
Patel et al. Vehicle tracking and monitoring in surveillance video
EP2257925B1 Method for determining a local conversion factor of an object
Mohaghegh et al. A four-component people identification and counting system using deep neural network
Lee et al. Understanding human-place interaction from tracking and identification of many users
Naseer et al. Efficient Multi-Object Recognition Using GMM Segmentation Feature Fusion Approach
JP5988894B2 Subject matching device, subject matching method, and program
Boschini et al. Improving the reliability of 3D people tracking system by means of deep-learning
Tuncer et al. Monte Carlo based distance dependent Chinese restaurant process for segmentation of 3D LIDAR data using motion and spatial features

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100910

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA RS

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20151006

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20180509

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: FRENCH

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1054861

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181115

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602009055094

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20181017

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1054861

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181017

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190117

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190117

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190217

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190118

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190217

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602009055094

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

26N No opposition filed

Effective date: 20190718

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190212

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190228

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190212

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20210226

Year of fee payment: 13

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: BE

Payment date: 20210225

Year of fee payment: 13

Ref country code: GB

Payment date: 20210302

Year of fee payment: 13

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20090212

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20210421

Year of fee payment: 13

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181017

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602009055094

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220228

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20220212

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220212

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220901

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220228