EP1062637A1 - Method for determining movement of objects in a video image sequence - Google Patents

Method for determining movement of objects in a video image sequence

Info

Publication number
EP1062637A1
EP1062637A1 (application EP99909044A)
Authority
EP
European Patent Office
Prior art keywords
signals
values
images
difference
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP99909044A
Other languages
German (de)
French (fr)
Inventor
Michel Collobert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orange SA
Original Assignee
France Telecom SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by France Telecom SA filed Critical France Telecom SA
Publication of EP1062637A1
Current legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/254: Analysis of motion involving subtraction of images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion

Definitions

  • the present invention relates to a method for detecting the movement of moving objects in a sequence of video images.
  • a simpler and faster method consists in calculating the difference image in each pixel between two consecutive images of the video sequence. This difference is generally calculated using only the luminance information.
  • for each pixel of coordinates (x, y), the difference of the values taken by the luminance signal in the images of order i and i + 1 is determined: ∀i, ΔL(x, y) = L_{i+1}(x, y) - L_i(x, y), where L_i(x, y) and L_{i+1}(x, y) respectively denote the luminance signals of the pixel of coordinates (x, y) in the images of order i and order i + 1. Because of the noise induced by the camera's electronics and sensors, a threshold is generally applied, so that the difference signal is quantized as follows
  • the value of the quantized signal associated with a pixel of coordinates (x, y) is a first value (here, zero) if the luminance-difference signal is below a threshold value, and a second value (here, 1) if the difference signal is above that threshold value
  • the thresholds to be applied are of the order of 5% of the possible excursion of the luminance values (typically, if the luminance varies between 0 and 255, the threshold will be of the order of 12)
  • with this method, it is above all the contours of moving objects that are detected; to obtain all of the zones making up a moving object, the method requires further steps that are costly in computing time
  • a step consists in generating first and second difference images, each of which is then processed in a two-level threshold detector whose levels represent the mean positive and negative noise levels multiplied by a coefficient: if the value of a pixel of the difference images exceeds the positive threshold value, the value one is assigned to that pixel; if it is below the negative threshold value, the value minus one is assigned; when the pixel value lies between the two threshold values, the value zero is assigned
  • ∀x, ∀y: I(x, y) = -1 if ΔL(x, y) < -threshold; I(x, y) = 1 if ΔL(x, y) > threshold; I(x, y) = 0 if -threshold ≤ ΔL(x, y) ≤ threshold, where I(x, y) denotes the quantized difference signal for the pixel of coordinates (x, y)
  • the cited document provides for the use of low-pass filters on the image-difference values
  • the sensitivity of the detection is linked to the level of the threshold used to compute the difference image, and therefore to the noise level of the camera used
  • according to a first embodiment, a method according to the invention consists in computing, at each pixel determined by its coordinates (x, y), the difference of the values taken by the luminance signals L_{i+1}(x, y) and L_i(x, y) of two consecutive images i and i + 1; the resulting difference signal is therefore ∀i, ΔL(x, y) = L_{i+1}(x, y) - L_i(x, y)
  • any signals representative of at least one characteristic of the image could be used: for example, the chrominance signals could be used instead of the luminance signals, and a particular combination of the chrominance and luminance signals could also be used
  • a difference image is formed of the set of pixels taking respectively the difference values ΔL(x, y), quantized into n-ary signals with n odd: (n - 1)/2 quantization levels for the positive difference values ΔL(x, y), (n - 1)/2 quantization levels for the negative difference values ΔL(x, y), and one level for the strictly zero values
  • this filtering step will, for example, consist in cancelling the quantized value I(x, y) of a pixel of coordinates (x, y) if it lies between two horizontal pixels of coordinates (x - 1, y) and (x + 1, y), or two vertical pixels of coordinates (x, y - 1) and (x, y + 1), having quantized values different from its own. From this filtered difference image, a region-growing step is applied to segment the moving objects; such a step is described, for example, in the book "Vision par ordinateur" by R. Horaud and O. Monga, published by Hermès. A particular realization of this segmentation can be carried out, for example, by using 4-connectivity and aggregating the pixels whose value is 1 or -1. To suppress the small regions due to noise, one can either fix an a-priori threshold on the size of the objects sought or, if the object has contours, require that at least one of its constituent pixels present, on the difference image, a value greater than a fixed threshold above the noise
  • the method according to the invention provides a much more relevant segmentation than the conventional difference method of the state of the art.
  • instead of obtaining only the contours, all of the zones making up a moving object are easily brought out; as regards computation time, only a simple additional filtering step is needed, which costs little

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention concerns a method for detecting the movement of objects, which consists in computing, at each pixel, the differences of the image signals of two consecutive images, quantizing them into n-ary signals, and then carrying out a region-growing step so as to segment the moving objects.

Description

Method for detecting the movement of objects in a sequence of video images
The present invention relates to a method for detecting the movement of moving objects in a sequence of video images.
Motion-detection methods are already known; by way of example, one may cite those described in the article "Determining optical flow: a retrospective" by B. K. P. Horn and B. G. Schunck, published in Artificial Intelligence (1993), pp. 81-87, and in the book "Video Coding" by L. Torres and Murat Kunt, published by Kluwer Academic Publishers. The methods of the prior art are based, for example, on optical flow or on the matching of blocks, contours, etc.
These methods have the drawback of being very demanding in computation time and are therefore of little use for the real-time detection of moving objects.
By contrast, a simpler and faster method consists in computing, at each pixel, the difference image between two consecutive images of the video sequence. This difference is generally computed using only the luminance information.
Thus, for each pixel of coordinates (x, y), the difference of the values taken by the luminance signal in the images of order i and i + 1 is determined:
∀i, ΔL(x, y) = L_{i+1}(x, y) - L_i(x, y)
where L_i(x, y) and L_{i+1}(x, y) respectively denote the luminance signals of the pixel of coordinates (x, y) in the images of order i and order i + 1. Because of the noise induced by the camera's electronics and sensors, a threshold is generally applied, so that the difference signal is quantized as follows:
∀x, ∀y: I(x, y) = 0 if |ΔL(x, y)| < threshold; I(x, y) = 1 if |ΔL(x, y)| > threshold
where I(x, y) denotes the quantized difference signal for the pixel of coordinates (x, y).
The value of the quantized signal associated with a pixel of coordinates (x, y) is thus a first value (here, zero) if the luminance-difference signal is below a threshold value, and a second value (here, 1) if the difference signal is above that threshold value.
For a standard commercial camera, the thresholds to be applied are of the order of 5% of the possible excursion of the luminance values (typically, if the luminance varies between 0 and 255, the threshold will be of the order of 12).
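As a concrete illustration, this prior-art binary scheme can be sketched in a few lines of NumPy; the function name, array types and default threshold are illustrative choices, not part of the patent:

```python
import numpy as np

def binary_motion_mask(prev, curr, threshold=12):
    """Prior-art binary difference image: 1 where the luminance of two
    consecutive frames differs by more than the noise threshold, else 0.
    The default threshold of 12 is roughly 5% of an 8-bit luminance
    range, as the text above suggests."""
    # Widen to a signed type so the subtraction cannot wrap around.
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    return (np.abs(diff) > threshold).astype(np.uint8)
```

As the text goes on to note, such a mask mostly highlights the contours of moving objects rather than their full extent.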
With this method, it is above all the contours of moving objects that are detected. To obtain all of the zones making up a moving object, the method requires further steps that are costly in computing time.
Patent document WO-A-90 01706 is also known, which describes an image-processing method allowing the acquisition of objects against a cluttered background. In that method, a step consists in generating first and second difference images, each of which is then processed in a two-level threshold detector whose levels represent the mean positive and negative noise levels multiplied by a coefficient. If the value of a pixel of the difference images exceeds the positive threshold value, the value one is assigned to that pixel; if it is below the negative threshold value, the value minus one is assigned; when the pixel value lies between the two threshold values, the value zero is assigned.
Using the notation introduced above, this can be written:
∀x, ∀y: I(x, y) = -1 if ΔL(x, y) < -threshold; I(x, y) = 1 if ΔL(x, y) > threshold; I(x, y) = 0 if -threshold ≤ ΔL(x, y) ≤ threshold
where I(x, y) denotes the quantized difference signal for the pixel of coordinates (x, y).
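The three-level quantization of the cited document can be sketched as follows; this is an illustrative NumPy rendering, with a single symmetric `threshold` parameter standing in for the scaled mean noise levels:

```python
import numpy as np

def three_level_quantize(prev, curr, threshold):
    """Quantize the frame difference into {-1, 0, +1} using symmetric
    noise-derived thresholds, in the manner of WO-A-90 01706 (sketch)."""
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    out = np.zeros(diff.shape, dtype=np.int8)
    out[diff > threshold] = 1     # above the positive threshold
    out[diff < -threshold] = -1   # below the negative threshold
    return out                    # zero between the two thresholds
```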
With this method, since the threshold levels depend on the noise, it is very difficult, if not impossible, to detect movements whose amplitude lies well below the camera noise. To try to solve this problem, the cited document provides for the use of low-pass filters on the image-difference values.
Even so, the sensitivity of the detection remains tied to the level of the threshold used to compute the difference image, and therefore to the noise level of the camera used.
The object of the invention is therefore to propose a method for determining movement that does not have the drawbacks mentioned above. According to a first embodiment, a method according to the invention consists in computing, at each pixel determined by its coordinates (x, y), the difference of the values taken by the luminance signals L_{i+1}(x, y) and L_i(x, y) of two consecutive images i and i + 1. The resulting difference signal is therefore
∀i, ΔL(x, y) = L_{i+1}(x, y) - L_i(x, y)
where, as before, L_i(x, y) and L_{i+1}(x, y) respectively denote the luminance signals of the pixel of coordinates (x, y) in the images of order i and order i + 1.
Note that any signals representative of at least one characteristic of the image could be used. For example, the chrominance signals could be used instead of the luminance signals; a particular combination of the chrominance and luminance signals could also be used.
A difference image is formed of the set of pixels taking respectively the difference values ΔL(x, y), quantized into n-ary signals with n odd: (n - 1)/2 quantization levels for the positive difference values ΔL(x, y), (n - 1)/2 quantization levels for the negative difference values ΔL(x, y), and one level for the strictly zero values.
For example, for n = 3, the quantization is as follows:
∀x, ∀y: I(x, y) = 0 if ΔL(x, y) = 0; I(x, y) = 1 if ΔL(x, y) > 0; I(x, y) = -1 if ΔL(x, y) < 0
The set of signals corresponding to the values of I(x, y) for all the pixels of coordinates (x, y) of the image is referred to below as the difference image.
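For n = 3, this quantization is simply the sign of the raw difference, with no noise threshold at all, which is what distinguishes it from the prior-art schemes above. An illustrative NumPy sketch:

```python
import numpy as np

def ternary_difference_image(prev, curr):
    """The invention's n = 3 case: I(x, y) is the sign of
    ΔL(x, y) = L_{i+1}(x, y) - L_i(x, y), i.e. -1, 0 or +1."""
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    return np.sign(diff).astype(np.int8)
```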
When such a difference image is constructed and displayed, one sees, on the one hand, large zones whose values I(x, y) are essentially positive or essentially negative and, on the other hand, checkerboard zones. Each of the large zones corresponds to an area where the luminance gradient is roughly constant in direction. Statistically, despite the camera noise, such a zone will therefore exhibit a positive or negative difference value whenever the projection of the motion vector onto the luminance-gradient vector is non-zero.
As for the checkerboard zones in the rest of the image, they reveal the noise due to the sensor and the electronics.
To exploit this ternary difference image, the small zones corresponding to noise must therefore be suppressed and the large connected zones grouped together.
If the checkerboard zones corresponding to noise are examined closely, it can be seen that they are locally correlated with their neighbours, producing zones of "filamentous" shape. When region-growing algorithms are used, this can produce what is known in the art as untimely "percolation" across the whole image.
To avoid this percolation phenomenon, it is proposed to apply a filtering step to this difference image in order to suppress the "filamentous" zones.
This filtering step will, for example, consist in cancelling the quantized value I(x, y) of a pixel of coordinates (x, y) if it lies between two horizontal pixels of coordinates (x - 1, y) and (x + 1, y), or two vertical pixels of coordinates (x, y - 1) and (x, y + 1), having quantized values different from its own. From this filtered difference image, a region-growing step is applied to segment the moving objects. Such a step is described, for example, in the book "Vision par ordinateur" by R. Horaud and O. Monga, published by Hermès. A particular realization of this segmentation can be carried out, for example, by using 4-connectivity and aggregating the pixels whose value is 1 or -1. To suppress the small regions due to noise, one can either fix an a-priori threshold on the size of the objects sought or, if the object has contours, require that at least one of its constituent pixels present, on the difference image, a value greater than a fixed threshold above the noise.
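The filtering rule and the region-growing step just described can be sketched as follows. This is one illustrative reading of the text, not the patented implementation: the sketch aggregates all non-zero pixels together, whereas aggregating +1 and -1 pixels separately is an equally plausible reading, and the `min_size` parameter stands in for the a-priori size threshold on the objects sought.

```python
import numpy as np
from collections import deque

def filter_filaments(I):
    """Cancel I(x, y) when both horizontal neighbours, or both vertical
    neighbours, carry a quantized value different from the pixel's own
    (the anti-percolation rule above). Border pixels are left untouched."""
    out = I.copy()
    h, w = I.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = I[y, x]
            if (I[y, x - 1] != v and I[y, x + 1] != v) or \
               (I[y - 1, x] != v and I[y + 1, x] != v):
                out[y, x] = 0
    return out

def grow_regions(I, min_size=4):
    """4-connected aggregation of the non-zero pixels of a filtered ternary
    difference image; regions smaller than `min_size` are treated as noise
    and dropped. Returns an integer label map (0 = background)."""
    h, w = I.shape
    labels = np.zeros((h, w), dtype=np.int32)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if I[sy, sx] != 0 and labels[sy, sx] == 0:
                next_label += 1
                queue, region = deque([(sy, sx)]), [(sy, sx)]
                labels[sy, sx] = next_label
                while queue:  # breadth-first flood fill, 4-connectivity
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and I[ny, nx] != 0 and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
                            region.append((ny, nx))
                if len(region) < min_size:  # suppress small noise regions
                    for y, x in region:
                        labels[y, x] = 0
    return labels
```

A production version would use a connected-component routine such as `scipy.ndimage.label` rather than an explicit flood fill; the loop form is kept here to make the 4-connectivity and the size criterion explicit.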
The method according to the invention thus provides a much more relevant segmentation than the conventional difference method of the prior art: instead of obtaining only the contours, all of the zones making up a moving object are easily brought out. As regards computation time, only a simple additional filtering step is needed, which costs little.
The fields of application of the present invention are varied. Non-exhaustive examples include human-machine interfaces, image coding and compression, robotics, and television. The invention can also be used in artificial retinas, in "hard-wired" integration into cameras or monitors, in fluid flow (turbulence, meteorology), and in intrusion surveillance and smoke detection.

Claims

CLAIMS
1) Method for detecting the movement of objects in a sequence of video images, characterized in that it consists in computing, at each pixel, the differences of the values respectively taken by signals representative of a characteristic of said images for two consecutive images, so as to obtain a difference signal which is then quantized into n-ary signals with n odd, with (n - 1)/2 quantization levels for the positive difference values, (n - 1)/2 quantization levels for the negative difference values and one level for the strictly zero difference values, and then in carrying out a region-growing step so as to segment the movement of the objects.
2) Method according to claim 1, characterized in that the difference signal is quantized into ternary signals: one level for the positive values of the difference signal, one level for the negative values of the difference signal and one level for the strictly zero value of the difference signal.
3) Method according to claim 1 or 2, characterized in that, before the region-growing step, a filtering step is applied to the image formed of the set of pixels respectively taking said quantized difference values, in order to eliminate the filamentous zones appearing therein.
4) Method according to claim 3, characterized in that said filtering step consists in cancelling the quantized value of a pixel if said pixel lies between two horizontal or two vertical pixels having quantized values different from its own.
5) Method according to one of the preceding claims, characterized in that said signals representative of a characteristic of said images are the luminance signals.
6) Method according to one of claims 1 to 4, characterized in that said signals representative of a characteristic of said images are chrominance signals.
7) Method according to one of claims 1 to 4, characterized in that said signals representative of a characteristic of said images are signals resulting from the combination of the luminance signals and the chrominance signals.
EP99909044A 1998-03-19 1999-03-19 Method for determining movement of objects in a video image sequence Withdrawn EP1062637A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
FR9803641A FR2776459B1 (en) 1998-03-19 1998-03-19 METHOD FOR DETECTING MOVING OBJECTS IN A SEQUENCE OF VIDEO IMAGES
FR9803641 1998-03-19
PCT/FR1999/000634 WO1999048048A1 (en) 1998-03-19 1999-03-19 Method for determining movement of objects in a video image sequence

Publications (1)

Publication Number Publication Date
EP1062637A1 true EP1062637A1 (en) 2000-12-27

Family

ID=9524451

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99909044A Withdrawn EP1062637A1 (en) 1998-03-19 1999-03-19 Method for determining movement of objects in a video image sequence

Country Status (4)

Country Link
US (1) US6754372B1 (en)
EP (1) EP1062637A1 (en)
FR (1) FR2776459B1 (en)
WO (1) WO1999048048A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6940998B2 (en) * 2000-02-04 2005-09-06 Cernium, Inc. System for automated screening of security cameras
WO2006115676A2 (en) * 2005-03-30 2006-11-02 Cernium Corporation Video ghost detection by outline
US7822224B2 (en) 2005-06-22 2010-10-26 Cernium Corporation Terrain map summary elements
US20090062002A1 (en) * 2007-08-30 2009-03-05 Bay Tek Games, Inc. Apparatus And Method of Detecting And Tracking Objects In Amusement Games
WO2010124062A1 (en) 2009-04-22 2010-10-28 Cernium Corporation System and method for motion detection in a surveillance video
DE102016207705A1 (en) * 2016-05-04 2017-11-09 Robert Bosch Gmbh Smoke detection device, method for detecting smoke of a fire and computer program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4937878A (en) * 1988-08-08 1990-06-26 Hughes Aircraft Company Signal processing for autonomous acquisition of objects in cluttered background
US5731840A (en) * 1995-03-10 1998-03-24 Kabushiki Kaisha Toshiba Video coding/decoding apparatus which transmits different accuracy prediction levels
US5764803A (en) * 1996-04-03 1998-06-09 Lucent Technologies Inc. Motion-adaptive modelling of scene content for very low bit rate model-assisted coding of video sequences
US6188776B1 (en) * 1996-05-21 2001-02-13 Interval Research Corporation Principle component analysis of images for the automatic location of control points
KR100259136B1 (en) * 1997-04-19 2000-06-15 김영환 Motion vector detection device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9948048A1 *

Also Published As

Publication number Publication date
WO1999048048A1 (en) 1999-09-23
FR2776459B1 (en) 2000-04-28
US6754372B1 (en) 2004-06-22
FR2776459A1 (en) 1999-09-24

Similar Documents

Publication Publication Date Title
EP1694058A1 (en) Image capture method and device comprising local motion estimation
FR2591836A1 (en) INTERIOR IMAGE MOTION DETECTOR FOR VIDEO SIGNALS
FR2988191B1 (en) FILTERING METHOD AND FILTER DEVICE FOR SENSOR DATA
EP3314888B1 (en) Correction of bad pixels in an infrared image-capturing apparatus
CN110889403A (en) Text detection method and related device
WO2022022975A1 (en) Method and device for underwater imaging
Takamatsu et al. Estimating demosaicing algorithms using image noise variance
EP1062637A1 (en) Method for determining movement of objects in a video image sequence
EP3216213B1 (en) Method for detecting defective pixels
EP3314887B1 (en) Detection of bad pixels in an infrared image-capturing apparatus
EP2943935B1 (en) Estimation of the movement of an image
FR3089662A1 (en) Method for recognizing objects such as traffic signs by means of an on-board camera in a motor vehicle
Seo et al. On quantization of convolutional neural networks for image restoration
EP3072110B1 (en) Method for estimating the movement of an object
Jöchl et al. Content Bias in Deep Learning Age Approximation: A new Approach Towards more Explainability
FR2861524A1 (en) Method for detecting orientation of image taken by digital camera, involves detecting lines in image, calculating attributes that characterize lines, for each detected line, and detecting orientation of image according to attributes
GB2609661A (en) A signal cleaner
EP4066204A1 (en) Method and device for processing images
JP2002237997A (en) Defective pixel correction device for solid-state image pickup element, and solid-state image pickup device provided with the element
JP2015056040A (en) Image processor
JP2022017900A (en) Information processing device
KR101706218B1 (en) Apparatus and method for removing blur and ghost
WO1991018358A1 (en) Device for detecting objects in a sequence of images
FR2814894A1 (en) Industrial site/public place/road traffic intrusion detection system having video scene block transformed/spatial activity determined and previous current image compared with evolving luminosity weighting applied.
Chetty et al. Blind image tamper detection based on multimodal fusion

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20000728

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE DK ES GB IT NL SE

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

17Q First examination report despatched

Effective date: 20010921

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20020129