EP1646967B1 - Method for measuring the proximity of two contours and system for automatic target identification - Google Patents

Method for measuring the proximity of two contours and system for automatic target identification Download PDF

Info

Publication number
EP1646967B1
EP1646967B1 (application EP04766207A)
Authority
EP
European Patent Office
Prior art keywords
contour
point
proximity
image
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP04766207A
Other languages
German (de)
French (fr)
Other versions
EP1646967A1 (en)
Inventor
Olivier Ruch
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thales SA
Original Assignee
Thales SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thales SA
Publication of EP1646967A1
Application granted
Publication of EP1646967B1
Anticipated expiration
Status: Expired - Lifetime

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/255Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/752Contour matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/754Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries involving a deformation of the sample pattern or of the reference pattern; Elastic matching

Definitions

  • the present invention relates to the automatic identification of the targets present in an image. More specifically, this invention describes a discriminant method for comparing 2D contours. It applies mainly in the military field, to assist the pilot of a plane in a combat situation in his choice of shots. It is also of interest in any other field concerned with pattern recognition, in particular the field of surveillance and the medical field.
  • An automatic identification process must reliably determine how many targets are in the image, at what positions and what types they are.
  • a target is a 3D object that one seeks to identify.
  • these targets are typically tanks, land vehicles, and the like.
  • an identification system means a system by which a target in an image is identified by its type: make, name or number, or by its class: car, tank, bus, etc.
  • the automatic identification of objects or targets is a complex algorithmic problem, partly because of the potential resemblances between two different targets from certain angles of view, and secondly because of the great variability of appearance of a target, due to geometric deformations, the position of certain elements, or the presence of certain equipment.
  • a vehicle may have open or closed doors, luggage on the roof ...
  • the automatic identification process must thus have two essential qualities: to be robust, i.e. insensitive to variations in the appearance of a target that result in local disturbances on the object in the image; and to be discriminating, i.e. able to discern between two targets close in appearance.
  • one is more particularly interested in an automatic identification system of targets based on the comparison of contours.
  • the contours present in the image to be analyzed are initially extracted, and then, in a second step, these contours are compared to those of a target reference database, containing data representing the 3D objects that one seeks to identify.
  • the extraction of the contours present in the image is done using a so-called segmentation technique.
  • the result is an image of extracted contours: a binary image retaining only contour pixels, usually represented by white dots on a black background. In this image, only the contours carry information.
  • a contour point means a point carrying information, i.e. a point belonging to a contour in the model or in the image. Pixels that are not contour points carry no information.
  • the extracted contour image is then compared to the contours obtained from a database representing the 3D objects that we are trying to identify. These outlines are called contour-models and are obtained, for each of the 3D objects, by projection according to a set of points of view making it possible to represent all the appearances of the object. Each 3D object in the database thus corresponds to a collection of model contours of this object.
  • one is more particularly interested in a so-called correlative comparison method, which consists in comparing each model contour with the image of extracted contours for all the possible positions of this model contour in the image. For a given position, this comparison is performed by superimposing the model contour on the image, and consists in measuring "the difference" between the points of the model contour and those of the extracted contour image. Since each of the model contours is identified with respect to an origin, it is possible to recalculate the coordinates of each of its points in the coordinate system of the contour image, according to the pixel of the image on which this origin is centered. Thus, each of the model contours is scanned over the entire extracted contour image.
  • the process consists in selecting the most likely hypothesis or hypotheses.
  • a method of estimating the difference between the model contour points and the extracted contour points is to count the number of points that these contours have in common.
  • Another, more complex evaluation method uses a so-called Hausdorff measurement. This method consists in identifying, for each of the model contour points, the lowest distance from this point to the points of the image contour, and in deducing therefrom a degree of dissimilarity between the model contour and the image contour, on the basis of the average of the evaluated distances.
  • An object of the invention is an automatic identification process that does not have these various disadvantages.
  • An automatic identification process comprises a method for measuring the proximity of a model contour to an image contour based on a single matching step of each point of a model contour to zero or a single image contour point.
  • This point-to-point matching method comprises a step of associating with each point of the image contour the closest model contour point. At this stage, two pieces of information are attached to each point of the image contour: the coordinates of the model contour point determined as the closest, and the distance between the two points thus associated.
  • each model contour point is matched either to zero image contour point or to a single image contour point corresponding to a smallest distance.
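  • The two steps just described (association of each image point to its nearest model point, then one-to-zero-or-one pairing per model point) can be sketched as follows. This is an illustrative sketch only: the data layout (lists of (x, y) tuples) and the function name are assumptions, not the patent's implementation.

```python
import math

def match_points(image_pts, model_pts):
    """Pair each model contour point with zero or one image contour point."""
    # Step a): associate each image contour point with its nearest model
    # contour point, memorizing the model point index and the distance.
    assoc = []
    for i, p in enumerate(image_pts):
        j, d = min(
            ((j, math.dist(p, m)) for j, m in enumerate(model_pts)),
            key=lambda t: t[1],
        )
        assoc.append((i, j, d))
    # Step b): for each model point, keep only the nearest of the image
    # points associated with it; model points with no associate stay unpaired.
    pairing = {}  # model point index -> (image point index, distance)
    for i, j, d in assoc:
        if j not in pairing or d < pairing[j][1]:
            pairing[j] = (i, d)
    return pairing
```

A model point absent from `pairing` is matched to zero image contour points, which is how unambiguous one-to-at-most-one matching arises from the many-to-one association of step a).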
  • the overall score resulting from this method is much more discriminating than the proximity measurement used in the automatic identification methods of the state of the art, especially with respect to false assumptions.
  • An automatic identification system uses this method for each position of the model contour in the image, and for each model of a model collection.
  • the set of global scores obtained, corresponding to the different model contours and their different positions in the image, makes it possible to elaborate a certain number of hypotheses by retaining the best global proximity ratings.
  • the point-to-point matching process makes it possible to improve the discrimination of the automatic identification system with respect to false hypotheses corresponding to cases where the contours in the image comprise interior contour points, i.e. corresponding to the internal contours of a target, and exterior contour points, i.e. corresponding to the environment of the target (vegetation, buildings, etc.).
  • the method of proximity measurement applies a local weighting at each point of a model contour.
  • This weighting is representative of a quantity of information contained at this point and defined with respect to the other model contour.
  • This weighting makes it possible to discriminate the silhouettes of the two targets on the basis of their local differences. More particularly, this weighting consists in applying the method of measuring proximity between the two model contours to be discriminated, in order to obtain, for each model contour, a weighting factor at each point, which makes it possible to give more weight to the model contour points which contain the difference information with respect to the other model contour.
  • the automatic identification system applies to each of the model contours of a collection the process of measuring the proximity of this model contour to the image contour to be analyzed, in order to evaluate the likelihood of this model, and, between model contours taken two by two in a selection of overlapping hypotheses, to discriminate between two close model contours by locally weighting this probability relative to each of the two models.
  • the invention relates to a method of measuring the proximity of a second contour to a first contour, comprising, for each point of the first contour, a step of association with a point of the second contour determined as the nearest, characterized in that it comprises a step of pairing each point of the second contour with one or zero points of the first contour, by determining the closest point of the first contour among the set of points of the first contour associated with said point of the second contour.
  • the invention also relates to a method for automatically identifying targets, which uses such a method for measuring the proximity of a model contour to an image contour.
  • this identification method uses this method of measuring proximity of a model contour to another model contour, to allow discrimination between two overlapping hypotheses.
  • Figure 1 shows an image of outlines extracted from a data image, which may be from an infrared camera, a video system, or any other image source.
  • FIG. 1 illustrates an image of contours extracted from an image obtained in any way: infrared image, active image, etc. It contains image contour points, which correspond to the black pixels in this image, such as the points referenced I a , I b , I c , I d in FIG. 1. These contour points may be contour points of the target to be identified, such as the point I a ; points exterior to the contour of the target to be identified, such as the points I b and I c ; or points of a contour interior to the target to be identified, such as the point I d .
  • the proximity measuring method comprises a step of unambiguous pairing of each of the model contour points to zero or a single image contour point, and a step of assigning a local proximity note to each point of the model contour, representing the proximity of this model contour point to the image contour.
  • This method entails that, in step a), two pieces of information are memorized for each image contour point: the coordinates of the associated model contour point and the corresponding distance between the two associated points, so as to perform the pairing of step b) on the basis of these two pieces of information.
  • the distance considered is the Euclidean distance, of which either a true measurement or a discrete measurement is used, depending on the calculation methods employed.
  • the use of a chamfer method, which in a known manner accelerates the computation, relies on a discrete measurement of the Euclidean distance.
  • Steps a) and b) are illustrated in FIGS. 2 and 3.
  • Step a) is illustrated in FIG. 2.
  • the proximity of the image contour points CI to the model contour CM is evaluated, to associate with each image contour point the closest model contour point.
  • the evaluation of the nearest model contour point consists in finding the lowest distance d between this image contour point and a model contour point.
  • this evaluation leads to associating the point M 1 of the model contour CM with the point I 1 of the image contour CI.
  • a same model contour point may be associated with different image contour points.
  • the point M 1 of the model contour CM has been associated with the image contour points I 1 , I 2 , I 3 .
  • Step b) is illustrated in FIG. 3. It consists, for each model contour point, in selecting from among the image contour points associated with it in the first step a) the image contour point nearest to this model contour point.
  • the dotted lines represent the mapping of image contour points to model contour points according to the first step a). For each model contour point, there are thus zero, one or n associated image contour points according to this step a). For example, for the model contour point M 15 , there are 3 associated image contour points: I 24 , I 28 , and I 29 .
  • Step b) consists in keeping only the nearest image point, when it exists, among the image contour points associated with the same model contour point, and in evaluating the local proximity note of this model contour point to the image contour on the basis of the matching (model contour point, image contour point) thus performed.
  • the pairing of points M i (model) with points I k (image) is then the following: (M 10 , no image point); (M 11 , I 20 ); (M 12 , I 21 ); (M 13 , I 22 ); (M 15 , I 24 ).
  • the point-to-point matching step provides, for each point M i of the model contour matched to a single image contour point I k , a measurement of the proximity of this point M i to the image contour.
  • This measure of proximity of the point M i can be written as:
  • Dist (M i ) = d (M i , I k ), where d (M i , I k ) is a true or approximate measure of the Euclidean distance between the paired points. It is expressed in number of pixels.
  • the last step of the process is then to determine the overall score for the model, by averaging the local scores of all the points of the model outline.
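  • As a sketch of this scoring step: the patent does not fix the exact form of the local note, so the linear fall-off below is an assumption; it only needs to decrease as the paired distance Dist(M i) grows.

```python
def local_note(dist, d_max=5.0):
    # Hypothetical local proximity note: 1.0 for a perfect match, falling
    # linearly to 0.0 at d_max pixels (d_max is an illustrative parameter).
    return max(0.0, 1.0 - dist / d_max)

def overall_score(paired_distances, n_model_points):
    # Average the local notes over ALL model contour points; model points
    # paired with zero image points contribute a note of 0.
    notes = [local_note(d) for d in paired_distances]
    return sum(notes) / n_model_points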
  • the model contour is evaluated as being all the closer to the image contour as the overall score assigned to it is higher.
  • in FIGS. 4a and 4b, a first model MOD 1 (FIG. 4a) and a second model MOD 2 (FIG. 4b) are superimposed on an image I comprising a target C.
  • the first model MOD 1 corresponds in the example to the target to be detected, on which it is perfectly positioned. It leads to a retained hypothesis.
  • the second model corresponds to another type of target. But with a method according to the state of the art, this hypothesis will also be retained, because of the presence of contour points not belonging to the contour of the target but actually belonging to the background, or to internal contours.
  • the evaluation of the local proximity note of each model contour point is a function of the distance d between this point and the paired image contour point according to the invention.
  • the evaluation of the proximity of two points comprises taking into account the orientation class at the points I 20 and M 30 of the pair P considered.
  • This class is typically defined by the orientation of the tangent of the contour at the point considered: in FIG. 5a, the tangent t I is represented at the image contour point I 20 of the image contour CI and the tangent t M at the contour point model M 30 of the contour model CM.
  • proximity measurement is a continuous function of position and orientation. This limits the weight of the orientation, which can be erroneously estimated.
  • the orientation class is taken into account in the association step of the point-to-point matching process, by allowing the association (and therefore the pairing) only between points of the same class.
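  • A minimal sketch of such an orientation class follows, assuming tangents quantized into four unoriented bins over [0, π); the number of classes is an assumption, as the patent does not fix it.

```python
import math

def orientation_class(dx, dy, n_classes=4):
    # Quantize the tangent direction (dx, dy) of a contour point into one
    # of n_classes bins over [0, pi); tangents are unoriented, so (dx, dy)
    # and (-dx, -dy) fall in the same class.
    angle = math.atan2(dy, dx) % math.pi
    return int(angle / (math.pi / n_classes)) % n_classes
```

Association (and hence pairing) would then be allowed only between an image contour point and model contour points sharing the same class value.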
  • the proximity measure Dist (M i ) is equal to the distance between the paired points M i and I k .
  • the attribution of the local proximity note N (M i ) of a model contour point M i as a function of the proximity measurement according to the invention must contribute to the robustness of the identification method.
  • a practical implementation of a proximity measurement method according to the invention can use so-called chamfer calculation methods. These chamfer methods are very efficient in terms of computation time and are widely used in many areas of image processing including that of shape recognition.
  • a classical chamfer method associates with a contour a map with two inputs, x and y, corresponding to the coordinates of a given point, and one output, which is the lowest distance from this point (x, y) to the contour. In other words, the map gives, for every point (x, y), its lowest distance to the contour.
  • This known method of chamfer is generally used to apply the Hausdorff measurement method. In this case, the chamfer method is applied to the image contour, making it possible to determine for each point (x, y) of model contour, the smallest distance to the image contour.
  • the chamfer method must be applied differently.
  • the aim is to measure the smallest distance from an image contour point to the model contour. This involves applying the chamfer method no longer to the image contour, but to each of the model contours.
  • the calculation of the chamfer map of a model contour is independent of the extracted contour image to be analyzed. These calculations can be done once and for all and stored, to be used when the time comes, in real time, for the analysis of a given contour image.
  • the chamfer map of the model contour must output a first piece of information, which is the distance between the two associated points, and a second piece of information, which is the identification of the model contour point associated with this distance.
  • This second piece of information is necessary because it enables the pairing stage to determine all the image contour points associated with the same model contour point, and to deduce the proximity measure automatically from the first associated piece of information.
  • a rapid calculation method comprises the calculation of a chamfer map for each model contour, said map giving, as a function of the two inputs x and y corresponding to the coordinates of an image contour point, a piece of information S 0 (x, y) identifying the model contour point reached by the lowest distance measurement, and a piece of information S 1 (x, y) corresponding to the value of this measurement.
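  • A brute-force sketch of such a two-output chamfer map is shown below; a real implementation would use the classical two-pass chamfer propagation for speed. The names S0/S1 follow the text above; identifying the nearest model point by its index is an illustrative choice.

```python
import math

def chamfer_maps(model_pts, width, height):
    # For every pixel (x, y), S0[y][x] identifies the nearest model contour
    # point (here, by its index) and S1[y][x] holds the distance to it.
    S0 = [[0] * width for _ in range(height)]
    S1 = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            j, d = min(
                ((j, math.dist((x, y), m)) for j, m in enumerate(model_pts)),
                key=lambda t: t[1],
            )
            S0[y][x], S1[y][x] = j, d
    return S0, S1
```

Since these maps depend only on the model contour, they can be precomputed offline for every model contour (and, in the orientation-class variant, one map per class).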
  • This method does not make it possible to correct the distance measure Dist (M i ) of a model contour point M i as a function of the orientation class of this point and of the paired image contour point I k .
  • the step of associating a model contour point with each image contour point then comprises, for each image contour point, the preliminary determination of the orientation class of this point, and the selection of the chamfer map of the model contour in the corresponding orientation class.
  • the method according to the invention is applied to all the model contours, each being scanned over the entire image.
  • a model contour is to be understood here as the model contour in a given position.
  • For each, a probability measure of similarity of this model contour to the image contour is obtained.
  • A selection of hypotheses is then established.
  • By hypothesis is meant a model contour (that is to say a target, seen from a certain point of view) in a determined position in the image.
  • the selection is typically obtained by retaining the most probable assumptions corresponding to an overall score obtained above a decision threshold.
  • This threshold is preferably set at 0.6.
  • Another aspect of the invention makes it possible to improve this last point.
  • the problem more particularly considered here is due to the fact that, from certain angles of view, for certain orientations, two targets may have relatively similar shapes, i.e. similar in the sense of the overall score assigned according to the invention.
  • Some parts of a model outline are therefore more informative than others with respect to another model outline.
  • FIG. 7d thus shows an image contour CI corresponding to an image of extracted contours.
  • Two models CM 1 and CM 2 , shown respectively in FIG. 7a and FIG. 7b, are both found close, in the sense of the invention, to this image contour.
  • the basic idea of the improvement according to the invention consists in considering the two hypotheses which are superimposed, and in weighting the local note of each point of a model contour, established in the measurement of proximity of this model contour to the image contour, by a quantity of information representing the local difference, at this point, with the other model contour.
  • the overall score associated with the model contour CM 1 , which measures the probability of similarity of this model contour CM 1 to the image contour CI, is obtained by weighting each of the local notes. More precisely, the local proximity note N (M1 i ) of each point M1 i of the model contour CM 1 is weighted by a factor representative, at this point, of the quantity of discriminant information that it contains relative to the other model contour CM 2 . This quantity of information contained in a point M1 i of the model contour CM 1 must be all the higher as this point is far from the other model contour CM 2 : this is the very definition of the proximity measurement Dist (M1 i ) at this point according to the method of the invention.
  • M2 j is a point of the model contour CM 2 paired with the point M1 i according to the proximity measuring method of the invention.
  • the greater the distance to the matched point the greater the amount of information at this point.
  • This is schematically represented in Figure 7c.
  • the amount of information X (M1 i ) is large, corresponding to the distance a in FIG. 7c.
  • the weighting method is applied to the points of the second contour CM 2 , inverting the role of the first and second contours, that is to say using the method of measuring the proximity of the second contour CM 2 to the first contour CM 1 : we obtain the amount of information X (M2 j ) of each point M2 j of the second contour CM 2 relative to the first contour CM 1 .
  • the local proximity note N (M2 j ) of each point M2 j is weighted by the associated quantity of information X (M2 j ).
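  • The weighting can be sketched as follows, under the assumption that the quantity of information X is used directly as a multiplicative weight and that the weighted average is normalized by the total weight; the patent does not fix the exact normalization.

```python
def weighted_score(local_notes, info_weights):
    # local_notes[i]  = N(M1_i): proximity note of point M1_i to the image.
    # info_weights[i] = X(M1_i): distance of M1_i to the OTHER model contour;
    # points far from the rival model carry more discriminant information.
    total = sum(info_weights)
    if total == 0.0:
        return 0.0  # the two models coincide everywhere: nothing to discriminate
    return sum(n * w for n, w in zip(local_notes, info_weights)) / total
```

Running this once per model of the superimposed pair, swapping the roles of CM 1 and CM 2, yields the two weighted overall scores to compare.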
  • the discrimination process is applied two by two.
  • the invention describes a method for measuring the proximity of a second contour to a first contour, according to which each point M i of the second contour is matched with one or zero points of the first contour, giving a distance measurement Dist (M i ) at this point.
  • An automatic target identification method applies this proximity measurement process to determine the measure of proximity of each point of a model contour, applied as the second contour, to an image contour, applied as the first contour. It deduces, for each point of the model contour, a local proximity note, and for the model contour, a global note, giving a measure of likelihood of similarity to the image contour.
  • the automatic identification method thus determines the overall score associated with each of the model contours of a collection (with as many different model contours as different 3D models and points of view considered for each 3D model).
  • the invention applies a criterion of selection of hypotheses, retaining as probable hypothesis, each of the model contours whose overall score is greater than the threshold.
  • the model contours of the collection may correspond to a selection of hypotheses resulting from another process, for example from a Hausdorff measurement.
  • the automatic identification method then applies the weighting method to each pair of hypotheses which are superimposed among the assumptions retained, in order to obtain for the model contour associated with each hypothesis a weighted overall score.
  • It applies the proximity measurement method a first time, to measure the amount of information associated with each point of the contour of the first hypothesis, applied as the second contour, relative to the contour of the second hypothesis, applied as the first contour, and calculates the associated overall score by averaging the weighted local scores.
  • It applies the proximity measurement method a second time, to measure the amount of information associated with each point of the contour of the second hypothesis, applied as the second contour, relative to the contour of the first hypothesis, applied as the first contour, and calculates the associated overall score by averaging the weighted local scores.
  • the identification system then selects the best hypothesis. If more than two hypotheses are superimposed, the automatic identification system applies this weighting two by two, retaining the best hypothesis each time.
  • applying an automatic identification system according to the invention to a first selection of hypotheses obtained from another automatic identification process, such as a process using the Hausdorff measurement, does not change identification performance, but advantageously saves computing time.
  • the invention which has just been described makes it possible to appreciably improve the robustness and the discrimination of an automatic identification system which implements it. It applies to the military field, but more generally, to any domain using pattern recognition as compared to a series of models.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Flow Control (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Method for measuring the proximity of a second contour (CM) relative to a first contour, in which, for each point (I k) of the first contour, an association step with a point (M i) of the second contour, determined as the closest to it, is carried out. As a result, each point of the second contour is paired with a single point, or no point, of the first contour. The invention also relates to a corresponding identification system.

Description

La présente invention concerne l'identification automatique des cibles présentes dans une image. Plus précisément cette invention décrit une méthode discriminante permettant de comparer des contours 2D. Elle s'applique principalement dans le domaine militaire, afin d'assister le pilote d'un avion en situation de combat dans ses choix de tirs. Elle présente également un intérêt dans tout autre domaine concerné par la reconnaissance de formes, notamment, le domaine de la surveillance et le domaine médical.The present invention relates to the automatic identification of the targets present in an image. More specifically, this invention describes a discriminant method for comparing 2D contours. It applies mainly in the military field, to assist the pilot of a plane in a combat situation in his choice of shots. It is also of interest in any other field concerned with pattern recognition, in particular the field of surveillance and the medical field.

Un processus d'identification automatique doit permettre de déterminer de façon fiable combien de cibles se trouvent dans l'image, à quelles positions et de quels types elles sont.An automatic identification process must reliably determine how many targets are in the image, at what positions and what types they are.

On entend par cible, un objet 3D que l'on cherche à identifier. Dans le domaine militaire, ces cibles sont typiquement des chars, des véhicules terrestres .... Dans la suite, on parle indifféremment de cible ou d'objet.A target is a 3D object that one seeks to identify. In the military field, these targets are typically tanks, land vehicles .... In the following, we speak indifferently target or object.

In the present application, an identification system is understood to mean a system by which a target in an image is identified by its type (make, name or number) or by its class (car, tank, bus, etc.).

The automatic identification of objects or targets is a complex algorithmic problem, on the one hand because of the potential resemblance between two different targets under certain viewing angles, and on the other hand because of the great variability in a target's appearance, due to geometric deformations, the position of certain elements, or the presence of certain equipment. For example, a vehicle may have its doors open or closed, or luggage on the roof.

The aim is to identify targets in an image automatically and as reliably as possible. The automatic identification process must therefore have two essential qualities: it must be robust, that is, insensitive to the variations in a target's appearance that translate into local perturbations of the object in the image; and it must be discriminating, that is, able to distinguish between two targets that are close in appearance.

The invention is more particularly concerned with an automatic target identification system based on contour comparison. In such a system, the contours present in the image to be analysed are first extracted; these contours are then compared with those of a target reference database containing data representing the 3D objects to be identified.

The contours present in the image are extracted using a so-called segmentation technique. The result is a so-called extracted-contour image: a binary image showing only contour pixels, generally represented as white dots on a black background. In this image, only the contour pixels carry information. In the following, unless explicitly stated otherwise, a point is understood to mean a point carrying information, that is, a point belonging to a contour in the model or in the image. Pixels that are not contour points carry no information.

The extracted-contour image is then compared with the contours obtained from a database representing the 3D objects to be identified. These contours, called model contours, are obtained, for each 3D object, by projection from a set of viewpoints chosen to represent all the appearances of the object. Each 3D object in the database thus corresponds to a collection of model contours of that object.

The invention is more particularly concerned with a so-called correlative comparison method, which consists in comparing each model contour with the extracted-contour image for all possible positions of that model contour in the image. For a given position, this comparison is performed by superimposing the model contour on the image, and consists in measuring the "gap" between the points of the model contour and those of the extracted-contour image. Since each model contour is referenced with respect to an origin, the coordinates of each of its points can be recomputed in the coordinate system of the contour image, according to the image pixel on which this origin is centred. Each model contour is thus scanned over the whole extracted-contour image.

Once the extracted-contour image has been scanned by the full set of model contours, the process consists in selecting the most likely hypothesis or hypotheses.

A hypothesis is understood to mean a target, a position of that target in the image, and a viewpoint from which that target is observed.

One method of evaluating the gap between the model contour points and the extracted contour points consists in counting the number of points the two contours have in common.
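This counting method can be sketched as follows; a minimal illustration assuming contours are represented as sets of (x, y) pixel coordinates (the function name and data are illustrative, not from the patent):

```python
# Illustrative sketch: counting the points shared by a model contour and
# the extracted-contour image, both given as sets of (x, y) coordinates.
def common_point_score(model_contour, image_contour):
    """Return the number of model points that coincide with image contour points."""
    return len(set(model_contour) & set(image_contour))

model = {(0, 0), (0, 1), (0, 2), (1, 2)}
image = {(0, 0), (0, 1), (5, 5)}
print(common_point_score(model, image))  # 2 points in common
```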

This simple evaluation method based on the number of points in common with a model contour is, however, neither robust nor discriminating: not robust because it is very sensitive to variations in the target's appearance, and not discriminating because it gives all contour points the same weight.

Another, more elaborate evaluation method uses a so-called Hausdorff measurement process. This process consists in finding, for each model contour point, the smallest distance from that point to the points of the image contour, and in deducing a degree of dissimilarity between the model contour and the image contour based on the average of the distances thus evaluated.
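This Hausdorff-style measure can be sketched as follows; a minimal illustration assuming list-of-coordinates contours and exact Euclidean distances (the function name is an assumption, not from the patent):

```python
import math

# Illustrative sketch of the Hausdorff-style dissimilarity: for each model
# point, take the smallest Euclidean distance to any image point, then
# average these minima over the model contour.
def hausdorff_dissimilarity(model_contour, image_contour):
    total = 0.0
    for mx, my in model_contour:
        total += min(math.hypot(mx - ix, my - iy) for ix, iy in image_contour)
    return total / len(model_contour)

model = [(0, 0), (0, 3)]
image = [(0, 0), (0, 4)]
print(hausdorff_dissimilarity(model, image))  # (0 + 1) / 2 = 0.5
```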

However, even though this process performs better than the previous one, it is neither robust nor discriminating enough, because it can take into account irrelevant distances that should be discarded. Indeed, the same model contour point can be seen as the closest point to several different image contour points. This is notably the case when the image contains spurious points that do not correspond to a contour of a target to be identified, for example points corresponding to internal contours of the target, or points corresponding to the target's environment (vegetation, buildings, etc.). These spurious points disturb the measurement, and taking all these distances into account can thus lead to a false hypothesis.

An object of the invention is an automatic identification process that does not have these various drawbacks.

An automatic identification process according to the invention comprises a method for measuring the proximity of a model contour to an image contour, based on a step of unambiguously pairing each point of the model contour with either zero or a single image contour point.

This point-to-point pairing method comprises a step of associating with each image contour point the closest model contour point. At this step, two pieces of information are assigned to each image contour point: the coordinates of the model contour point determined as the closest, and the distance between the two points thus associated.

Then, conversely, for each model contour point, the set of image contour points associated with it in the preceding step is considered, and within that set the closest image contour point is determined by taking the smallest distance. A univocal point-to-point pairing is obtained: at the output, each model contour point is paired either with zero image contour points or with a single image contour point, corresponding to the smallest distance.
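The two steps above can be sketched as follows; a minimal illustration under the assumption that contours are lists of (x, y) pixel coordinates and that exact Euclidean distances are used (the patent also allows a discrete, chamfer-style distance):

```python
import math

# Illustrative sketch of the univocal two-step pairing described above.
def pair_points(model_contour, image_contour):
    # Step a): associate each image point with its nearest model point,
    # remembering both the model point and the distance.
    associations = {}  # model point -> list of (distance, image point)
    for ip in image_contour:
        d, mp = min(
            (math.hypot(ip[0] - mp[0], ip[1] - mp[1]), mp) for mp in model_contour
        )
        associations.setdefault(mp, []).append((d, ip))
    # Step b): for each model point, keep only the closest image point among
    # those associated with it; unassociated model points are paired with None.
    pairing = {}
    for mp in model_contour:
        candidates = associations.get(mp)
        pairing[mp] = min(candidates) if candidates else None
    return pairing

pairing = pair_points([(0, 0), (10, 0)], [(1, 0), (2, 0), (9, 0)])
print(pairing[(0, 0)])  # (1.0, (1, 0)): closest of the associated image points
```

Note that, as in the invention, image points (2, 0) here loses its association: it was associated with model point (0, 0) in step a), but (1, 0) is closer, so (2, 0) plays no further part in the measure.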

By assigning a local proximity score to each model contour point, equal to zero if it is paired with zero image contour points and, if it is paired with an image contour point, equal to a value that decreases as the distance between the two paired points increases, a global score can be computed, equal to the average of the local scores, which expresses the probability of similarity of the model contour to the image contour.
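The local and global scores can be sketched as follows; the decreasing function of distance used here (1 / (1 + d)) is an illustrative assumption, not the function specified by the patent:

```python
# Illustrative sketch of the local and global proximity scores.
def local_score(distance):
    """Score of one model point: 0 if unpaired (None), decreasing in distance otherwise."""
    return 1.0 / (1.0 + distance) if distance is not None else 0.0

def global_score(paired_distances):
    """paired_distances: one entry per model contour point, None if unpaired."""
    return sum(local_score(d) for d in paired_distances) / len(paired_distances)

print(global_score([0.0, 1.0, None, 3.0]))  # (1 + 0.5 + 0 + 0.25) / 4 = 0.4375
```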

The global score resulting from this method is much more discriminating than the proximity measure used in prior-art automatic identification processes, notably with respect to false hypotheses.

An automatic identification system according to the invention uses this method for each position of the model contour in the image, and for each model of a collection of models.

The set of global scores obtained, corresponding to the different model contours and to their different positions in the image, makes it possible to build a number of hypotheses by retaining the best global proximity scores.

The point-to-point pairing process according to the invention improves the discrimination of the automatic identification system with respect to false hypotheses corresponding to cases where the contours in the image include interior contour points, that is, points corresponding to internal contours of a target, and exterior contour points, that is, points corresponding to the target's environment (vegetation, buildings, etc.).

According to another aspect of the invention, to improve discrimination between overlapping target hypotheses (that is, hypotheses at identical or nearby positions in the image, which is usually defined by contour points shared between the two model-contour hypotheses), the proximity measurement method applies a local weighting at each point of a model contour. This weighting is representative of the quantity of information contained at that point, defined with respect to the other model contour, and makes it possible to discriminate the silhouettes of the two targets on the basis of their local differences. More particularly, this weighting consists in applying the proximity measurement method between the two model contours to be discriminated, so as to obtain, for each model contour, a weighting factor at each point that gives more weight to the model contour points carrying the information that differs from the other model contour. When the collection of hypotheses contains more than two superimposable hypotheses, this weighting process is applied pairwise, and the best global score obtained each time is retained.
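One way such a weighting could look is sketched below; a hedged illustration only, since the patent does not specify the weight function here. The assumption is that a point of contour A that lies far from contour B carries more discriminating information and so receives a larger weight (d / (1 + d) is an arbitrary illustrative choice):

```python
import math

# Illustrative sketch of a local weighting between two overlapping model
# contours: points where the two silhouettes differ get more weight.
def local_weights(contour_a, contour_b):
    weights = []
    for ax, ay in contour_a:
        d = min(math.hypot(ax - bx, ay - by) for bx, by in contour_b)
        weights.append(d / (1.0 + d))  # 0 where the contours coincide, tends to 1 far away
    return weights

a = [(0, 0), (0, 1), (0, 5)]
b = [(0, 0), (0, 1)]
print(local_weights(a, b))  # only the point (0, 5), absent from b, gets a nonzero weight
```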

The automatic identification system according to the invention applies the proximity measurement process, on the one hand, between each model contour of a collection and the image contour to be analysed, to evaluate the likelihood of that model, and on the other hand, between the model contours taken pairwise within a selection of overlapping hypotheses, to discriminate between two close model contours by locally weighting this probability relative to each of the two models.

Thus, as characterised, the invention relates to a method for measuring the proximity of a second contour to a first contour, comprising, for each point of the first contour, a step of association with the point of the second contour determined as the closest, characterised in that it comprises a step of pairing each point of the second contour with one or zero points of the first contour, by determining the closest point of the first contour among the set of points of the first contour associated with said point of the second contour.

The invention also relates to a method for automatically identifying targets, which uses such a method for measuring the proximity of a model contour to an image contour.

In one refinement, this identification method uses the proximity measurement method between one model contour and another model contour, to allow discrimination between two overlapping hypotheses.

Other advantages and characteristics of the invention will appear more clearly on reading the following description, given by way of indication and without limitation of the invention, with reference to the appended drawings, in which:

  • figure 1 represents a contour image extracted from an input image applied to an automatic contour identification system;
  • figure 2 illustrates the step of associating a point of the image with a model contour point, according to a method for measuring the proximity of a model contour to the image contour to be analysed according to the invention;
  • figure 3 illustrates the point-to-point pairing step according to a method for measuring the proximity of a model contour to the image contour to be analysed according to the invention;
  • figures 4a and 4b illustrate a problem of detection of false hypotheses;
  • figures 5a and 5b illustrate the orientation classes associated with the image and model contour points;
  • figure 6 represents the curve associated with an example of a function assigning a local proximity score according to the invention;
  • figures 7a to 7d illustrate the principle of weighting the local proximity score according to the invention.

Figure 1 represents an extracted-contour image obtained from an input image, which may come from an infrared camera, a video system, or any other image source.

The aim is to determine, in this extracted-contour image, how many targets it contains, at what positions and of what types, from among a set of identified targets available as 3D objects in a database. To this end, a set of 2D model contours is built, corresponding to projections of each of the 3D objects from different viewing angles, taking into account information on the conditions under which the targets are observed, such as the distance between the target and the sensor, the viewing angle, and so on.

Consider a model contour, denoted CM, positioned in an arbitrary way in the extracted-contour image. In the following, the image contour CI denotes the set of contour points of the extracted-contour image. A proximity measurement method according to the invention is applied to measure the proximity of this contour CM to the image contour to be analysed.

Figure 1 illustrates an image of contours extracted from an image obtained in any way: infrared image, active image, etc. It contains image contour points, corresponding to the black pixels in this image, such as the points referenced Ia, Ib, Ic and Id in figure 1. These contour points may be contour points of a target to be identified, such as point Ia; points external to the contour of the target to be identified, such as points Ib and Ic; or points of a contour internal to the target to be identified, such as point Id.

The proximity measurement method according to the invention comprises a step of unambiguously pairing each model contour point with zero or a single image contour point, and a step of assigning to each model contour point a local proximity score representing the proximity of that model contour point to the image contour.

More precisely, the step of pairing each of the model contour points comprises the following steps a) and b):

  • a) a step of associating with each image contour point a model contour point, on the criterion of the smallest distance;
  • b) for each model contour point, the determination of the set of image contour points with which it was associated in step a), and the determination of the closest image point within that set, on the criterion of the smallest distance.

This method implies that, at step a), two pieces of information are stored for each image contour point: the coordinates of the associated model contour point and the corresponding distance between the two associated points; step b) then performs the pairing on the basis of these two pieces of information.

The distance considered is the Euclidean distance, measured either exactly or discretely depending on the computation methods used. In particular, the use of a chamfer method, known to accelerate the computation time, relies on a discrete measurement of the Euclidean distance.
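A discrete, chamfer-style approximation of the Euclidean distance can be sketched as follows; this is a standard 3-4 chamfer distance transform offered as an illustration, not the specific implementation of the patent (distances come out in units of roughly 1/3 pixel):

```python
# Illustrative sketch of a 3-4 chamfer distance transform: two raster passes
# propagate approximate Euclidean distances to the nearest contour point.
def chamfer_distance_transform(width, height, contour_points):
    INF = 10**9
    d = [[0 if (x, y) in contour_points else INF for x in range(width)]
         for y in range(height)]
    # Forward pass (top-left to bottom-right): cost 3 for axial moves, 4 for diagonals.
    for y in range(height):
        for x in range(width):
            for dx, dy, cost in ((-1, 0, 3), (0, -1, 3), (-1, -1, 4), (1, -1, 4)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    d[y][x] = min(d[y][x], d[ny][nx] + cost)
    # Backward pass (bottom-right to top-left).
    for y in reversed(range(height)):
        for x in reversed(range(width)):
            for dx, dy, cost in ((1, 0, 3), (0, 1, 3), (1, 1, 4), (-1, 1, 4)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    d[y][x] = min(d[y][x], d[ny][nx] + cost)
    return d

dt = chamfer_distance_transform(5, 1, {(0, 0)})
print(dt[0])  # [0, 3, 6, 9, 12]: chamfer distances to the single contour point
```

Once such a distance map has been precomputed from the image contour, the nearest-point distance for any model point can be read off in constant time, which is what makes the chamfer method fast.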

Steps a) and b) are illustrated in figures 2 and 3.

Step a) is illustrated in figure 2. The proximity of the image contour points CI to the model contour CM is evaluated, so as to associate with each image contour point the closest model contour point. Thus, as shown in figure 2, taking a point of the image contour CI, evaluating the closest model contour point consists in finding the smallest distance d between this image contour point and a model contour point. In the example shown in figure 2, this evaluation leads to associating the point M1 of the model contour CM with the point I1 of the image contour CI. In this example, the following associations are also obtained: (I1, M1), (I2, M1), (I3, M1), (I4, M2), (I5, M2), (I6, M3).

During this step, the same model contour point may be associated with several different image contour points. In the example, point M1 of the model contour CM has been associated with the image contour points I1, I2 and I3.

Step b) is illustrated in figure 3. For each model contour point, it consists in selecting, among the image contour points associated with it in the first step a), the image contour point closest to that model contour point. In figure 3, the dotted lines represent the mapping of image contour points to model contour points according to the first step a). Each model contour point thus has 0, one or n associated image contour points after this step a). For example, the model contour point M15 has three associated image contour points: I24, I28 and I29.

Step b) consists in keeping only the closest image point, when it exists, among the image contour points associated with a given model contour point, and in evaluating the local proximity score of this model contour point to the image contour on the basis of the (model contour point, image contour point) pairing thus obtained.

In the example of Figure 3, the point-to-point pairing of a point Mi (model) with a point Ik (image) according to the invention is the following: (M10, no image point); (M11, I20); (M12, I12); (M13, I22); (M15, I24).

With a point-to-point pairing according to the invention, the image contour points I25 to I29 will therefore not be taken into account in the evaluation of the proximity of the model.

The point-to-point pairing step according to the invention provides, for each model contour point Mi paired with a single image contour point Ik, a measure of the proximity of this point Mi to the image contour. This proximity measure of the point Mi can be written:

Dist(Mi) = d(Mi, Ik), where d(Mi, Ik) is an exact or approximate measure of the Euclidean distance between the two paired points. It is expressed in number of pixels.
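A minimal Python sketch of step b) built on top of the association of step a) (coordinates are hypothetical; this is an illustration, not the patented implementation):

```python
import math

def pair_point_to_point(image_points, model_points):
    """Step a) then step b): keep, for each model point, only the closest of
    the image points associated with it; return {model point: Dist(Mi)}."""
    best = {}  # model point -> (distance in pixels, image point)
    for ip in image_points:
        mp = min(model_points, key=lambda m: math.dist(ip, m))   # step a)
        d = math.dist(ip, mp)
        if mp not in best or d < best[mp][0]:                    # step b)
            best[mp] = (d, ip)
    # Model points reached by no image point (like M10 in Figure 3) are absent
    return {mp: d for mp, (d, ip) in best.items()}

model = [(0, 0), (10, 0), (50, 50)]
image = [(0, 1), (0, 4), (10, 2)]
dist = pair_point_to_point(image, model)
print(dist)   # (0, 0) keeps only its closest image point; (50, 50) is unmatched
```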

The method further comprises a step of assigning a local proximity score to each of the points of the model contour, as follows: the score takes a value between 0 and 1, all the larger as the paired points are close (that is, as the proximity measure of this point is small). More precisely:

  • if a model contour point is reached by no image contour point, which corresponds to a model contour point very far from the image contour, it is assigned the score zero. In the example of Figure 3, the score assigned to the point M10 is zero: N(M10) = 0.
  • if a model contour point is reached by a single image contour point, it is assigned a score that is all the larger as the points are close. For example, one could have N(M12) = 0.7; N(M15) = 0.3.

The last step of the method then consists in determining the overall score for the model, by averaging the local scores of all the points of the model contour.

According to this evaluation principle, a model contour is evaluated as all the closer to an image contour as the overall score assigned to it is higher.

It has been shown that such an automatic target identification process according to the invention makes it possible to avoid detection errors of the type illustrated in Figures 4a and 4b. In these figures, a first model MOD1 (Figure 4a) and a second model MOD2 (Figure 4b) have been superimposed on an image 1 comprising a target C. In the example, the first model MOD1 corresponds to the target to be detected, on which it is perfectly positioned. It leads to a retained hypothesis. The second model corresponds to another type of target. But with a method according to the prior art, this hypothesis will also be retained, because of the presence of contour points that do not belong to the contour of the target but actually belong to the background, or of internal contour points.

According to one embodiment of the invention, the evaluation of the local proximity score of each model contour point is a function of the distance d between this point and the image contour point paired with it according to the invention.

Preferably, and as shown schematically in Figures 5a and 5b, the evaluation of the proximity of two points takes into account the orientation class at the points I20 and M30 of the pair P under consideration. This class is typically defined by the orientation of the tangent to the contour at the point considered: Figure 5a shows the tangent tI at the image contour point I20 of the image contour CI and the tangent tM at the model contour point M30 of the model contour CM. n orientation classes are defined, with n an integer: orientation class 0 corresponds to a horizontal orientation of the tangent; orientation class n−1 corresponds to a vertical orientation of the tangent; and each of the intermediate orientation classes corresponds to a given orientation of the tangent, between 0 and π rad. These classes are shown in Figure 5b with n = 8. In this example, the point I20 belongs to orientation class 6 and the point M30 to orientation class 5.

In general, if the tangents tI and tM coincide, that is, if the two paired points belong to the same orientation class, then ΔORI = 0. If the two paired points are in orthogonal classes, ΔORI = n−1. More generally, ΔORI = |class(Ik) − class(Mi)| (expressed in number of orientation classes).

In the example shown in Figure 5a, ΔORI = 6 − 5 = 1.

The corrected measure of proximity to the image contour of the model contour point Mi paired with the image contour point Ik can thus be written:

Dist(Mi) = d(Mi, Ik) + (1/4) · ΔORI.

In practice, with n = 8 classes, a good compromise is obtained in terms of false detections and computation time.
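One plausible reading of this orientation correction, sketched in Python; the linear class quantization is an assumption, since the text only fixes class 0 as horizontal and class n−1 as vertical, with n = 8:

```python
import math

N_CLASSES = 8

def orientation_class(theta, n=N_CLASSES):
    """Quantize a tangent orientation theta (radians) into n classes:
    class 0 = horizontal, class n-1 = vertical (assumed linear mapping)."""
    theta = theta % math.pi               # a tangent direction is defined mod pi
    acute = min(theta, math.pi - theta)   # angle to the horizontal, in [0, pi/2]
    return round(acute / (math.pi / 2) * (n - 1))

def corrected_dist(d_pixels, theta_image, theta_model):
    """Dist(Mi) = d(Mi, Ik) + (1/4) * delta_ORI."""
    delta_ori = abs(orientation_class(theta_image) - orientation_class(theta_model))
    return d_pixels + 0.25 * delta_ori

print(orientation_class(0.0))          # horizontal tangent -> class 0
print(orientation_class(math.pi / 2))  # vertical tangent   -> class 7
print(corrected_dist(2.0, math.pi / 2, math.pi / 2))  # same class: 2.0
```

The 1/4 factor keeps the orientation penalty small (at most (n−1)/4 pixels), consistent with the remark below that the weight of the orientation, which may be estimated erroneously, is limited.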

In this refinement, the proximity measure is a continuous function of position and orientation. The weight of the orientation, which may be estimated erroneously, is thus limited.

In a variant taking the orientation class into account, the orientation class is taken into account in the association step of the point-to-point pairing process, by allowing association (and therefore pairing) only between points of the same class. In this case, the proximity measure Dist(Mi) is simply equal to the distance between the two paired points Mi and Ik.

The assignment of the local proximity score N(Mi) of a model contour point Mi as a function of the proximity measure according to the invention must contribute to the robustness of the identification method.

This local score expresses a probability of similarity between the model contour and the image contour: it takes a value in the interval [0, 1]. When it is equal to zero, it indicates that the model contour point does not "match" the image contour; when it is equal to 1, it indicates a high probability that the model contour corresponds to the image contour.

Thus, all model contour points that could not be paired with an image contour point according to the method of the invention must have a zero contribution, that is to say a zero score, reflecting their large distance from the image contour.

For the model contour points that are paired with a single image contour point, the score assignment function preferably follows the criteria below:

  • the score must take the value 1 when the proximity measure Dist(Mi) is zero;
  • the score must take a value close to 1 when the proximity measure Dist(Mi) is between 0 and 1;
  • the score must decrease very rapidly towards 0 as soon as the proximity measure Dist(Mi) becomes greater than 1;
  • the score assignment curve N(Mi) has an inflection point, preferably for a proximity measure Dist(Mi) close to 2 pixels;
  • the score must take a near-zero value as soon as the proximity measure Dist(Mi) becomes greater than 3 pixels.

The function N(Mi) assigning the score to a model contour point Mi paired according to the invention with the image contour point Ik will have, for example, the shape shown in Figure 6, which corresponds to the following function:

N(Mi) = [0.5 − arctan(4 · (Dist(Mi) − 2)) / π] × 1/0.9604.
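This scoring function can be checked numerically against the criteria listed above (a direct transcription of the formula, with Dist expressed in pixels):

```python
import math

def local_score(dist):
    """N(Mi) = [0.5 - arctan(4*(Dist(Mi) - 2)) / pi] * 1/0.9604"""
    return (0.5 - math.atan(4 * (dist - 2)) / math.pi) / 0.9604

print(round(local_score(0), 3))   # ~1.0  : score 1 at zero distance
print(round(local_score(1), 3))   # still close to 1 on [0, 1]
print(round(local_score(2), 3))   # ~0.521: inflection region near 2 pixels
print(round(local_score(3), 3))   # ~0.081: near zero beyond 3 pixels
```

The 1/0.9604 factor is what normalizes the score to exactly 1 at Dist = 0, since 0.5 + arctan(8)/π ≈ 0.9604.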
Figure imgb0002

A practical implementation of a proximity measurement method according to the invention can use so-called chamfer calculation methods. These chamfer methods are very efficient in terms of computation time and are widely used in many image-processing domains, including shape recognition.

A classical chamfer method makes it possible to associate with a contour a map with two inputs x and y, corresponding to the coordinates of a given point, and one output, which is the smallest distance from this point (x, y) to the contour. In other words, the smallest distance from the point (x, y) to the contour, mapped by means of level curves, is evaluated. This known chamfer method is generally used to apply the Hausdorff measurement method. In that case, the chamfer method is applied to the image contour, making it possible to determine, for each point (x, y) of the model contour, the smallest distance to the image contour.

In the method according to the invention, the chamfer method must be applied differently.

First, in the first association step of the method according to the invention, the aim is to measure the smallest distance from an image contour point to the model contour. This implies applying the chamfer method no longer to the image contour, but to each of the model contours.

However, the computation of the chamfer map of a model contour is independent of the extracted-contour image to be analyzed. These computations can therefore be performed once and for all and stored, to be exploited when the time comes, in real time, for the analysis of a given contour image.

Then, to allow the point-to-point pairing according to the invention, the chamfer map of the model contour must output a first piece of information, which is the distance between the two associated points, and a second piece of information, which is the identification of the model contour point associated with this distance. This second piece of information is necessary because it is what makes it possible, in the pairing step, to determine the set of image contour points associated with a given model contour point, and to deduce the proximity measure automatically from the first, associated piece of information.

Thus, a fast calculation method according to the invention comprises the computation of a chamfer map for each model contour, said map giving, as a function of the two inputs x and y corresponding to the coordinates of an image contour point, a piece of information S0(x, y) identifying the model contour point reached by the smallest distance measure, and a piece of information S1(x, y) corresponding to the value of this measure.

Then the point-to-point pairing step is applied, followed by the assignment of a local score to each model contour point as a function of the proximity measure Dist(Mi) for the paired points.
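The two outputs S0 and S1 can be sketched as follows; this brute-force precomputation is only an illustration (a real chamfer map would be built with the usual two-pass distance propagation), and the model points and image size are invented:

```python
import math

def build_chamfer_map(model_points, width, height):
    """Precompute, for every pixel (x, y), S0 = index of the closest model
    contour point and S1 = the corresponding distance (brute force)."""
    s0 = [[0] * width for _ in range(height)]
    s1 = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            i, d = min(
                ((i, math.dist((x, y), mp)) for i, mp in enumerate(model_points)),
                key=lambda t: t[1],
            )
            s0[y][x], s1[y][x] = i, d
    return s0, s1

model = [(2, 2), (7, 2)]          # two model contour points, indices 0 and 1
s0, s1 = build_chamfer_map(model, width=10, height=5)
print(s0[2][3], s1[2][3])         # pixel (3, 2): closest model point 0, distance 1.0
```

Because the maps depend only on the model contour, they can indeed be computed offline and stored, as the text notes, with only table lookups left for the real-time analysis of a given contour image.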

This method does not, however, make it possible to correct the proximity measure Dist(Mi) of a model contour point Mi as a function of the orientation class of this point and of the paired image contour point Ik.

Provision is then made to compute one chamfer map per orientation class of the model contour. There are therefore n chamfer maps per model contour. As seen above, preferably n = 8.

The step of associating a model contour point with each image contour point then comprises, for each image contour point, the prior determination of the orientation class of this point, and the selection of the chamfer map of the model contour in the corresponding orientation class.

Finally, the computation of the overall score η consists in averaging all the local scores, that is, if the model contour comprises l model contour points Mi, i = 1 to l:

η = (1/l) · Σi=1..l N(Mi).
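Putting the pieces together, the overall score is simply the mean of the local scores, with unmatched model points contributing zero. A sketch, using the scoring function given earlier; the pairing result is invented:

```python
import math

def local_score(dist):
    # N(Mi), as given earlier in the text
    return (0.5 - math.atan(4 * (dist - 2)) / math.pi) / 0.9604

def overall_score(dists):
    """dists[i] = Dist(Mi) for a paired point Mi, or None if Mi is unmatched.
    Unmatched points get the score zero; eta is the mean over all l points."""
    notes = [0.0 if d is None else local_score(d) for d in dists]
    return sum(notes) / len(notes)

# Hypothetical pairing result for a 4-point model contour:
eta = overall_score([0.0, 1.0, None, 3.0])
print(round(eta, 3))   # ~0.51: one perfect match, one good, one missing, one far
```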
Figure imgb0003

The method according to the invention is applied to all the model contours, each time scanning them over the entire image.

An overall score is obtained for each model contour (a model contour being understood here as a model contour in a given position), which is a measure of the probability of similarity of this model contour to the image contour.

For example, the overall score η1 is obtained for the model contour CM1; η2 for the model contour CM2; and so on.

Preferably, a selection of hypotheses is then established. By hypothesis is meant a model contour (that is to say a target, seen from a certain viewpoint) in a given position in the image.

The selection is typically obtained by retaining the most probable hypotheses, corresponding to an overall score greater than a decision threshold. This threshold is preferably set at 0.6.

The implementation of such an automatic target identification process using a proximity measurement method according to the invention makes it possible to reduce the number of false alarms and to discriminate better between the various hypotheses. In other words, fewer hypotheses are retained at the output.

The table below shows, by way of comparison, for different images each containing a single target to be identified, the number of hypotheses retained on the criterion of the Hausdorff measure (hypothesis retained if the Hausdorff measure < 2 pixels) and on the criterion of the overall proximity score (η > 0.6) according to the invention. It can be seen that the selection criterion based on the proximity score according to the invention gives much better results in terms of rejection of false hypotheses.

                   Image 1  Image 2  Image 3  Image 4  Image 5  Image 6  Image 7  Image 8
  Hausdorff           4        5        3        2        3        7        2        8
  Overall score η     3        2        0        2        3        2        0        4

On the other hand, it does not make it possible to improve in a truly decisive way the discrimination between two targets with similar silhouettes. This results in the presence of superimposable hypotheses in the hypothesis selection obtained. The notion of hypotheses that are superimposed is well known to those skilled in the art. It means that the model contours of these hypotheses have contour points in common.

Another aspect of the invention makes it possible to improve this last point.

The problem more particularly considered here is due to the fact that, from certain viewing angles, for certain orientations, two targets may have relatively similar silhouettes, close in the sense of the overall score η assigned according to the invention.

Nevertheless, the presence of localized differences can be noticed in practice. In the case of military vehicles, for example, it may be the presence of tracks or wheels; a significantly different length; a rounded or, on the contrary, angular shape; and so on.

Some parts of a model contour are therefore more informative than others with respect to another model contour.

Figure 7d thus shows an image contour CI corresponding to an image of extracted contours. Two models CM1 and CM2, shown respectively in Figure 7a and Figure 7b, are found close, in the sense of the invention, to this image contour.

The idea underlying this improvement according to the invention consists in considering the two hypotheses that are superimposed, and in weighting the local score of each point of a model contour, established in the measurement of the proximity of this model contour to the image contour, by a quantity of information representing the local difference, at this point, with the other model contour.

According to the invention, the overall score η1 associated with the model contour CM1, which measures the probability of similarity of this model contour CM1 to the image contour CI, is obtained by weighting each of the local scores. More precisely, the local proximity score N(M1i) of each point M1i of the model contour CM1 is weighted by a factor representing, at this point, the quantity of discriminant information that it contains with respect to the other model contour CM2. This quantity of information contained at a point M1i of the model contour CM1 must be all the larger as this point is far from the other model contour CM2: this is the very definition of the proximity measure Dist(M1i) at this point according to the method of the invention.

The quantity of information of each of the points M1i of the first contour CM1 relative to the contour CM2 is therefore defined as follows:

X(M1i) = Dist(M1i) = d(M1i, M2j),

where M2j is the point of the model contour CM2 paired with the point M1i according to the proximity measurement method of the invention. At a given point of the model contour CM1, the greater the distance to the paired point, the greater the quantity of information at this point. This is shown schematically in Figure 7c. At the point M1a, the quantity of information X(M1a) is large, corresponding to the distance da in the figure.

At the point M1b, the quantity of information X(M1b) is zero, since at this point the two contours coincide.

The methods for computing the chamfer map and for taking the orientation of the points into account in the distance measurement, described above, apply in the same way to this computation of the quantity of information.
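As an illustration, the chamfer map mechanism described in the claims can be sketched with a Euclidean distance transform: for every pixel it yields both the distance to the nearest contour point (the S1 output) and the coordinates of that point (the S0 output), so pairing a query point costs O(1). This is only a minimal sketch assuming SciPy is available; the function name `chamfer_map` and the toy contour are ours, not the patent's.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def chamfer_map(contour_points, shape):
    """Build a chamfer-style map of a contour rasterized on a grid.

    Returns, for every pixel (x, y):
      - dist[x, y]: Euclidean distance to the nearest contour point
        (analogous to the S1(x, y) output of the claims),
      - nearest: coordinates of that nearest contour point
        (analogous to the S0(x, y) output).
    """
    mask = np.ones(shape, dtype=bool)
    for x, y in contour_points:
        mask[x, y] = False  # contour pixels are the zero-distance seeds
    dist, nearest = distance_transform_edt(mask, return_indices=True)
    return dist, nearest

# Toy example: a 3-point contour on an 8x8 grid.
contour = [(1, 1), (1, 2), (1, 3)]
dist, nearest = chamfer_map(contour, (8, 8))
# Querying the map at an arbitrary point gives its proximity measure
# and the identity of the paired contour point.
print(dist[4, 2])                          # 3.0: (4, 2) pairs with (1, 2)
print(nearest[0][4, 2], nearest[1][4, 2])  # 1 2
```

The map is built once per model contour (or once per orientation class, as in claim 7) and then reused for every point of the other contour.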

In the step of computing the global score η1 of the contour CM1 relative to the image contour, the weighting method according to the invention then consists in weighting the local proximity score of each point M1i of the model contour CM1 by the associated quantity of information X(M1i), that is:

η1 = (1/m) Σi=1..m N(M1i) · X(M1i)
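The weighted global score above can be sketched in a few lines; this is an illustrative sketch only, and the names `weighted_global_score`, `N` and `X` are ours, not the patent's.

```python
def weighted_global_score(local_scores, info_quantities):
    """Global score of a model contour: mean of its local proximity
    scores, each weighted by the point's quantity of information
    relative to the competing model contour, i.e.
    eta = (1/m) * sum_i N(M_i) * X(M_i)."""
    assert len(local_scores) == len(info_quantities)
    m = len(local_scores)
    return sum(n * x for n, x in zip(local_scores, info_quantities)) / m

# Two model points lie exactly on the competing contour (X = 0): they
# contribute nothing, however well they match the image contour.
N = [1.0, 1.0, 0.8, 0.2]   # local proximity scores to the image contour
X = [0.0, 0.0, 5.0, 5.0]   # distances to the paired points of the other model
print(weighted_global_score(N, X))  # (0.8*5 + 0.2*5) / 4 = 1.25
```

The example shows the intended effect: points shared by both hypotheses are neutralized, and only the discriminant parts of the contour decide the score.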

The weighting method is applied to the points of the second contour CM2 by inverting the roles of the first and second contours, that is to say by using the method of measuring the proximity of the second contour CM2 to the first contour CM1: this gives the quantity of information X(M2j) of each point M2j of the second contour CM2 relative to the first contour CM1. The local proximity score N(M2j) of each point M2j is weighted by the associated quantity of information X(M2j). The global score η2 is obtained by averaging the weighted local proximity scores of all the points of the model contour CM2, that is:

η2 = (1/l) Σj=1..l N(M2j) · X(M2j)

Thus, more weight is given to the parts of a model contour that carry the most information relative to the other model contour.

In other words, this amounts to discriminating between the two hypotheses on the basis of the contour points that contain the most information relative to the other hypothesis.

This notion of quantity of information is therefore defined with respect to a given pair of model contours.

When more than two hypotheses are superimposed, the discrimination method is applied two by two.
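This pairwise reduction can be sketched as a simple tournament, keeping the winner of each comparison; the callback `pairwise_score`, which stands in for the two-way weighted scoring described above, and the toy labels are hypothetical.

```python
def best_hypothesis(hypotheses, pairwise_score):
    """Reduce a set of superimposed hypotheses by pairwise discrimination.

    pairwise_score(a, b) must return the pair of weighted global scores
    (eta_a, eta_b) of hypotheses a and b computed against each other;
    the better-scored hypothesis of each pair is retained.
    """
    winner = hypotheses[0]
    for challenger in hypotheses[1:]:
        eta_w, eta_c = pairwise_score(winner, challenger)
        if eta_c > eta_w:
            winner = challenger
    return winner

# Toy example with precomputed scores standing in for the real
# weighted-score computation.
scores = {"tank": 0.9, "truck": 0.6, "jeep": 0.7}
pick = best_hypothesis(list(scores), lambda a, b: (scores[a], scores[b]))
print(pick)  # tank
```

With n superimposed hypotheses this performs n − 1 pairwise comparisons, each of which involves two applications of the proximity measurement method.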

The invention thus describes a method for measuring the proximity of a second contour to a first contour, according to which each point Mi of the second contour is paired with one or zero points of the first contour, giving a proximity measure Dist(Mi) at this point.

An automatic target identification method according to the invention applies this proximity measurement process to determine the proximity measure of each point of a model contour, applied as second contour, to an image contour, applied as first contour. From this it deduces, for each point of the model contour, a local proximity score and, for the model contour, a global score giving a measure of its probability of similarity to the image contour.
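The passage from matched distances to a local score and then to a global score can be sketched as follows. The arctan form used here is the one given in claim 11; the helper names `local_score` and `global_score` are ours, not the patent's.

```python
import math

def local_score(dist):
    """Local proximity score N(M_i) from the distance of a model point
    to its paired image point (the arctan form of claim 11): close to 1
    below 1 pixel, point of inflexion near 2 pixels, quasi-zero beyond
    3 pixels."""
    return (0.5 - math.atan(4.0 * (dist - 2.0)) / math.pi) / 0.9604

def global_score(distances, unpaired=0):
    """Global score: mean of the local scores over all model points;
    a point paired with zero image points contributes N = 0."""
    m = len(distances) + unpaired
    return sum(local_score(d) for d in distances) / m

print(round(local_score(0.0), 3))  # 1.0
print(round(local_score(3.0), 3))  # 0.081 (quasi-zero)
```

The normalization by 0.9604 makes the score exactly 1 at zero distance, since 0.5 + arctan(8)/π ≈ 0.9604.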

The automatic identification method thus determines the global score associated with each of the model contours of a collection (with as many different model contours as there are different 3D models and viewpoints considered for each 3D model).

According to another aspect of the invention, it applies a hypothesis selection criterion, retaining as a probable hypothesis each model contour whose global score is greater than a threshold.

According to a variant, the model contours of the collection correspond to a selection of hypotheses resulting from another process, for example from a Hausdorff measurement.

According to another aspect of the invention, the automatic identification method then applies the weighting method to each pair of superimposed hypotheses among the hypotheses retained, in order to obtain, for the model contour associated with each hypothesis, a global score weighted according to the invention. For this, it uses the proximity measurement method, applying it a first time to measure the quantity of information associated with each point of the contour of the first hypothesis, applied as second contour, relative to the contour of the second hypothesis, applied as first contour, and to compute the associated global score by averaging the weighted local scores. It applies the proximity measurement method a second time to measure the quantity of information associated with each point of the contour of the second hypothesis, applied as second contour, relative to the contour of the first hypothesis, applied as first contour, and to compute the associated global score by averaging the weighted local scores. The identification system then selects the best hypothesis. If more than two hypotheses are superimposed, the automatic identification system applies this weighting two by two, each time retaining the better hypothesis.

The performance of an automatic target identification system using such an identification method according to the invention was tested on a workstation, on a base containing 200 images to be analysed and 9 3D targets corresponding to land vehicles. A significant improvement in identification performance was demonstrated, with a correct identification rate of 80%, compared with 50% obtained with state-of-the-art methods.

It should be noted that applying an automatic identification system according to the invention to a first selection of hypotheses obtained from another automatic identification process, such as a process using the Hausdorff measurement, does not change the identification performance, but advantageously saves computation time.

The invention just described makes it possible to appreciably improve the robustness and the discrimination of an automatic identification system that implements it. It applies to the military field but, more generally, to any field using shape recognition by comparison with a series of models.

Claims (19)

  1. Automated method of measurement of proximity of a second contour (CM) extracted from an image to a first contour (CI), comprising for each point (Ik) of the first contour, a step of association with a point (Mi) of the second contour determined as the closest, characterized in that it comprises a step of pairing each point of the second contour with one or zero points of the first contour, by determining the point of the first contour which is closest from among the set of points of the first contour that are associated with said point of the second contour.
  2. Method according to Claim 1, characterized in that the determination of a point that is closest to a given point is based on a true or discrete measure of the Euclidean distance between the two points.
  3. Method according to Claim 2, characterized in that it comprises a step of allocating a measure of proximity Dist (Mi) of each point Mi of the second contour (CM) to the first contour (CI), based on the measurement of the distance from this point to the point of the first contour with which it is paired.
  4. Method according to Claim 3, characterized in that said distance measure is a measure corrected as a function of the difference of class of orientation of the points of the pair considered.
  5. Method according to any one of Claims 1 to 3, characterized in that in the step of associating zero or one points of the second contour with each point of the first contour, the point that is closest from among the points of the second contour which have the same class of orientation as said point of the first contour is associated.
  6. Method according to any one of Claims 1, 2, 3, or 5, characterized in that the associating step uses a chamfer map of the second contour via which, at each point of the first contour with coordinates x and y applied as input, said map provides as output an identification (S0(x,y)) of the point of the associated second contour and a measure (S1(x,y)) of the proximity between the two points thus associated.
  7. Method according to Claim 6 in combination with claim 5, characterized in that with the second contour is associated a chamfer map per class of orientation, and in that for each point of the first contour, the associating step comprises a step of determining the class of the point of the first contour, so as to apply the coordinates (x,y) of this point as inputs to the chamfer map corresponding to said orientation class.
  8. Method according to any one of Claims 4 to 7, characterized in that it uses eight orientation classes.
  9. Method of automatic identification of targets in an image of extracted contours (CI) which is defined by an image contour, characterized in that it applies a method of measurement of proximity according to any one of Claims 1 to 8, by applying as second contour, a template contour (CM) and as first contour, said image contour (CI), so as to obtain the measure of proximity Dist(Mi) of each point of said template contour to said image contour.
  10. Method of identification according to Claim 9, characterized in that it comprises the allocation of a local score of proximity N(Mi) to each point Mi of the template contour as a function of the measure of proximity Dist(Mi) of this point, according to the following criteria:
    - N(Mi) has a value lying between 0 and 1;
    - N(Mi) = 0, when said point is paired with zero points of the first contour;
    - N(Mi) = 1, when the proximity measure is equal to zero;
    - N(Mi) has a value of about 1 when the proximity measure lies between 0 and 1 pixel;
    - N(Mi) decreases very rapidly to 0 as soon as the proximity measure becomes greater than 1 pixel;
    - N(Mi) decreases according to a curve having a point of inflexion in the neighborhood of a proximity measure of about 2 pixels;
    - N(Mi) has a quasi-zero value as soon as the proximity measure becomes greater than 3 pixels.
  11. Method of identification according to the preceding claim, characterized in that the function for allocating the score of proximity to the point Mi may be written: N(Mi) = [0.5 − arctan(4 · (Dist(Mi) − 2)) / π] · 1/0.9604.
  12. Method of identification according to Claim 10 or 11, characterized in that it comprises a step of measuring a global score η equal to the mean of the proximity scores relative to the number of points of the template contour (CM).
  13. Method of identification according to any one of Claims 9 to 12, characterized in that it is applied successively to each of the template contours of a collection of template contours.
  14. Method of identification according to Claim 13, characterized in that said collection is obtained from another method of identification of targets, such a method using a Hausdorff distance measure.
  15. Method of identification according to Claim 13 or 14, characterized in that it comprises a step of selecting hypotheses by comparison with a threshold of each of the global scores η allocated to each of the template contours.
  16. Method of identification according to Claim 15, characterized in that said threshold is fixed at 0.6.
  17. Method of identification according to one of Claims 15 or 16, characterized in that it comprises a step of discriminating between hypotheses of template contours which are superimposed, comprising for each pair of a first contour hypothesis (CM1) and of a second contour hypothesis (CM2) which are superimposed, a step of weighting the global score allocated to each of the template contours, said weighting step comprising the application of the method of measurement of proximity according to any one of Claims 3 to 8:
    - a by applying as second contour, the contour of said first hypothesis and as first contour, the contour of said second hypothesis, said proximity measure Dist(M1i) obtained for each point (M1i) of contour (CM1) of the first hypothesis being applied as weighting factor X(M1i) for the local score of proximity (N(M1i)) of this point to the image contour (CI), and by deducing the global score (η1) associated with the first contour hypothesis representing its proximity to the image contour by calculating the mean of said weighted local scores,
    - b by applying as second contour, the contour of said second hypothesis and as first contour, the contour of said first hypothesis, said proximity measure Dist(M2j) obtained for each point (M2j) of contour (CM2) of the second hypothesis being applied as weighting factor X(M2j) for the local score of proximity (N(M2j)) of this point to the image contour (CI), and by deducing the global score (η2) associated with the second contour hypothesis representing its proximity to the image contour by calculating the mean of said weighted local scores.
  18. Method of identification according to Claim 17, characterized in that it adopts as best hypothesis of template contour, from among a plurality of hypotheses which are superimposed, that with which the best global score is associated.
  19. System for the automatic identification of an object in an image of contours, comprising a database containing templates of determined objects to be recognized, and means of calculation, said means of calculation being configured so as to compare contours, characterized in that said means of calculation are configured so as to perform the steps of a method of identification, according to any one of Claims 9 to 18.
EP04766207A 2003-07-17 2004-07-13 Method for measuring the proximity of two contours and system for automatic target identification Expired - Lifetime EP1646967B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0308707A FR2857771B1 (en) 2003-07-17 2003-07-17 AUTOMATIC IDENTIFICATION SYSTEM OF TARGETS
PCT/EP2004/051476 WO2005017818A1 (en) 2003-07-17 2004-07-13 Method for measuring the proximity of two contours and system for automatic identification of targets

Publications (2)

Publication Number Publication Date
EP1646967A1 EP1646967A1 (en) 2006-04-19
EP1646967B1 true EP1646967B1 (en) 2007-01-31

Family

ID=33548197

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04766207A Expired - Lifetime EP1646967B1 (en) 2003-07-17 2004-07-13 Method for measuring the proximity of two contours and system for automatic target identification

Country Status (10)

Country Link
US (1) US7783112B2 (en)
EP (1) EP1646967B1 (en)
JP (1) JP4745966B2 (en)
KR (1) KR20060056949A (en)
AT (1) ATE353153T1 (en)
DE (1) DE602004004594T2 (en)
ES (1) ES2281001T3 (en)
FR (1) FR2857771B1 (en)
IL (1) IL173125A (en)
WO (1) WO2005017818A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7965890B2 (en) * 2007-01-05 2011-06-21 Raytheon Company Target recognition system and method
US9250324B2 (en) 2013-05-23 2016-02-02 GM Global Technology Operations LLC Probabilistic target selection and threat assessment method and application to intersection collision alert system
US9582739B2 (en) * 2014-11-18 2017-02-28 Harry Friedbert Padubrin Learning contour identification system using portable contour metrics derived from contour mappings
CN104463866B (en) * 2014-12-04 2018-10-09 无锡日联科技有限公司 A kind of local shape matching process based on profile stochastical sampling
CN104574519B (en) * 2015-01-31 2017-05-31 华北水利水电大学 Multi-source resident's terrain feature exempts from the automatic sane matching process of threshold value
EP3096271A1 (en) * 2015-05-16 2016-11-23 Tata Consultancy Services Limited Method and system for planogram compliance check based on visual analysis
CN105678844B (en) * 2016-01-06 2018-04-24 东南大学 One kind is based on atural object scattered points increased profile construction method point by point
FR3086161B1 (en) * 2018-09-24 2021-03-12 Interactif Visuel Systeme I V S AUTOMATIC DETERMINATION OF THE PARAMETERS NECESSARY FOR THE REALIZATION OF GLASSES.
US11022972B2 (en) * 2019-07-31 2021-06-01 Bell Textron Inc. Navigation system with camera assist

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5321770A (en) * 1991-11-19 1994-06-14 Xerox Corporation Method for determining boundaries of words in text
JPH05225335A (en) * 1992-02-17 1993-09-03 Nippon Telegr & Teleph Corp <Ntt> Object area segmenting device
US7158677B2 (en) * 2002-08-20 2007-01-02 National Instruments Corporation Matching of discrete curves under affine transforms

Also Published As

Publication number Publication date
JP4745966B2 (en) 2011-08-10
EP1646967A1 (en) 2006-04-19
IL173125A (en) 2010-11-30
ATE353153T1 (en) 2007-02-15
ES2281001T3 (en) 2007-09-16
US20060193493A1 (en) 2006-08-31
IL173125A0 (en) 2006-06-11
JP2009514057A (en) 2009-04-02
US7783112B2 (en) 2010-08-24
DE602004004594T2 (en) 2007-11-15
FR2857771A1 (en) 2005-01-21
FR2857771B1 (en) 2005-09-23
DE602004004594D1 (en) 2007-03-22
KR20060056949A (en) 2006-05-25
WO2005017818A1 (en) 2005-02-24

Similar Documents

Publication Publication Date Title
US20200160040A1 (en) Three-dimensional living-body face detection method, face authentication recognition method, and apparatuses
US20190362157A1 (en) Keyframe-based object scanning and tracking
EP2724203B1 (en) Generation of map data
EP2786314B1 (en) Method and device for following an object in a sequence of at least two images
WO2006058986A2 (en) Method for identifying an individual based on fragments
EP3614306B1 (en) Method for facial localisation and identification and pose determination, from a three-dimensional view
EP2275970A1 (en) Method of obstacle detection for a vehicle
CN111126393A (en) Vehicle appearance refitting judgment method and device, computer equipment and storage medium
EP1646967B1 (en) Method for measuring the proximity of two contours and system for automatic target identification
FR3019359A1 (en) METHOD FOR DETERMINING A STATE OF OBSTRUCTION OF AT LEAST ONE CAMERA EMBARKED IN A STEREOSCOPIC SYSTEM
FR3025898A1 (en) METHOD AND SYSTEM FOR LOCALIZATION AND MAPPING
US20140205177A1 (en) Valuable document identification method and system
EP3264329B1 (en) A method and a device for detecting fraud by examination using two different focal lengths during automatic face recognition
EP3608836B1 (en) Method for obtaining a digital fingerprint image
EP2710513A1 (en) Method of searching for parameterized contours for comparing irises
FR2979727A1 (en) IDENTIFICATION BY RECOGNITION OF IRIS
Mayer et al. Improved forgery detection with lateral chromatic aberration
EP2474937B1 (en) Electronic authentication method of a handwritten signature, module and corresponding computer program
EP0863488A1 (en) Method for detecting level contours in two stereoscopic images
EP3567521A1 (en) Iris biometric recognition method
EP4136565A1 (en) Method for detecting an attack by presentation for fingerprints
EP3903282B1 (en) Method for segmenting an image
US11899469B2 (en) Method and system of integrity monitoring for visual odometry
Ruscio et al. Dealing with Feature Correspondence in Visual Odometry for Underwater Applications
EP3300525A1 (en) Method for estimating geometric parameters representing the shape of a road, system for estimating such parameters and motor vehicle equipped with such a system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060113

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

RTI1 Title (correction)

Free format text: METHOD FOR MEASURING THE PROXIMITY OF TWO CONTOURS AND SYSTEM FOR AUTOMATIC TARGET IDENTIFICATION

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

DAX Request for extension of the european patent (deleted)
GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: FRENCH

REF Corresponds to:

Ref document number: 602004004594

Country of ref document: DE

Date of ref document: 20070322

Kind code of ref document: P

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070430

GBT Gb: translation of ep patent filed (gb section 77(6)(a)/1977)

Effective date: 20070410

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

REG Reference to a national code

Ref country code: GR

Ref legal event code: EP

Ref document number: 20070401303

Country of ref document: GR

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070702

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2281001

Country of ref document: ES

Kind code of ref document: T3

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

26N No opposition filed

Effective date: 20071101

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070131

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: CH; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20080731
Ref country code: LI; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20080731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: LU; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20070713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: HU; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20070801
Ref country code: TR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20070131

REG Reference to a national code
Ref country code: FR; Ref legal event code: PLFP; Year of fee payment: 13

REG Reference to a national code
Ref country code: FR; Ref legal event code: PLFP; Year of fee payment: 14

REG Reference to a national code
Ref country code: FR; Ref legal event code: PLFP; Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]
Ref country code: GR; Payment date: 20200624; Year of fee payment: 17
Ref country code: FR; Payment date: 20200625; Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]
Ref country code: BE; Payment date: 20200624; Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]
Ref country code: NL; Payment date: 20200715; Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]
Ref country code: DE; Payment date: 20200630; Year of fee payment: 17
Ref country code: FI; Payment date: 20200709; Year of fee payment: 17
Ref country code: ES; Payment date: 20200803; Year of fee payment: 17
Ref country code: GB; Payment date: 20200707; Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]
Ref country code: IT; Payment date: 20200625; Year of fee payment: 17
Ref country code: SE; Payment date: 20200710; Year of fee payment: 17

REG Reference to a national code
Ref country code: DE; Ref legal event code: R079; Ref document number: 602004004594; Country of ref document: DE; Free format text: PREVIOUS MAIN CLASS: G06K0009640000; Ipc: G06V0030192000

REG Reference to a national code
Ref country code: DE; Ref legal event code: R119; Ref document number: 602004004594; Country of ref document: DE

REG Reference to a national code
Ref country code: FI; Ref legal event code: MAE

REG Reference to a national code
Ref country code: NL; Ref legal event code: MM; Effective date: 20210801

GBPC Gb: european patent ceased through non-payment of renewal fee
Effective date: 20210713

REG Reference to a national code
Ref country code: BE; Ref legal event code: MM; Effective date: 20210731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: GB; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210713
Ref country code: FI; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210713
Ref country code: DE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20220201

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: SE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210714
Ref country code: NL; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210801
Ref country code: GR; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20220207
Ref country code: FR; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: IT; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210713
Ref country code: BE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210731

REG Reference to a national code
Ref country code: ES; Ref legal event code: FD2A; Effective date: 20220826

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]
Ref country code: ES; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20210714

P01 Opt-out of the competence of the unified patent court (upc) registered
Effective date: 20230529