EP1525554A1 - Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen - Google Patents

Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen

Info

Publication number
EP1525554A1
Authority
EP
European Patent Office
Prior art keywords
value
image sensor
membership
signal
sympathy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP03787716A
Other languages
German (de)
English (en)
Inventor
Volker Lohweg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koenig and Bauer AG
Original Assignee
Koenig and Bauer AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koenig and Bauer AG
Publication of EP1525554A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/42 Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V10/431 Frequency domain transformation; Autocorrelation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning

Definitions

  • The invention relates to methods for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen according to the preamble of claim 1, 2 or 19.
  • Known methods for analyzing image contents of a test object are usually based on metrics for determining similarities, such as distance measurements for segmented objects or the calculation of global threshold distributions. These methods presuppose translation-invariant output spectra. In reality, however, situations often occur, such as object shifts under the recording system, different backgrounds in the recording, or aliasing effects, so that a direct comparison of these output spectra cannot be performed in many cases.
  • The object of the invention is to provide a method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen.
  • An advantage of the method lies, in particular, in the fact that a sensor signal is analyzed in an image window of size n × n pixels. As a result, the sensor signal of this image window can be regarded as local.
  • The image analysis method according to the invention can be subdivided into the essential steps of feature formation, fuzzification, inference, defuzzification and decision on class membership.
  • The sensor signal is converted by means of at least one calculation rule into an invariant, in particular a translation-invariant, signal in the feature space.
  • The aim of the feature formation is to determine quantities by which typical signal properties of the image content are characterized.
  • The typical signal properties of the image content are represented by so-called features.
  • The features can be represented by values in the feature space or by linguistic variables.
  • A signal is generated which consists of one feature value or of several feature values.
  • The membership of a feature value to a feature is described by at least one fuzzy membership function. This is a soft or fuzzy assignment whereby, depending on its value, the feature value belongs to the feature to a degree within a normalized interval between 0 and 1.
  • In the fuzzification, a crisp feature value is essentially converted into one or more fuzzy memberships.
  • A superordinate membership function is generated by means of a calculation rule which consists of at least one rule, all membership functions being linked to one another. A superordinate membership function is thus obtained for each window.
  • In the defuzzification, a numerical value, also called the sympathy value, is determined from the superordinate membership function formed in the inference.
  • The sympathy value is compared with a predetermined threshold value, which determines the membership of the window to a particular class.
  • Which type of feature values lie in the feature space is of secondary importance for the principle of the invention.
  • For time signals, for example, the mean value or the variance can be determined as feature values. If the evaluation method is required to process the image contents without error irrespective of the prevailing signal intensity, and if, furthermore, small but permissible fluctuations of the image signal are not to lead to disturbances, then it makes sense to perform the conversion of the sensor signal from two-dimensional space by means of a two-dimensional spectral transformation, such as a two-dimensional Fourier, Walsh, Hadamard or circular transformation.
  • The two-dimensional spectral transformation yields invariant feature values.
  • Another preferred embodiment uses the magnitudes of the spectral coefficients obtained by the spectral transformation as feature values.
  • The membership functions are unimodal potential functions, and the superordinate membership function is a multimodal potential function.
  • At least one membership function is parameterized. If the membership function has positive and negative slopes, it is advantageous if the parameters of the positive and negative slopes can be determined separately. This ensures a better adaptation of the parameters to the data records to be examined.
  • The method can be divided into a learning phase and a working phase.
  • The parameters of the membership function can be determined in the learning phase from measured data records.
  • The parameters of the membership functions are adapted to so-called reference images, i.e. in the learning phase the membership of the feature values resulting from the reference images to the corresponding features is established by means of the membership functions and their parameters.
  • In the working phase, the feature values resulting from the now measured data sets are weighted with the membership functions whose parameters were determined in the learning phase, whereby a membership of the feature values of the now measured data sets to the corresponding features is produced.
  • At least one rule by means of which the membership functions are linked to one another is a conjunctive rule in the sense of an IF...THEN connection.
  • Another preferred embodiment divides the generation of the superordinate fuzzy membership function into the sub-steps of premise evaluation, activation and aggregation.
  • In the premise evaluation, a membership value is determined for each IF part of a rule; in the activation, a membership function is defined for each IF...THEN rule. Subsequently, in the aggregation, the superordinate membership function is generated by superimposing all membership functions generated during activation.
  • The sympathy value is determined in particular by a center-of-gravity and/or maximum method.
  • Fig. 1 is a flowchart of the signal evaluation method
  • Fig. 2 shows a sympathy curve
  • FIG. 1 shows a flowchart of the signal evaluation method described below.
  • A grid of N × N windows 01 is laid over the entire image to be analyzed.
  • Each window 01 consists of n × n pixels 02.
  • The signal of each window 01 is analyzed separately.
  • The image content 03 of each window 01 can be regarded as local.
  • The two-dimensional image in the spatial domain is transformed into a two-dimensional image in the frequency domain.
  • The spectrum obtained is called the frequency spectrum. Since the transformation is discrete in the present embodiment, the frequency spectrum is also discrete.
  • The frequency spectrum is formed by the spectral coefficients 06, also called spectral values 06.
  • Then the magnitude formation 07 of the spectral values 06 takes place.
  • The magnitude of a spectral value 06 is called the spectral amplitude value 08.
  • The spectral amplitude values 08 form the feature values in the present exemplary embodiment, i.e. they are identical to the feature values. (A code sketch of this feature formation is given after this list.)
  • A circular transformation is preferably used.
  • The invariance properties are adjustable via the transformation coefficients. It is possible to set a translation invariance as well as a reflection invariance or an invariance with respect to various other permutation groups. The above transformation can thus be used, for example, in its mirror-variant form for examining characters, where mirrored characters must still be distinguished.
  • The mirror-invariant variant can be used for the examination of workpieces, because here it is precisely not necessary to distinguish between a mirrored part and the original. It should be noted that the magnitude spectrum of the Fourier transform is mirror-invariant.
  • The circular transformation is also extremely tolerant in the subpixel range with respect to arbitrary shifts. Comparisons have shown that this circular transformation is superior to other known transformations with regard to displacements.
  • Features 11 are both characteristic spectral amplitude values 08, which define the feature 11 by their position in frequency space and by their amplitude, and linguistic variables such as "gray", "black" or "white".
  • In the fuzzification 12, the membership of each spectral amplitude value 08 to a feature 11 is specified by a soft or fuzzy membership function 13, i.e. a weighting takes place.
  • Since the membership functions 13 can be adapted in a learning phase to so-called reference data sets, it makes sense if the membership functions 13 are parameterized unimodally, i.e. are one-dimensional potential functions in which the parameters of the positive and negative slopes can be adjusted separately to the data sets to be examined.
  • The data sets of the image content, from which the feature values 08 of the test images result, are then weighted with the respective membership functions 13 whose parameters were determined in the previous learning phase. That is, for each feature 11 there is a kind of target/actual comparison between the reference data record, expressed in the parameters of the membership functions 13, and the data record of the test image.
  • The membership functions 13 produce a soft or fuzzy association between the respective feature value 08 and the feature 11. (A sketch of such a membership function and of its learning phase is given after this list.)
  • In the inference 14, there is essentially a conjunctive linking 15, also called aggregation 15, of all membership functions 13 of the features 11, whereby a superordinate membership function 16 is generated.
  • In the next process step, the defuzzification 17, a concrete membership or sympathy value 18 is obtained from the superordinate membership function 16.
  • This sympathy value 18 is compared in the classification 19 with a previously set threshold value 21, whereby a classification statement can be made.
  • The threshold value 21 is set either manually or automatically; this setting also takes place in the learning phase. (A sketch of inference, defuzzification and thresholding follows after this list.)
  • Neural networks, as is known, can be trained.
  • The fuzzy classification used here is based on a concept that realizes a distance measure and a feature linking at the same time.
  • "Fuzzy" refers to the fact that the features are evaluated not in a hard, logical manner but in a softened, blurred one. This firstly has the effect that all features are taken into account summarily, i.e. small deviations of a feature are still tolerated. Secondly, if the deviation of a feature becomes too large, it immediately has a large influence on the distance measure. Accordingly, the classifier itself does not output a "good/bad" decision, but rather a continuous output value in the interval [0...1]. A threshold value applied downstream then permits a "good/bad" decision.
  • The expansion value C is learned with the aid of measured values generated with the circular transformation.
  • The µ value describes how closely a pattern resembles a reference pattern described by features. The z value thus effectively controls the µ value: if the z value is very small, the µ value is close to 1 and the patterns are very similar (sympathetic); if, on the other hand, the z value is large, the µ value becomes small and the patterns are not similar. The course of the curve, as implemented, is shown in Fig. 2.
  • The values C_diff,x are determined in the learning phase, specifically one value for each feature m_x.
  • The value range of a lies in the interval [1...3].
  • The value pce indicates the percentage tolerance with which C_diff is assigned.
  • The x̄ value indicates the mean of C_diff; it is calculated for each feature at runtime.
  • This difference is normalized with the width of the expansion value C_x.
  • The consequence is that with a small deviation the corresponding feature contributes little to the z value; with a large deviation, however, a large deviation value results, depending on the difference measure of the expansion value C_diff. The normalized difference is denoted d_x.
  • The exponent D (2, 4, 8) adjusts the sensitivity at the edges of the normalized difference function d_x. If D were set to "infinity", which is not technically possible, the slope would also become infinite and thus yield a hard "good/bad" decision; the values are therefore usually set between 2 and 20.
  • The curves for the values 2, 4 and 8 are shown in Figures 3c, 3b and 3a.
  • The exponentiated differences d_x are summed, using only the number M of features m that are actually switched on. After the summation, the calculated value is divided by M, so that the mean value of all exponentiated differences d_x is determined. (These steps are put together in the classification sketch after this list.)
  • This process is performed for all windows.
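
The feature formation described above (windows of n × n pixels, two-dimensional spectral transformation, magnitude formation) can be illustrated with a short sketch. The following Python code is only an illustrative sketch, not the patented implementation: it uses a two-dimensional Fourier transform in place of the preferred circular transformation, and all function and variable names are chosen here for illustration.

```python
import numpy as np

def window_features(image, n=8):
    """Split an image into n x n windows and return, for each window,
    the magnitude spectrum of its 2-D Fourier transform as feature values.

    The Fourier magnitude spectrum stands in here for the circular
    transformation preferred in the embodiment; both yield
    translation-invariant feature values.
    """
    h, w = image.shape
    features = {}
    for row in range(0, h - h % n, n):
        for col in range(0, w - w % n, n):
            window = image[row:row + n, col:col + n].astype(float)
            spectrum = np.fft.fft2(window)    # spectral coefficients
            amplitudes = np.abs(spectrum)     # spectral amplitude values
            features[(row // n, col // n)] = amplitudes.ravel()
    return features

# Example: a random 64 x 64 test image yields an 8 x 8 grid of windows,
# each described by 64 translation-invariant feature values.
if __name__ == "__main__":
    img = np.random.rand(64, 64)
    feats = window_features(img, n=8)
    print(len(feats), feats[(0, 0)].shape)
```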
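
The fuzzification with unimodal potential membership functions whose positive and negative slopes are parameterized separately can likewise be sketched. The exact functional form of the potential function and of the learning rule is not given in the text above, so the forms below are assumptions chosen only to reproduce the described behaviour; potential_membership and learn_parameters are illustrative names.

```python
import numpy as np

def potential_membership(x, center, c_neg, c_pos, d=2):
    """Unimodal potential membership function with values in [0, 1].

    The parameters of the negative slope (c_neg, left of the center) and
    of the positive slope (c_pos, right of the center) can be set
    separately, as described qualitatively in the text. The exact
    functional form is an assumption.
    """
    x = np.asarray(x, dtype=float)
    width = np.where(x < center, c_neg, c_pos)
    return 1.0 / (1.0 + np.abs((x - center) / width) ** d)

def learn_parameters(reference_values, tolerance=0.2):
    """Toy learning phase: derive the center and the two slope parameters
    of one membership function from feature values of reference images."""
    ref = np.asarray(reference_values, dtype=float)
    center = ref.mean()
    c_neg = max(center - ref.min(), 1e-9) * (1 + tolerance)
    c_pos = max(ref.max() - center, 1e-9) * (1 + tolerance)
    return center, c_neg, c_pos

# Working phase: weight a measured feature value with the learned function.
center, c_neg, c_pos = learn_parameters([0.9, 1.1, 1.0, 0.95])
print(float(potential_membership(1.05, center, c_neg, c_pos)))
```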
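
Inference, defuzzification and classification can be sketched as follows. The minimum operator for the conjunctive linking, the shape of the superordinate output set and the center-of-gravity defuzzification are common choices assumed here for illustration; the text above does not prescribe particular operators, and the threshold used in the example is arbitrary.

```python
import numpy as np

def aggregate(memberships):
    """Conjunctive linking (aggregation) of all fuzzified feature
    memberships; the minimum operator is one common conjunctive (AND) link."""
    return float(np.min(memberships))

def superordinate_membership(support, activation):
    """Superordinate membership function over a sympathy support grid:
    an output set rising towards 1 ('window matches the reference'),
    clipped at the activation level of the rules."""
    return np.minimum(support, activation)

def defuzzify_center_of_gravity(support, mu):
    """Center-of-gravity defuzzification: condense the superordinate
    membership function into one crisp sympathy value."""
    return float(np.sum(support * mu) / np.sum(mu))

def classify(sympathy_value, threshold):
    """Compare the sympathy value with a predetermined threshold to decide
    the class membership of the window (e.g. 'good'/'bad')."""
    return "good" if sympathy_value >= threshold else "bad"

# Example for one window with three fuzzified feature memberships.
memberships = [0.95, 0.90, 0.85]
support = np.linspace(0.0, 1.0, 101)
mu = superordinate_membership(support, aggregate(memberships))
sympathy = defuzzify_center_of_gravity(support, mu)
print(round(sympathy, 3), classify(sympathy, threshold=0.6))
```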
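
The fuzzy classification steps described last (normalized differences d_x, exponent D, averaging over the M switched-on features, sympathy curve µ(z)) can be put together in one sketch. The exact shape of the sympathy curve is not given in the excerpt; the form 1/(1 + z) is assumed here because it shows the described behaviour (small z gives µ close to 1, large z gives a small µ). All names are illustrative.

```python
import numpy as np

def fuzzy_sympathy(feature_values, reference_means, expansions, D=4, active=None):
    """Fuzzy pattern classification of one window, following the steps
    described in the text.

    feature_values  : measured feature values m_x of the test window
    reference_means : mean values (x-bar) learned from reference images
    expansions      : expansion values C_x learned in the learning phase
    D               : even exponent (2, 4, 8, ...) controlling edge sensitivity
    active          : boolean mask of the M features that are switched on

    Returns a sympathy value in [0, 1]; the curve 1/(1 + z) is an assumption.
    """
    m = np.asarray(feature_values, dtype=float)
    xbar = np.asarray(reference_means, dtype=float)
    c = np.asarray(expansions, dtype=float)
    if active is None:
        active = np.ones_like(m, dtype=bool)

    # Normalized differences d_x: deviation from the reference, scaled by C_x.
    d = np.abs(m[active] - xbar[active]) / c[active]

    # Raise to the power D and average over the M switched-on features.
    z = np.mean(d ** D)

    # Sympathy curve: z small -> mu close to 1, z large -> mu small.
    return 1.0 / (1.0 + z)

# Example: three features, the second one deviating noticeably.
mu = fuzzy_sympathy([1.02, 1.60, 0.98], [1.0, 1.0, 1.0], [0.2, 0.2, 0.2], D=4)
print(round(mu, 3))  # compare against a learned threshold for the good/bad decision
```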

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method for evaluating the signals of an electronic image sensor in the pattern recognition of image contents of a test specimen. The image sensor receives a light input signal and emits an electrical output signal correlated with the light input signal. The method comprises the following steps: analysis of the image content of a window of size n × n pixels; conversion of the direct or indirect output signal emitted by the image sensor into at least one translation-invariant feature value on the basis of at least one calculation rule; weighting of the feature value by means of at least one fuzzy membership function which is functionally related to the value range of the feature value; generation of a superordinate fuzzy membership function by linking all membership functions by means of a calculation rule comprising at least one rule; determination of a sympathy value from the superordinate fuzzy membership function; comparison of the sympathy value with a threshold value; and determination of the class membership.
EP03787716A 2002-07-26 2003-07-22 Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen Withdrawn EP1525554A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10234086A DE10234086B4 (de) 2002-07-26 2002-07-26 Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen
DE10234086 2002-07-26
PCT/DE2003/002467 WO2004017252A1 (fr) 2002-07-26 2003-07-22 Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen

Publications (1)

Publication Number Publication Date
EP1525554A1 (fr) 2005-04-27

Family

ID=30469125

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03787716A Withdrawn EP1525554A1 (fr) 2002-07-26 2003-07-22 Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen

Country Status (6)

Country Link
US (1) US7483573B2 (fr)
EP (1) EP1525554A1 (fr)
CN (1) CN1331086C (fr)
AU (1) AU2003258460A1 (fr)
DE (1) DE10234086B4 (fr)
WO (1) WO2004017252A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10234086B4 (de) * 2002-07-26 2004-08-26 Koenig & Bauer Ag Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen
DE10314071B3 (de) * 2003-03-28 2004-09-30 Koenig & Bauer Ag Method for the qualitative assessment of a material having at least one identification feature
DE102004019978B3 2004-04-23 2005-08-04 Koenig & Bauer Ag Method for assessing the quality of a printed product produced by a printing press
DE102004021047B3 (de) * 2004-04-29 2005-10-06 Koenig & Bauer Ag Method for comparing an image with at least one reference image
US8233712B2 (en) * 2006-07-28 2012-07-31 University Of New Brunswick Methods of segmenting a digital image
US10042813B2 (en) * 2014-12-15 2018-08-07 Intel Corporation SIMD K-nearest-neighbors implementation
CN104598919B (zh) * 2014-12-22 2017-09-19 宁波力芯科信息科技有限公司 Fuzzy recognizer and method for intelligent similarity matching
TWI675331B 2018-08-31 2019-10-21 財團法人工業技術研究院 Storage device and storage method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5602938A (en) 1994-05-20 1997-02-11 Nippon Telegraph And Telephone Corporation Method of generating dictionary for pattern recognition and pattern recognition method using the same
US6002807A (en) * 1996-01-11 1999-12-14 Northrop Grumman Corporation Rotationally invariant correlator
US6594392B2 (en) 1999-05-17 2003-07-15 Intel Corporation Pattern recognition based on piecewise linear probability density function
US7187810B2 (en) * 1999-12-15 2007-03-06 Medispectra, Inc. Methods and systems for correcting image misalignment
US20040022436A1 (en) * 2001-03-16 2004-02-05 Paul Patti System and method for data analysis of x-ray images
US7376242B2 (en) * 2001-03-22 2008-05-20 Digimarc Corporation Quantization-based data embedding in mapped data
FR2829343A1 (fr) * 2001-09-04 2003-03-07 St Microelectronics Sa Method for inserting binary messages in a digital image
US7050629B2 (en) * 2002-05-31 2006-05-23 Intel Corporation Methods and systems to index and retrieve pixel data
DE10234086B4 (de) * 2002-07-26 2004-08-26 Koenig & Bauer Ag Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen
WO2004090581A2 (fr) * 2003-03-31 2004-10-21 Cdm Optics, Inc. Systemes et procedes pour reduire au minimum des effets aberrants dans des systemes d'imagerie
WO2005027491A2 (fr) * 2003-09-05 2005-03-24 The Regents Of The University Of California Codage et traitement d'image pour une estimation de mouvement global
US7853071B2 (en) * 2006-11-16 2010-12-14 Tandent Vision Science, Inc. Method and system for learning object recognition in images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2004017252A1 *

Also Published As

Publication number Publication date
AU2003258460A1 (en) 2004-03-03
WO2004017252A1 (fr) 2004-02-26
CN1672167A (zh) 2005-09-21
DE10234086B4 (de) 2004-08-26
CN1331086C (zh) 2007-08-08
DE10234086A1 (de) 2004-02-19
US20060050995A1 (en) 2006-03-09
US7483573B2 (en) 2009-01-27

Similar Documents

Publication Publication Date Title
DE19521346C2 Image examination/recognition method, method used therein for generating reference data, and devices therefor
DE69530566T2 Hough transform with fuzzy gradient and selection
EP0721631B1 Method for segmenting digital colour images
EP0780002B1 Method and device for reconstructing structures made up of lines in raster form
WO2020049154A1 Method and device for classifying objects
EP0523407B1 Method for classifying signals
DE102019218613B4 Object classification method, object classification circuit, motor vehicle
CH684856A5 Method for classifying a pattern, in particular a pattern of a banknote or a coin, and device for carrying out the method
DE19928231C2 Method and device for segmenting a point distribution
WO2004017252A1 Method for signal evaluation of an electronic image sensor in the pattern recognition of image contents of a test specimen
DE19636074C2 Trainable image processing system for classification
EP2064672A2 Method and device for image processing
EP0394959A2 Image analysis method
DE102021207613A1 Method for quality assurance of a system
EP1938270A2 Method for segmenting in an n-dimensional feature space and method for classifying based on geometric properties of objects segmented in an n-dimensional data space
DE10027657B4 Method for determining reference points in a fingerprint image
EP3655920B1 Method and device for evaluating image sections for correspondence calculation
EP2096578A2 Method and device for characterizing the formation of paper
EP4097647A1 Method for ensuring the quality of a system based on examples
WO2020233961A1 Method for evaluating a function-specific robustness of a neural network
DE102019217951A1 Method and device for determining a domain distance between at least two data domains
DE102019114049A1 Method for validating a driver assistance system using further generated test input data sets
EP0469315B1 Method for inspecting two-dimensional or three-dimensional images
DE10066189B4 Method for detecting objects
DE102021200821B3 Generation of training data for two-dimensional scans of a ground-penetrating radar system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20041117

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20090123

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20090313