EP1203342A2 - Method and device for segmenting a point distribution - Google Patents

Method and device for segmenting a point distribution

Info

Publication number
EP1203342A2
Authority
EP
European Patent Office
Prior art keywords
points
texture
image
determined
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00941981A
Other languages
German (de)
English (en)
French (fr)
Inventor
Christoph RÄTH
Gregor Morfill
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Max Planck Gesellschaft zur Foerderung der Wissenschaften eV
Original Assignee
Max Planck Gesellschaft zur Foerderung der Wissenschaften eV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Max Planck Gesellschaft zur Foerderung der Wissenschaften eV filed Critical Max Planck Gesellschaft zur Foerderung der Wissenschaften eV
Publication of EP1203342A2 publication Critical patent/EP1203342A2/de
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/18Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • G06T7/41Analysis of texture based on statistical description of texture

Definitions

  • The invention relates to a method for segmenting a point distribution into partial areas with different structural properties, and to a device for carrying out such a segmentation method.
  • Image segmentation, i.e. the subdivision of an image into segments or partial areas on the basis of certain image features common to each partial area, is one of the most important tasks in image processing technology.
  • Conventional image segmentation is based on the detection of gray-value differences, e.g. in the neighborhood of a viewed pixel, or on edge-detection techniques.
  • However, this only allows the segmentation of simply structured images with extended, homogeneous picture elements.
  • More complex image structures occur, e.g. in the form of differentiated, repeating gray-value patterns or blurred boundaries of picture elements, which cannot be detected with simple segmentation techniques.
  • cluster analysis is usually carried out in a feature space, in which each axis represents one of the examined features.
  • A characteristic marker or label is assigned to each pixel: pixels with the same label belong to the same feature class or structure, whereas pixels with different labels are assigned to different structures.
  • The available cluster analyses are described, for example, by R. C. Dubes in "Handbook of Pattern Recognition and Computer Vision", edited by C. H. Chen et al., World Scientific Publishing, 1993, p. 3 ff., and by B. D. Ripley in "Pattern Recognition and Neural Networks", Cambridge University Press, 1996.
  • Unsupervised cluster algorithms, i.e. cluster algorithms without supervision, initially process unlabeled data and therefore require the solution of two problems.
  • An image in the broadest sense can also be a low- or high-dimensional structure, in which each pixel is initially defined by a number of coordinates corresponding to the dimension of the structure, and a certain number of image features (measured values) is assigned to each pixel.
  • the dimensions of the structure can also be spanned by any other feature axes.
  • The systems examined thus include all physical, chemical or biological-medical processes or materials whose state or individual features can be characterized by a set of n parameters corresponding to the dimension.
  • the systems can be unchangeable (static) or temporally changeable (dynamic) during the investigation period. In the latter case, time is one of the n parameters.
  • DE-OS 43 17 746 discloses a spatial filtering method for recognizing structures in n-dimensional images based on the concept of the so-called isotropic scaling factor α.
  • The scaling factor α describes the change in the point density (gradient) around an examined pixel by specifying the number of surrounding points m as a function of the distance from the examined pixel.
  • An extension of this spatial filtering method to the detection of the orientation of structures in n-dimensional images is described in DE-PS 196 33 693.
  • The object of the invention is to provide an improved method for segmenting a point distribution with respect to textures, which overcomes the disadvantages of conventional methods and which, in particular, is highly sensitive and reliable, requires as little prior information as possible about the point distribution, and can be applied as broadly as possible to the most varied tasks, both with conventional optical images and with low- or high-dimensional structures.
  • the object of the invention is also to provide a device for implementing such a method and uses of the method.
  • According to the invention, a method for the partially supervised segmentation of point distributions is provided, in which structural or texture features are determined for each point using the aforementioned concepts of isotropic and anisotropic scaling factors, and a cluster method with partial supervision is used for the segmentation. The method presupposes:
  • a predetermined number of known classes of structure elements (texture classes),
  • their assignment to specific points (reference points, points with a label), and
  • a distance measure that, for each of the remaining points (points without a label), quantifies the difference between the respective texture features and each of the texture classes.
  • On this basis, each point is assigned to one of the predetermined texture classes.
  • The number of reference points is selected depending on the application, in particular as a function of the image size, such that a sufficient number of points is available for a statistically reliable assessment of each texture class; approximately 30 to 40 (or up to about 100) reference points fall into each texture class.
  • The aforementioned assignment of the initially unclassified points (points without a label) is made by evaluating the distance measure, preferably by assigning each point to the texture class to which it has the smallest distance.
  • The cluster method implemented according to the invention is referred to as partially supervised, or as a method with partial supervision, since the classification is based on a limited number of reference points with known texture assignment. This ensures that the image segmentation is based on a sufficient number of clusters and on physically meaningful labels.
  • A particular advantage of the invention is that the image segmentation retains a high level of reliability even if the number of reference points is significantly smaller than the number of label-free points (the proportion of reference points can be below 1%, for example 0.1%).
  • The definition of the distance measure for each texture class is specific to the orientation and shape of the point set that has been assigned to the predetermined texture as part of the partial supervision. However, simpler distance measures can also be used, which are defined jointly for all texture classes in the global feature space of the point distribution.
  • All points of the point distribution that are assigned to a texture class form a texture segment, which is then displayed or subjected to further processing.
  • An apparatus for implementing the texture segmentation method comprises a measuring device for recording the point distribution and the features of the system state belonging to each point; a filter device with means for scanning the points of the point distribution under consideration, means for counting the points surrounding an examined point, means for detecting predetermined scaling factors, and means for the statistical processing of the scaling factors; an input device designed to assign predetermined reference points, for which a texture class membership is known, to the corresponding texture classes; a computing device for determining and evaluating distance measures of the texture features of the remaining points in relation to the texture classes; and an output device with which the texture segments are displayed, temporarily stored or forwarded for further processing.
  • the point distributions processed according to the invention can, in the broadest sense, represent system states in an n-dimensional state space.
  • In general, the point distribution represents a two- or higher-dimensional image of the system state, so that in the following we generally speak of image segmentation and image points (pixels).
  • The segmented images can also include signal or amplitude curves as a function of a reference parameter (such as time, energy or the like) or optical gray-scale and/or color images.
  • The examined systems can also include materials, mechanical devices or biological systems. Depending on the application, the detection of a system state is achieved by means of actuator, sensor, analysis, and registration or signaling measures.
  • the actuators that may be required include measures for generating system reactions that are representative of characteristic states, such as, for example, the excitation of mechanical vibrations in an object under investigation or the triggering of evoked potentials in neurological systems.
  • the sensor system comprises the detection of system features in relation to the n parameters of interest and the presentation of the features in a high-dimensional feature space, for example by storing suitable value groups which are assigned to the features.
  • a complex, but delimitable image structure is called a texture.
  • a texture forms an image area or an image region in which the image structure can be traced back to repeating patterns, in which elements are arranged according to an arrangement rule.
  • a certain (constant) texture can be assigned to an image region if a set of local statistics or other local properties of the image features is constant or changes only slightly.
  • a picture texture is described by the number and type of its (gray) tonal basic elements and the spatial arrangement of these elements.
  • textures can be assigned the following properties.
  • a local order repeats itself over an image region that is large compared to the size of the local order.
  • The order consists of a non-random arrangement of basic components, the so-called micro-patterns;
  • these micro-patterns form roughly equal units, have the same size within the texture, and can be characterized by specific local properties.
  • The recognition and segmentation of textures allows different object areas to be distinguished. If, for example, ultrasound images of internal tissue are examined, tumor tissue and healthy tissue can be differentiated via the texture segmentation and the size of the tumor tissue can be determined. The automation of this recognition and size determination is of particular interest here.
  • the texture segmentation according to the invention generally allows both the assignment of textures to specific points and the determination of the size of the respective image regions with a constant texture.
  • the texture segmentation process is parameter-free and non-iterative.
  • the method has a high segmentation speed and reliability.
  • the segmentation result has a low sensitivity to the specific choice of boundary conditions in the image evaluation.
  • a nonlinear filter technology for texture feature extraction is introduced.
  • The segmentation process can easily be adapted to any task by adjusting the non-linear filtering depending on the application. For the first time, it is possible to simultaneously recognize textures and to analyze them quantitatively in terms of their extent.
  • FIG. 1 shows an example of a pixel image with four natural textures (a) and the result of the texture segmentation into four feature classes (b),
  • FIG. 2 is a flow chart giving an overview of the segmentation method with feature detection, texture classification and evaluation,
  • FIG. 3 is a flow chart illustrating the feature detection,
  • FIG. 4 is an illustration of the determination of anisotropic scaling factors in rotated coordinate systems,
  • FIG. 5 is a flow chart illustrating the texture classification,
  • FIG. 6 is an illustration explaining the definition of cluster-specific distance measures in the feature space,
  • FIG. 7 is an image sequence illustrating the segmentation of further Brodatz textures,
  • FIG. 8 is an image sequence illustrating the segmentation of artificial textures,
  • FIG. 9 is an image sequence comparing segmentation results obtained with different distance measures,
  • FIG. 10 is an image sequence illustrating the effect of noise on the segmentation,
  • FIG. 11 is a graph showing the texture segmentation at different noise levels, and
  • FIG. 12 shows a schematic overview of a texture segmentation device according to the invention.
  • The texture segmentation according to the invention is explained below using the example of two-dimensional gray-scale images, but is not limited to these; rather, it can be applied in a corresponding manner to arbitrary point distributions and combinations of features.
  • the point distribution can also be formed, for example, by a plurality of synchronously recorded time series of sensor signals, for example on a machine, the segmentation according to the invention being directed to the search for specific time intervals within the time series in which, for example, normal operating states of the machine or special faulty states are present.
  • the point distributions considered can be continuous or discrete.
  • The image examples are sometimes shown in simplified form for printing reasons, or provided with artificial structures (hatching or the like), without these being mandatory features of the invention.
  • FIGS. 2 to 6 explain the individual steps of the segmentation according to the invention. Examples illustrating the implementation of the individual steps are then given, and a device for implementing the method is described.
  • A two-dimensional gray-value image G(x, y) of size N × M is considered (N, M: number of pixels in the x and y directions).
  • A discrete gray value g(x, y) is assigned to each pixel (g ∈ [0; 255]).
  • the location and feature information assigned to each pixel produces a high-dimensional structure, which in the example under consideration is a three-dimensional point distribution.
  • A three-dimensional vector p = (x, y, g(x, y)) is thus assigned to each pixel. So that the x, y and g values lie in a comparable range of values, it may be necessary to normalize the gray values g, as in the sketch below.
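  • As an illustration, the following minimal sketch (Python with NumPy; the specific normalization scheme is an assumption, since the text only states that normalization may be necessary) maps a gray-value image to such a three-dimensional point distribution:

```python
import numpy as np

def image_to_point_distribution(gray):
    """Map an N x M gray-value image to a 3-D point distribution:
    one point p = (x, y, g_norm) per pixel. The gray values are
    rescaled to the spatial value range so that x, y and g lie in
    comparable ranges (this particular scheme is an assumption)."""
    n, m = gray.shape
    x, y = np.meshgrid(np.arange(n), np.arange(m), indexing="ij")
    g = gray.astype(float)
    g_norm = (g - g.min()) / max(g.max() - g.min(), 1e-12) * max(n, m)
    return np.column_stack([x.ravel(), y.ravel(), g_norm.ravel()])
```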
  • A pixel image G is shown in FIG. 1a.
  • The image segmentation according to the invention is now directed to this texture recognition by recording local features of the pixel image for each pixel and classifying the pixels on the basis of the detected features. These steps are shown in an overview in FIG. 2 with the feature detection 100, the texture classification 200 and the evaluation 300. Depending on the application, the detection of the features and/or the texture classification can be carried out again as a result of the evaluation.
  • the feature detection 100 (see FIG. 3) is directed to the determination of local features for each pixel.
  • the local features include characteristic image properties in the immediate vicinity of the pixel, which is significantly smaller than the (global) overall image.
  • the feature detection 100 comprises a determination of the point distribution 110, a determination of scaling factors 120, 130 and a feature extraction 140 for the formation of feature vectors which are assigned to each pixel.
  • In the simplest case, the determination of the point distribution (step 110) consists of a simple image acquisition and a gray-value evaluation known per se (if necessary with the aforementioned normalization).
  • step 110 comprises a measurement value recording based on the application-specific sensor system.
  • Steps 120, 130: with regard to the determination of the isotropic and anisotropic scaling factors, reference is made to the aforementioned DE-OS 43 17 746 and DE-PS 196 33 693. The procedures are known per se and are therefore only partially explained in detail here.
  • In the first step, the isotropic scaling index is determined (step 120). For this purpose, two spheres with different radii a₁, a₂ (a₁ < a₂) are placed concentrically around each point p in the combined spatial and gray-value space. Within each sphere there is a certain number of pixels, which is also referred to as the total mass M (in each case relative to the sphere radius a₁ or a₂).
  • The isotropic scaling factor α results according to equation (1) as the logarithmic derivative of the total masses for the two spheres:

        α(x₁, y₁; a₁, a₂) = [log M(a₂) − log M(a₁)] / [log a₂ − log a₁]   (1)

  • with M(a) = Σ_j Θ(a − |p₁ − p_j|), where x₁, y₁ are the coordinates of the pixel under consideration and Θ is the Heaviside function, with which the surrounding points within each sphere are counted.
  • The calculation of the scaling factor thus represents a filter function, the coordinates x₁, y₁ denoting the center of the filter; a minimal sketch of this computation follows below.
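  • The following sketch (Python with NumPy; an illustration, not the patent's own implementation) computes the isotropic scaling factor of equation (1) for one point of the distribution built above; the guard against empty spheres is an added safeguard:

```python
import numpy as np

def isotropic_scaling_factor(points, center, a1, a2):
    """Isotropic scaling factor alpha per equation (1).

    points: (n, 3) point distribution; center: the point p under
    consideration; a1 < a2: the two sphere radii. M(a) counts the
    points within distance a of the center (the Heaviside sum)."""
    d = np.linalg.norm(points - center, axis=1)
    m1 = np.count_nonzero(d <= a1)
    m2 = np.count_nonzero(d <= a2)
    if m1 == 0 or m2 == 0:          # empty sphere: alpha is undefined
        return np.nan
    return (np.log(m2) - np.log(m1)) / (np.log(a2) - np.log(a1))
```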
  • The isotropic scaling factor α is only characteristic of radial gradients. For applications with complex structures, it is also necessary to determine the orientation properties of the local features. This is done by determining the anisotropic scaling factors (step 130).
  • The anisotropic scaling factors are determined analogously to the isotropic scaling factors, from gradients in the density of the points surrounding a point under consideration; in order to detect the orientation of a structure, however, the projections of the point counts onto the coordinate axes are determined.
  • A special feature of the feature detection for the image segmentation according to the invention is that not just one value pair or value tuple of anisotropic scaling factors, corresponding to the dimensionality of the image, is determined for each pixel of a viewed image. Rather, according to the invention, a plurality of value pairs or value tuples of anisotropic scaling factors is determined, in accordance with the principles explained with reference to FIG. 4.
  • FIG. 4 shows an example of three differently oriented structures 41, 42 and 43 in a two-dimensional pixel image.
  • The structures have characteristic angles of rotation φ of 0°, 45° and 90° with respect to the x axis.
  • The determination of the anisotropic scaling factors α_x and α_y is illustrated on structure 41 by drawing in the spheres with the radii a₁, a₂.
  • The projections of the surrounding points belonging to structure 41 onto the axes through the point p under consideration have strong gradients in the y direction and less strong gradients in the x direction. This results in a low scaling factor α_x and a high scaling factor α_y.
  • the situation is correspondingly different for structures 42 and 43.
  • the anisotropic scaling factors are determined in at least two coordinate systems rotated relative to one another.
  • The scaling factors in the x-, y- and the x*-, y*-coordinate systems provide additional information about the alignment of the structure when the angle of rotation (≠ 90°) between the coordinate systems is known.
  • The determination of anisotropic scaling factors in the method according to the invention thus includes the detection of a scaling-factor tuple from a plurality of scaling factors for each pixel considered, each corresponding to one of the rotated coordinate systems.
  • The principle of scaling-factor determination in rotated coordinate systems, shown in simplified form with reference to FIG. 4, is adapted as a function of the application. This is illustrated in FIG. 3 by step 131, the definition of reference variables for the scaling-factor determination. These reference variables include the number of anisotropic scaling factors considered per pixel, the size of the spheres, and the number and angles of the coordinate-system rotations.
  • For the rotation of the coordinate system, an angle scheme has proven advantageous in which four rotation angles are set, each differing by 45°.
  • One or two anisotropic scaling factors are thus determined for each pixel in four coordinate systems, each rotated by an angle of rotation Δφ. It is emphasized that the number and magnitude of the coordinate-system rotations can be chosen larger or smaller depending on the application. It is not important that the anisotropic scaling factors determined in each case can be interpreted in a certain way or correlated with visually detectable image features; it is only important to have multiple values for different coordinate-system rotations, because these values contain all the information needed for the further texture classification (see below).
  • The x* axis represents the spatial direction in the rotated coordinate system on which the scaling-factor determination is based.
  • p * denotes the vectors of the pixels in the rotated coordinate system.
  • D is the rotation matrix known per se, the g axis here being the rotation axis.
  • the second Heaviside function in equation (3) ensures that only points in the vicinity of the pixel under consideration are taken into account.
  • The calculation of the logarithmic derivative of the projected masses M_x* according to equation (5) yields the anisotropic scaling factors α_x*:

        α_x*(x₁, y₁; a₁, a₂) = [log M_x*(a₂) − log M_x*(a₁)] / [log a₂ − log a₁]   (5)
  • The anisotropic scaling factor α_x* is calculated for each pixel for four different angles of rotation Δφ.
  • In the present example, the rotation angles 0°, 45°, 90° and 135° are used. Since the orientation information for each angle of rotation is already contained in one of the two anisotropic scaling factors belonging to that angle, it is sufficient for the later texture classification if only one anisotropic scaling factor is determined per angle of rotation.
  • The scaling factors are determined not only for one pair of spheres, but for two pairs of spheres; a sketch of the anisotropic computation follows below.
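  • A sketch of the anisotropic counterpart, assuming rotation about the g axis and a vicinity restriction by the full three-dimensional distance (the exact form of the second Heaviside restriction mentioned above is an assumption here):

```python
import numpy as np

def anisotropic_scaling_factor(points, center, a1, a2, phi):
    """Anisotropic scaling factor for one rotation angle phi.

    The coordinate system is rotated about the g axis, the point
    differences are projected onto the rotated x* axis, and the
    projected masses M_x*(a1), M_x*(a2) are compared as in eq. (5)."""
    diff = points - center
    x_star = np.cos(phi) * diff[:, 0] + np.sin(phi) * diff[:, 1]
    vicinity = np.linalg.norm(diff, axis=1) <= a2   # second Heaviside
    m1 = np.count_nonzero(vicinity & (np.abs(x_star) <= a1))
    m2 = np.count_nonzero(vicinity & (np.abs(x_star) <= a2))
    if m1 == 0 or m2 == 0:
        return np.nan
    return (np.log(m2) - np.log(m1)) / (np.log(a2) - np.log(a1))

# Four rotation angles differing by 45 degrees, as in the text:
ANGLES = np.deg2rad([0.0, 45.0, 90.0, 135.0])
```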
  • the feature extraction 140 takes place, the result of which is a feature vector x for each pixel.
  • The feature vectors x represent the input variables for the following texture classification 200 (see below).
  • the components of the feature vectors are formed by the local features determined for each pixel.
  • the feature vector thus comprises ten components from two isotropic and eight anisotropic scaling factors.
  • For this purpose, a statistical evaluation of the local features determined for the pixels is first carried out in step 141, and the feature vector is then formed in step 142 from the local expected values of the individual local features.
  • The corresponding local expected value ⟨α⟩ is calculated for each local feature in accordance with equation (6):
  • ⟨α(x₁, y₁)⟩ = (1 / N_k) · Σ_{x,y} α(x, y) · Θ(k − |x − x₁|) · Θ(k − |y − y₁|)   (6), where N_k = (2k + 1)² is the number of pixels in the averaging window.
  • The parameter k represents the size of a window shifted over the pixels in order to take account of neighboring local features.
  • Ten expected values are thus formed for each pixel, taking into account the local features of the neighboring pixels.
  • This statistical processing of the local features has the advantage that the subsequent texture classification is considerably less influenced by boundary effects at the image edges. It has been shown that the boundary effects at the image edges do not have to be taken into account at all in order to obtain good segmentation results. A sketch of the windowed averaging follows below.
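  • A sketch of the windowed averaging of equation (6) for one scaling-factor map, using a uniform filter; replacing NaN entries from degenerate spheres by the global mean is a pragmatic assumption, not prescribed by the text:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_expected_values(feature_map, k):
    """Local expected value <alpha> per equation (6): the mean of one
    scaling-factor map over a (2k+1) x (2k+1) window per pixel."""
    f = np.where(np.isnan(feature_map), np.nanmean(feature_map), feature_map)
    return uniform_filter(f, size=2 * k + 1, mode="nearest")

# Stacking the ten smoothed maps yields one feature vector per pixel
# (the window size k is an application-dependent choice):
# features = np.stack([local_expected_values(m, k) for m in maps], axis=-1)
```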
  • the feature acquisition 100 ends with the determination of the feature vectors x from the scaling factors or from the expected values of the scaling factors.
  • the actual texture classification (clustering) 200 according to FIG. 5 now follows.
  • For this purpose, a finite number of texture classes occurring in the examined image is first specified.
  • these predetermined texture classes are each defined by the operator or entered from a memory.
  • The definition of the texture classes is based on empirical values, or is carried out as part of an optimization in which the image segmentation is performed several times with different numbers and types of classes.
  • The superscript indices each relate to a label (l) or to the non-designated state without a label (u).
  • The subscripts run from 1 to n (see equation (7)).
  • The subset X^u is significantly larger than the subset X^l of the feature vectors for which the label is known.
  • the actual steps of the cluster process follow, namely the initialization phase 230 and the implementation phase 240.
  • In the initialization phase 230, so-called ellipsoidal distance measures or a Euclidean distance measure are defined in the feature space; for each texture class, these define a specific metric based on the reference points assigned to that class.
  • In the implementation phase 240, the remaining pixels are assigned to the different texture classes on the basis of these distance measures or metrics.
  • The initialization phase 230 comprises the steps of center-of-gravity calculation 231, covariance matrix calculation 232, eigenvalue calculation 233, eigenvector calculation 234 and metric definition 235. These steps are carried out in the high-dimensional feature space that is spanned by the components of the feature vectors. In the present example, the feature space is therefore 10-dimensional. The steps of the initialization phase are explained below with reference to the simplified representation in a two-dimensional feature space according to FIG. 6.
  • The points of a texture each represent a texture class as a connected entity, which is also referred to as a cluster. FIG. 6 shows four clusters 61-64 for two arbitrarily selected components of the feature space, corresponding to the expected values of the scaling factors α₁ and α₂.
  • The aim of the initialization phase 230 is to determine, for an initially unclassified point 65 (without a label), to which cluster and thus to which texture class it is to be assigned. According to a simple assignment procedure, a point without a label could simply be assigned to the cluster to which it has the smallest Euclidean distance. However, this can lead to incorrect assignments if the extent and orientation of the clusters are not taken into account.
  • In the example shown, the point 65 is at a short distance from the cluster 62, which, however, has a very characteristic longitudinal extent. For this reason, the point 65 is more likely to belong to the more distant cluster 63, since this assignment is more compatible with the radial extent of that cluster.
  • A separate distance measure is therefore defined for each cluster, i.e. for each texture class, which depends on the characteristic orientation and shape of the cluster.
  • The center-of-gravity calculation 231 is carried out first: for each cluster or texture class i, the cluster center with the location vector μ^i in the feature space is calculated according to equation (9), i.e. as the mean of the feature vectors assigned to that class.
  • The elements of the covariance matrices relate the deviations of the components of each feature vector to the center of gravity of the respective cluster i. In the simplified case according to FIG. 6, these are the deviations in the abscissa and ordinate directions.
  • The matrices C^i are symmetric, so that diagonalization and a principal-axis transformation are possible.
  • In step 233, the eigenvalues λ₁^i, λ₂^i, ..., λ_d^i of the matrix C^i are calculated for each cluster i.
  • The eigenvectors of each matrix C^i are then calculated in step 234.
  • The eigenvectors form the matrices D^i according to equation (11).
  • The matrices D^i describe the transition from the original coordinate system common to all clusters to a cluster-specific coordinate system that is spanned by the principal axes of the respective cluster i.
  • This coordinate system is illustrated, for example, on cluster 64 in FIG. 6.
  • A local coordinate system is thus introduced for each texture class defined as part of the partial supervision, the axes of which are calibrated by the shape of the respective cluster.
  • The local calibrations provide information about the orientation and shape of the respective clusters, and thus the possibility of defining cluster-specific distance measures (ellipsoidal metrics) in the feature space. These distance measures are defined in step 235.
  • In total, N_var = c · (2·d + d²) variables are determined, corresponding to the number of different clusters (classes) c and the dimension d of the feature space, since for each cluster d eigenvalues λ^i, d location coordinates of the cluster center and d² eigenvector components are taken into account.
  • These variables determine the ellipsoidal distance measures (metrics) in the feature space with which the cluster assignment of the remaining points without a label is carried out (see also equation (15)). A sketch of this initialization follows below.
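  • A sketch of the initialization phase (steps 231 to 235), assuming a Mahalanobis-like normalization for the ellipsoidal metric; the exact form of equation (15) is not reproduced in the text:

```python
import numpy as np

def init_cluster_metrics(X_labeled, labels):
    """Steps 231-235: per texture class i, compute the cluster center
    mu^i (231), covariance matrix C^i (232), its eigenvalues (233)
    and eigenvectors D^i (234), which define the class metric (235)."""
    metrics = {}
    for i in np.unique(labels):
        Xi = X_labeled[labels == i]
        mu = Xi.mean(axis=0)              # center of gravity, eq. (9)
        C = np.cov(Xi, rowvar=False)      # covariance matrix
        lam, D = np.linalg.eigh(C)        # principal-axis transformation
        metrics[i] = (mu, lam, D)
    return metrics

def ellipsoidal_distance(x, mu, lam, D, eps=1e-12):
    """Distance of feature vector x to one cluster: transform the
    difference vector into the cluster's principal-axis frame and
    scale each axis by the cluster's extent along that axis."""
    y = D.T @ (x - mu)                    # local principal-axis frame
    return float(np.sqrt(np.sum(y ** 2 / (lam + eps))))
```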
  • the definition of the distance measure described here with the steps 232 to 234 is not a mandatory feature of the invention.
  • Image segmentation on the basis of the feature vectors determined by nonlinear filtering in feature detection 100 can also be implemented with a simpler distance measure, for example on the basis of the Euclidean distance between a pixel and the center of gravity of a cluster. Accordingly, steps 232 to 234 could be skipped (dashed arrow in FIG. 5).
  • However, the choice of the distance measure has an impact on the quality of the image segmentation (see FIGS. 10, 11).
  • The distance vectors y^i from the pixel under consideration to the cluster centers are then transformed into the coordinate systems of the principal axes of each cluster (step 242). This is done according to equation (13) using the transition matrices D^i.
  • As a result, c distance vectors, corresponding to the c local coordinate systems of the clusters, are available for each pixel that has not yet been classified.
  • The feature vector x^u (or the associated pixel) is assigned to the cluster or texture class for which the distance quantity Δ has the lowest value (step 245).
  • After all feature vectors have been processed, the set of pixels is fully classified, i.e. each pixel is assigned to a cluster or texture class.
  • Step 241 comprises computing the vectors from a pixel to each cluster center of gravity; if the simpler Euclidean metric is used, it is possible to proceed from step 241 directly to the assignment. A sketch of the classification follows below.
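  • A sketch of the implementation phase (steps 241 to 245), building on the two functions of the previous sketch:

```python
import numpy as np

def classify_points(X_unlabeled, metrics):
    """Assign each unlabeled feature vector to the texture class whose
    ellipsoidal distance is smallest (steps 241, 242 and 245).
    Relies on init_cluster_metrics / ellipsoidal_distance above."""
    classes = sorted(metrics)
    out = np.empty(len(X_unlabeled), dtype=int)
    for n, x in enumerate(X_unlabeled):
        dists = [ellipsoidal_distance(x, *metrics[i]) for i in classes]
        out[n] = classes[int(np.argmin(dists))]
    return out
```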
  • A segmentation step now takes place, in which pixels that have been assigned to a common cluster are provided with a common label, stored together and/or marked by means of a false-color representation in a display of the processed image.
  • The pixels belonging to a texture class also allow the size of the partial area of the image formed by a texture to be determined simultaneously, according to one of the numerical evaluation methods known per se. Further steps of the evaluation 300 are the forwarding of the texture data to various additional processors and/or display means designed according to the application.
  • a partial area can be closed or consist of several separate sections.
  • The fulfillment of a quality measure can also be checked and, in the case of a negative result, a return to the feature detection 100 or to the texture classification 200 with newly set parameters is provided.
  • The proportion of correctly classified, previously known pixels can serve as a quality measure, for example.
  • the procedure explained above can be modified such that not all known pixels are taken into account in step 220 and are included in the following initialization phase 230.
  • the initialization can be carried out with a first part of the known pixels with known texture classification.
  • The result of the implementation phase 240 can then be checked with the remaining known pixels; a sketch of such a check follows below.
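  • A sketch of this check, assuming an arbitrary 50/50 split of the labeled pixels:

```python
import numpy as np

def split_and_validate(X_labeled, labels, rng=0):
    """Initialize with half of the labeled pixels and measure the
    proportion of correctly classified pixels (ccp) on the rest."""
    gen = np.random.default_rng(rng)
    idx = gen.permutation(len(X_labeled))
    init_idx, test_idx = idx[: len(idx) // 2], idx[len(idx) // 2 :]
    metrics = init_cluster_metrics(X_labeled[init_idx], labels[init_idx])
    predicted = classify_points(X_labeled[test_idx], metrics)
    ccp = float(np.mean(predicted == labels[test_idx]))
    return metrics, ccp
```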
  • FIGS. 1 and 7 to 11 show results of an image segmentation according to the invention using the example of gray-scale images of natural or artificial textures.
  • FIG. 1 shows a 256 × 256 pixel image with four natural textures (FIG. 1a): the Brodatz textures D03, D90, D93 and D51.
  • FIG. 1b shows the segmentation result: 97.3% of the pixels were classified correctly. The boundaries between the textures are also reproduced relatively well.
  • FIG. 7 shows further Brodatz textures in a 128 × 256 pixel image.
  • The original textures D55 and D68 are shown in FIG. 7a.
  • The segmentation result is shown in FIG. 7c.
  • FIG. 8 illustrates the image segmentation according to the invention using the example of artificial textures.
  • FIG. 8a shows two artificial textures, of which the first texture, in the left half of the picture, consists of triangles, and the second texture, in the right half, consists of arrows.
  • The textures have the same second-order statistics and therefore cannot be distinguished using local linear feature detection (see, for example, Julesz in "Rev. Mod. Phys.", Vol. 63, 1991, pp. 735 ff.).
  • The white pixels, which are printed for clarity, illustrate the reference points (pixels with labels) used to initialize the cluster method. These are 655 pixels, chosen in a randomly distributed manner.
  • FIG. 8c shows that, despite the identical second-order statistics, reliable image segmentation is achieved with the method according to the invention: 98.2% of the pixels were correctly classified.
  • Another set of four natural Brodatz textures (D96, D55, D38 and D77) is shown in FIG. 9a.
  • FIGS. 9b and 9c illustrate the different results obtained with different distance measures for the texture classification. According to FIG. 9b, the use of the ellipsoidal metrics results in a proportion of 97.1% correctly classified points. If, on the other hand, only a Euclidean metric is used for the texture classification, only 94.9% of the pixels are correctly classified (FIG. 9c). This segmentation result is worse than in FIG. 9b, but nevertheless sufficiently good for various applications.
  • the noise effect visualized in FIG. 10 is also illustrated in the curve representation according to FIG. 11.
  • FIG. 11 shows the percentage of correctly classified pixels (ccp) as a function of the noise level σ_noise.
  • The solid line corresponds to the segmentation result using the ellipsoidal metrics, whereas the dashed line corresponds to the result using the Euclidean metric. The much higher stability of the image segmentation in the first case is evident.
  • the image examples illustrate the following essential advantages of the image segmentation according to the invention.
  • all texture recognitions are based only on knowledge of the feature image (feature vectors).
  • the image segmentation does not depend on the specific pixel coordinates, but only on the properties of the feature vectors.
  • The procedure is parameter-free with regard to the cluster assignment. This represents a significant difference from conventional texture classification methods, in which parameters have to be optimized.
  • Other components of a device according to the invention, such as a control device, are not shown.
  • the filter device 2 and computing device 4 are preferably formed by a common computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Mathematical Physics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Algebra (AREA)
  • Evolutionary Biology (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Operations Research (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
EP00941981A 1999-06-21 2000-05-24 Verfahren und vorrichtung zur segmentierung einer punkteverteilung Withdrawn EP1203342A2 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE19928231 1999-06-21
DE19928231A DE19928231C2 (de) 1999-06-21 1999-06-21 Verfahren und Vorrichtung zur Segmentierung einer Punkteverteilung
PCT/EP2000/004739 WO2000079471A2 (de) 1999-06-21 2000-05-24 Verfahren und vorrichtung zur segmentierung einer punkteverteilung

Publications (1)

Publication Number Publication Date
EP1203342A2 true EP1203342A2 (de) 2002-05-08

Family

ID=7911924

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00941981A Withdrawn EP1203342A2 (de) 1999-06-21 2000-05-24 Verfahren und vorrichtung zur segmentierung einer punkteverteilung

Country Status (5)

Country Link
US (1) US6888965B1 (ja)
EP (1) EP1203342A2 (ja)
JP (1) JP2003502765A (ja)
DE (1) DE19928231C2 (ja)
WO (1) WO2000079471A2 (ja)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0130210D0 (en) * 2001-12-18 2002-02-06 Caladrius Ltd Segmentation of images using the watershed method
US20030123745A1 (en) * 2001-12-28 2003-07-03 Mattias Bryborn Method and apparatus for compression and reconstruction of electronic handwriting
US7554883B2 (en) * 2004-10-11 2009-06-30 Landmark Graphics Corporation Fault filter for seismic discontinuity data
KR100752333B1 (ko) * 2005-01-24 2007-08-28 주식회사 메디슨 3차원 초음파 도플러 이미지의 화질 개선 방법
AT502127B1 (de) * 2005-07-04 2008-10-15 Advanced Comp Vision Gmbh Acv Verfahren zur segmentierung von datenstrukturen
DE102005037367B3 (de) * 2005-08-08 2007-04-05 Siemens Ag Verfahren für eine Röntgeneinrichtung
US20070206844A1 (en) * 2006-03-03 2007-09-06 Fuji Photo Film Co., Ltd. Method and apparatus for breast border detection
ES2303790B1 (es) * 2007-02-14 2009-07-06 Universidad De Castilla-La Mancha Procedimiento de analisis, visualizacion y procesado de imagenes digitales biomedicas.
KR101323439B1 (ko) 2008-11-12 2013-10-29 보드 오브 트러스티스 오브 더 리랜드 스탠포드 주니어 유니버시티 특징 디스크립터를 표현하고 식별하는 방법, 장치 및 컴퓨터 판독가능 저장 매체
RU2542946C2 (ru) * 2009-11-19 2015-02-27 Нокиа Корпорейшн Способ и устройство для отслеживания и распознавания объектов с использованием дескрипторов, инвариантных относительно вращения
US9122956B1 (en) * 2012-11-09 2015-09-01 California Institute Of Technology Automated feature analysis, comparison, and anomaly detection
US10127717B2 (en) 2016-02-16 2018-11-13 Ohzone, Inc. System for 3D Clothing Model Creation
US11615462B2 (en) 2016-02-16 2023-03-28 Ohzone, Inc. System for virtually sharing customized clothing
US10373386B2 (en) 2016-02-16 2019-08-06 Ohzone, Inc. System and method for virtually trying-on clothing
WO2018200348A1 (en) * 2017-04-24 2018-11-01 President And Fellows Of Harvard College Systems and methods for accelerating exploratory statistical analysis

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0627693B1 (fr) * 1993-05-05 2004-11-17 Koninklijke Philips Electronics N.V. Dispositif de segmentation d'images composées de textures
DE4317746A1 (de) * 1993-05-27 1994-12-01 Max Planck Gesellschaft Verfahren und Einrichtung zur Raumfilterung
JP3785700B2 (ja) * 1995-12-18 2006-06-14 ソニー株式会社 近似化方法および装置
US5825909A (en) * 1996-02-29 1998-10-20 Eastman Kodak Company Automated method and system for image segmentation in digital radiographic images
EP0836784B1 (en) * 1996-05-06 2003-09-17 Koninklijke Philips Electronics N.V. Segmented video coding and decoding method and system
DE19633693C1 (de) * 1996-08-21 1997-11-20 Max Planck Gesellschaft Verfahren und Vorrichtung zur Erfassung von Targetmustern in einer Textur
US6192150B1 (en) * 1998-11-16 2001-02-20 National University Of Singapore Invariant texture matching method for image retrieval
US6693962B1 (en) * 1999-02-01 2004-02-17 Thomson Licensing S.A. Process to extract regions of homogeneous texture in a digital picture
KR100308456B1 (ko) * 1999-07-09 2001-11-02 오길록 주파수 공간상에서의 질감표현방법 및 질감기반 검색방법
CA2279797C (en) * 1999-08-06 2010-01-05 Demin Wang A method for temporal interpolation of an image sequence using object-based image analysis
JP2002064825A (ja) * 2000-08-23 2002-02-28 Kddi Research & Development Laboratories Inc 画像の領域分割装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0079471A2 *

Also Published As

Publication number Publication date
DE19928231A1 (de) 2000-12-28
WO2000079471A3 (de) 2001-03-29
WO2000079471A2 (de) 2000-12-28
DE19928231C2 (de) 2003-03-27
US6888965B1 (en) 2005-05-03
JP2003502765A (ja) 2003-01-21

Similar Documents

Publication Publication Date Title
DE60034668T2 (de) Methode zur texturanalyse von digitalen bildern
DE19521346C2 (de) Bilduntersuchungs/-Erkennungsverfahren, darin verwendetes Verfahren zur Erzeugung von Referenzdaten und Vorrichtungen dafür
DE69322095T2 (de) Verfahren und gerät zur identifizierung eines objekts mittels eine geordneten folge von grenz-pixel-parametern
DE69229856T2 (de) Adaptives sichtverfahren und -system
DE68928895T2 (de) Verfahren und Gerät für universelle adaptiv lernende Bildmessung und -erkennung
DE69811049T2 (de) Elektronisches bildverarbeitungsgerät zur detektion von dimensionnellen änderungen
DE60307583T2 (de) Auswertung der Schärfe eines Bildes der Iris eines Auges
DE69805798T2 (de) Fingerabdrukklassifikation mittels raumfrequenzteilen
DE60307967T2 (de) Bildverarbeitungsverfahren für die untersuchung des erscheinungsbildes
EP0780002B1 (de) Verfahren und vorrichtung zur rekonstruktion von in rasterform vorliegenden linienstrukturen
EP2284795A2 (de) Quantitative Analyse, Visualisierung und Bewegungskorrektur in dynamischen Prozessen
WO2000079471A2 (de) Verfahren und vorrichtung zur segmentierung einer punkteverteilung
DE60118606T2 (de) Auswahlverfahren für wimperntusche, auswahlsystem für wimperntusche und gerät für die wimperntuscheberatung
DE102012208625B4 (de) Verfahren und System zur Verarbeitung von MRT-Daten des menschlichen Gehirns
DE19633693C1 (de) Verfahren und Vorrichtung zur Erfassung von Targetmustern in einer Textur
DE60303138T2 (de) Vergleichen von mustern
DE2903625C2 (ja)
WO2005122092A1 (de) Verfahren und vorrichtung zur segmentierung einer digitalen abbildung von zellen
DE60220118T2 (de) Vorrichtung, Verfahren und Programm zum Vergleichen von Mustern
EP1437685A2 (de) Verfahren zum Segmentieren einer dreidimensionalen Struktur
DE102005049017B4 (de) Verfahren zur Segmentierung in einem n-dimensionalen Merkmalsraum und Verfahren zur Klassifikation auf Grundlage von geometrischen Eigenschaften segmentierter Objekte in einem n-dimensionalen Datenraum
DE69606999T2 (de) Verfahren und vorrichtung zur bilderzeugung
DE19754909C2 (de) Verfahren und Vorrichtung zur Erfassung und Bearbeitung von Abbildungen biologischen Gewebes
EP2096578A2 (de) Verfahren und Vorrichtung zur Charakterisierung der Formation von Papier
DE102021133868A1 (de) Mikroskopiesystem und Verfahren zum Modifizieren von Mikroskopbildern im Merkmalsraum eines generativen Netzes

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20020111

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20061201