WO2000004497A1 - Automatic masking of objects in images - Google Patents

Automatic masking of objects in images

Info

Publication number
WO2000004497A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
intensity
threshold
pin
background
Prior art date
Application number
PCT/US1999/015796
Other languages
English (en)
Inventor
Bruce E. DeSimas, II
Jeff A. Levi
Original Assignee
The Perkin-Elmer Corporation Pe Biosystems Division
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Perkin-Elmer Corporation Pe Biosystems Division filed Critical The Perkin-Elmer Corporation Pe Biosystems Division
Priority to CA002336885A priority Critical patent/CA2336885A1/fr
Priority to EP99933959A priority patent/EP1095357B1/fr
Priority to AT99933959T priority patent/ATE217995T1/de
Priority to JP2000560543A priority patent/JP2002520746A/ja
Priority to DE69901565T priority patent/DE69901565T2/de
Priority to AU49898/99A priority patent/AU754884B2/en
Publication of WO2000004497A1 publication Critical patent/WO2000004497A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10064Fluorescence image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30072Microarray; Biochip, DNA array; Well plate

Definitions

  • This invention relates to digital imaging, particularly boundary determination of objects in images, and more particularly such determination in optical analytical instruments.
  • light is received from a plurality of samples to effect separate image areas associated with the samples.
  • the light may be transmitted through the samples, or may be emitted by the samples such as with fluorescence or chemiluminescence samples.
  • An analysis is performed for each of the image areas.
  • Chemiluminescence light is emitted via a chemical reaction and may be detected with a luminometer.
  • With fluorescence, for example in an optical instrument for monitoring polymerase chain reaction production of DNA in a reaction apparatus, fluorescing dye is attached to double-stranded DNA so as to produce fluorescence when excited with high-energy light.
  • the intensities of the objects correspond to the concentration of DNA or other sample material, and the intensities are processed by computer to provide such information.
  • computer programming is needed to effect adaptive masking to accurately identify object areas from the image data.
  • acquisition includes sampling, digitization, storage and (optionally) compression.
  • Manipulation includes enhancement (if required), segmentation and morphology.
  • Extraction involves object representation and/or quantitation for the desired purpose.
  • the present invention is particularly directed to segmentation which preliminarily defines object edges from grey-scale images, and extraction of the objects for higher level quantitation.
  • a histogram is used to analyze intensities.
  • a "histogram” has ordinary meaning herein as a plot (or stored data equivalent) of frequency of each intensity vs. intensity for an image or section thereof.
  • a threshold intensity in a histogram separates the intensity clusters of objects and background. Once thresholds for each pixel are determined, object pixels can be identified and preliminary edges of the objects detected, subject to refinement, such as by morphological operations.
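For concreteness, a minimal C++ sketch of building such a histogram for an 8-bit image section follows (the row-major layout, the names and the signature are illustrative assumptions, not taken from the patent):

```cpp
#include <array>
#include <cstdint>
#include <vector>

// Build the frequency-vs-intensity histogram of an 8-bit grey-scale
// section (x, y, w, h) of an image stored row by row with row width
// 'stride', matching the linear representation described later.
std::array<uint32_t, 256> buildHistogram(const std::vector<uint8_t>& img,
                                         int stride, int x, int y,
                                         int w, int h) {
    std::array<uint32_t, 256> hist{};  // zero-initialized bins
    for (int r = y; r < y + h; ++r)
        for (int c = x; c < x + w; ++c)
            ++hist[img[size_t(r) * stride + c]];
    return hist;
}
```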
  • Thresholding is reviewed, for example, in an article "A Survey of Thresholding Techniques" by P. Sahoo, S. Soltani and A. Wong, Computer Graphics and Image Processing, 41, 233-260 (1988).
  • In one method, histogram data is fitted to curves such as a sum of two Gaussian curves (one for each cluster in a "bimodal" histogram), and a threshold intensity is determined statistically to minimize overlap areas.
  • Another method for computing threshold is iterative selection, taught in an article "Picture Thresholding Using an Iterative Selection Method" by T.W. Ridler and S. Calvard, IEEE Transactions on Systems, Man and Cybernetics, Vol. SMC-8, No. 8, Aug. 1978.
  • Yet another method is taught in an article "A Threshold Selection Method from Gray-Level Histograms" by N. Otsu, IEEE Transactions on Systems, Man and Cybernetics, SMC-9, No. 1, Jan. 1979. This involves a criterion function that is selected to maximize the separation of the two pixel classes to give the best threshold. The function is derived from discriminant analysis.
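As an illustration (a minimal sketch of Otsu's method as it is commonly formulated, not code from the patent), the threshold below is chosen to maximize the between-class variance of the background and object classes of an 8-bit histogram:

```cpp
#include <array>
#include <cstdint>

// Sketch of Otsu's method: pick the threshold t that maximizes the
// between-class variance w_B(t) * w_O(t) * (mu_B(t) - mu_O(t))^2,
// where levels [0..t] are background and (t..255] are object.
int otsuThreshold(const std::array<uint32_t, 256>& hist) {
    uint64_t total = 0, sumAll = 0;
    for (int i = 0; i < 256; ++i) { total += hist[i]; sumAll += uint64_t(i) * hist[i]; }
    uint64_t wB = 0, sumB = 0;
    double bestVar = -1.0;
    int bestT = 0;
    for (int t = 0; t < 256; ++t) {
        wB += hist[t];                          // weight of background class
        if (wB == 0) continue;
        uint64_t wF = total - wB;               // weight of object class
        if (wF == 0) break;
        sumB += uint64_t(t) * hist[t];
        double mB = double(sumB) / wB;          // background mean
        double mF = double(sumAll - sumB) / wF; // object mean
        double var = double(wB) * double(wF) * (mB - mF) * (mB - mF);
        if (var > bestVar) { bestVar = var; bestT = t; }
    }
    return bestT;
}
```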
  • the foregoing thresholding methods have been applied to entire images, to determine a "global threshold" which is adequate when all object and background pixels have well separated grey levels.
  • the global threshold may give poor results by returning too many erroneous object pixels in some regions and missing object pixels in others.
  • Another form of segmentation is edge detection.
  • Several techniques have been used for identifying object boundaries or edges as taught, for example, in an article "A Survey of Edge Detection Techniques" by L. Davis, Computer Graphics and Image Processing, 4, 248-270 (1975).
  • a significant problem is pixel noise in the image, making the edges irregular and blurred.
  • Yet another segmentation technique is region growing, in which an initial "seed" pixel is selected and subsequent pixels are added to the region as they qualify.
  • A variation is "split and merge", where the image is successively split until each region is homogeneous. The image sub-regions are then merged if similar characteristics exist.
  • Here again, noise is a problem.
  • With region growing, extensive computations are used, which impacts performance in both running time and computer memory.
  • a common method for image processing used after segmentation is morphological operations.
  • Morphological operations generally use a structuring element to define the object shape.
  • One such operation is dilation which grows or dilates the target object, and another is erosion which removes pixels from the object. These may be combined in either “opening” which is performed by eroding followed by dilation, or “closing” which is dilation followed by erosion.
  • An objective of the invention is to provide a novel method and a novel means for establishing boundaries of objects in a stored digital image associated with pixels that define an image area.
  • a particular objective is to provide such a method and a means for establishing boundaries automatically.
  • Another objective is to provide such a method and a means for establishing boundaries in an optical instrument for analyzing a plurality of samples such as by fluorescence or chemiluminescence.
  • The intensity distribution for a section of pixels is considered to be bimodal, as representative of both background intensity and object intensity, or unimodal, as representative of only background intensity or object intensity.
  • the image area is divided into nonoverlapping blocks each formed of a plurality of pixels. Identification is made as to whether each block has unimodal intensity or bimodal intensity.
  • a threshold is computed for each block having bimodal intensity, such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity.
  • a block threshold is designated for each block, the block threshold for bimodal intensity being the computed threshold, and for unimodal intensity being an assigned threshold derived from neighboring block thresholds.
  • a pixel threshold is derived for each pixel from block thresholds, preferably by interpolating a pixel threshold for each pixel from block thresholds. From the pixel threshold, each pixel is identified as having either an object intensity or a background intensity.
  • a first value is assigned to each pixel having an object intensity, and a second value to each pixel having a background intensity, thereby converting image objects to binary objects. The binary objects are delineated to thereby establish the boundaries of the image objects.
  • Preferably, the image area as a whole has bimodal intensity.
  • the step of dividing the image area comprises n-fold (advantageously 4-fold) subdividing the image area into non-overlapping n-fold sections, and successively n-fold subdividing such sections a selected number of times until final sections define the blocks.
  • Each set of n-fold sections is considered to be a set of child sections of a parent that itself is a section or is the image area.
  • the step of designating an assigned threshold comprises identifying whether each section has unimodal intensity or bimodal intensity, computing thresholds for the image area and each section having bimodal intensity, and designating each computed threshold as a section threshold.
  • the step of designating an assigned threshold further comprises, for each section having unimodal intensity in a set of n-fold sections, designating an average of the computed thresholds for other sections in the set if said other sections have bimodal intensity, and otherwise designating the threshold intensity of the parent of the set, the section thresholds being designated successively through successive subdividings until a block threshold is designated for each block.
  • a search entity is defined mathematically as a combination of a pin, a rod pivotable on the pin, and a wheel attached to the rod approximately one pixel space from the pin.
  • the pin is constrained to move among object pixels
  • the wheel is constrained to move among background pixels.
  • the step of delineating the binary objects comprises, in sequence, steps of searching, alternately moving the wheel and the pin and, if necessary, skipping. Searching is effected in a first-hand direction from a corner of the image area (e.g. right to left in successive lines starting from the upper right corner) until an initial object pixel is located.
  • the pin is placed on the initial pixel with the wheel in an adjacent background pixel.
  • the wheel is moved in a second-hand direction (e.g. clockwise) opposite from the first-hand direction until the wheel is constrained by an object pixel.
  • the pin is moved in the first-hand direction (e.g. counterclockwise) to a next object pixel and, unless constrained, the pin is further moved until the pin is constrained by a background pixel.
  • Objects are achieved by an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof.
  • the instrument includes a processor for effecting such analysis, with the processor comprising means for effecting the foregoing steps.
  • Objects are further achieved by a computer readable storage medium for utilization with an instrument receptive of light from a plurality of samples to effect separate image objects for analysis thereof.
  • the storage medium contains program code embodied therein so as to be readable by the processor, the program code including means for effecting the foregoing steps.
  • FIG. 1 is a schematic drawing of an instrument incorporating the invention.
  • FIG. 2 is an illustration of a computer screen display of an image area with objects corresponding to a plurality of light emitting samples with respect to the instrument of FIG. 1.
  • FIG. 3 is an idealized histogram of frequency vs. intensity for a section of an image area such as illustrated in FIG. 2, illustrating a threshold between background and object intensities.
  • FIG. 4 is a schematic drawing of 4-fold subdividing of an image area such as in FIG. 2 into blocks.
  • FIG. 5 is a flow chart for determination of block thresholds and pixel thresholds utilizing a processor of the instrument of FIG. 1 and histograms such as in FIG. 3.
  • FIG. 6 is an illustration of a computer screen display of an image area showing block thresholds determined according to the flow chart of FIG. 5 for blocks effected according to the subdividing of FIG. 4.
  • FIG. 7 is a schematic drawing for a procedure of deriving pixel thresholds from block thresholds in an image area according to the flow chart of FIG. 5.
  • FIG. 8 is an illustration of a computer screen display of an image area showing pixel thresholds interpolated from the block thresholds of FIG. 6.
  • FIG. 9 is a flow chart for determination of object boundaries from pixel intensities and the pixel thresholds determined according to the flow chart of FIG. 5.
  • FIG. 10 is a flow chart for a delineation procedure of the flow chart of FIG. 9.
  • FIG. 11 is a schematic drawing of a portion of an image area illustrating a "pin-and-wheel” procedure utilized in the flow chart of FIG. 10.
  • An apparatus 10 and method of the invention are generally useful for establishing boundaries of objects in a stored digital image associated with pixels that define an image area.
  • the images are generally of the type that are acquired by a video camera 12 which has a lens 14 to focus an image onto an array detector 16.
  • the detector may be, for example, a charge injection device (CID) or, preferably, a charge coupled device (CCD).
  • a conventional video camera containing a CCD detector, a lens and associated electronics for the detector should be suitable, such as an Electrim model 1000L which has 751 active pixels horizontal and 242 (non-interlaced) active pixels vertical.
  • This camera includes a circuit board that directly interfaces to a computer ISA bus.
  • Analog/digital (A/D) interfacing 17 and framegrabber circuitry are included on this board but may be provided separately.
  • any other digital imaging device or subsystem may be used or adapted that is capable of taking still or freeze- frame images that are stored in memory 18, e.g. in a linear representation, for processing by a processing unit (CPU) 20 in a computer 22 and display of a processed image or associated information on a monitor 24.
  • the raw image typically is stored in a linear representation where scan lines (rows) of pixels are held sequentially in memory.
  • Image intensity for each pixel typically is stored as a byte or word of 8 or 16 bits respectively.
  • the invention is particularly advantageous with an optical analytical instrument 26 wherein light 28 is received from a plurality of samples 30 to effect separate image areas associated with the samples.
  • the DNA reaction apparatus has vials 32 as sample holders containing light-emitting sample material that effect a pattern of well defined regions 34 in the image area 36 (FIG. 2) in a low intensity background 37.
  • the intensity in each region is processed to provide information on the samples such as concentration of DNA.
  • the present invention utilizing computer programming is useful for automatic adaptive masking to accurately identify object areas from the image data, in which the shapes may be irregular and the shapes and background may have non-uniform illumination.
  • the invention also should be more broadly useful for outlining objects in any image that may include scenery or the like.
  • the computer programming is conventional, such as with the C++ language. Adaptations of the programming for the present invention from flow charts and descriptions herein will readily be recognized and achieved by those skilled in the art.
  • the flow charts represent means and steps for carrying out various aspects of the invention. Details of the computer and the programming are not important to the invention, and any conventional or other desired computer and programming components that would serve the purposes described herein should be deemed equivalent and interchangeable.
  • a conventional or other desired graphics program such as Image Pro Plus™ from Media Cybernetics may be incorporated to display the image if desired.
  • A suitable image file format is TIFF (Tagged Image File Format), Adobe version 6.0.
  • A suitable development software package is Microsoft Visual C++ Developer Studio. Grey-scale 8-bit and 16-bit images should be used without compression (or with lossless compression), as a present goal is to identify object shape based on intensity and contrast.
  • Each pixel has an associated intensity (brightness, represented digitally) corresponding to background or an object in the image. There also may be fluctuations due to noise.
  • a group or section of pixels may be bimodal as representative of both background intensity and object intensity, or unimodal as representative of only background intensity or object intensity.
  • Bimodal intensity of a region may be seen in a grey-scale histogram 38 (FIG. 3) of frequency (number of pixels with a given intensity) against intensity. This histogram essentially is a combination of two bell shaped curves, one for background 40 and the other for an object 42.
  • a threshold 44 is computed from the pixels in a selected region of the image, using a histogram, for use in classifying the intensity for each pixel in the region as either object or background. Any of a variety of methods of computing the threshold may be used.
  • One conventional method is minimizing error of classification which assumes two Gaussian distributions.
  • a simplified form of this, useful for the present case, is to compute the half-way point between the two peaks using peak analysis. After smoothing, all peaks are found in the histogram. Then a two-stage hill-climbing method is used to identify the two major peaks corresponding to the "background" and "object" distributions.
  • the background peak is identified as the highest peak at low intensity, as found by a search of peaks from left to right.
  • the object peak is identified as the highest peak at high intensity, as found by a search of peaks from right to left.
  • an initial histogram size may be 10 bits, with additional bits added if the raw histogram does not contain at least 64 bins.
  • a test for bimodality should be performed, where two different peaks must be found and the valley must be lower than both peaks.
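A compact sketch of such a peak-based rule follows, simplified to a single hill-climb from each end of the (already smoothed) histogram rather than the patent's two-stage method; it returns 0, the "invalid" value, when the bimodality test fails:

```cpp
#include <vector>

// Peak-analysis threshold sketch: background peak found climbing from the
// left (low intensity), object peak climbing from the right (high
// intensity); the valley between them must be lower than both peaks,
// otherwise the section is treated as unimodal (return 0 = "invalid").
int peakHalfwayThreshold(const std::vector<unsigned>& h) {
    const int n = static_cast<int>(h.size());
    int bg = 0;                                  // background peak index
    while (bg + 1 < n && h[bg + 1] >= h[bg]) ++bg;
    int obj = n - 1;                             // object peak index
    while (obj > 0 && h[obj - 1] >= h[obj]) --obj;
    if (obj <= bg) return 0;                     // only one peak: unimodal
    int valley = bg;                             // lowest point between peaks
    for (int i = bg; i <= obj; ++i)
        if (h[i] < h[valley]) valley = i;
    if (h[valley] >= h[bg] || h[valley] >= h[obj]) return 0;  // bimodality test
    return (bg + obj) / 2;                       // half-way point as threshold
}
```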
  • Another suitable method for computing the threshold is iterative selection taught in the aforementioned article by T.W. Ridler and S. Calvard. A preferred method is taught in the aforementioned article by N. Otsu, incorporated herein by reference in its entirety.
  • Unimodal intensity represents either background or an object, and "threshold" is zero representing "invalid". This will exist for a region small enough to encompass only an object or area of interest in a sample pattern, or a background area between objects or along the edge of the image. Procedures for dealing with this are described below.
  • the global (full) image area is sub-divided in a sequence of steps, advantageously utilizing quad-tree procedures (FIG. 4).
  • the image area 36 is fourfold subdivided into non-overlapping, fourfold sections 46 (i.e. with four sections in the image area) which preferably are dimensionally equal. These sections are similarly subdivided into four more sections 48 which may further be subdivided into more fourfold sections 50.
  • the successive fourfold subdividing is effected a selected number of times until a final pattern of sections constitutes blocks of suitable size for analysis. (For clarity, FIG. 4 shows successive child subdivisions of only one parent.)
  • each set of fourfold sections is deemed to be a set of child sections of a parent that is the image area or a section, in each pair of generation levels.
  • Although fourfold subdivisions advantageously provide sufficient detail for the present purpose, others may be used, such as twofold, threefold or eightfold; this is symbolized more broadly herein as "n-fold". Even more broadly, "n" may vary from level to level.
  • An initial threshold is computed 52 (FIG. 5) for each section through the subdivisions including each final block.
  • the computed threshold for unimodal intensity is zero, and for bimodal intensity is nonzero such that intensity on one side of the computed threshold is background intensity and on the other side of the computed threshold is object intensity.
  • the zero or nonzero computation effectively identifies whether each block has unimodal intensity or bimodal intensity. Zero is designated as an invalid threshold and, substituted therefor, a section threshold is derived from neighboring thresholds (siblings). If enough siblings have nonzero thresholds, an average of their thresholds should be suitable; otherwise the threshold of the parent is designated. (Zero may be substituted with another value different from any possible value for the bimodal intensity.)
  • section thresholds are designated successively through the successive subdivided sections until a block threshold is established for each final block. If the computed threshold 55 for a child section 58 being considered is nonzero 54 (bimodal), the computed threshold is designated 56 to be the section threshold 57. If a computed threshold for the child section 58 is zero 60 (unimodal), the sibling sections 66 in its fourfold set are analyzed. If all of the computed thresholds 64 of the siblings are nonzero 68, then the section threshold for the section 58 under consideration is designated 56 to be an average 62 of the computed thresholds for sibling sections in the set. Otherwise 70, the previously-determined section threshold 72 of the parent 74 is designated 56. Although an average may be sufficient if only two of the other sections have nonzero threshold intensity, preferably the average is utilized only if all three of the other sections have nonzero threshold intensity. Similar procedures can be used for other types of "n-fold". A sketch of this designation appears below.
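A sketch of the quad-tree designation under the stated rules (the Rect type and the threshOf callback, which would wrap a bimodality-tested method such as Otsu's and return 0 for unimodal sections, are illustrative assumptions):

```cpp
#include <array>
#include <functional>
#include <vector>

struct Rect { int x, y, w, h; };   // a rectangular section of the image

// Recursively designate section thresholds: a nonzero computed threshold
// is kept; a zero (unimodal) child takes the average of its siblings only
// if all three siblings are bimodal, otherwise it inherits the parent's
// designated threshold. depth == 0 sections are the final blocks, pushed
// into 'blockT' in quad-tree order.
void designate(const std::function<int(Rect)>& threshOf, Rect r,
               int parentT, int depth, std::vector<int>& blockT) {
    const std::array<Rect, 4> kids = {{
        {r.x,           r.y,           r.w / 2,       r.h / 2},
        {r.x + r.w / 2, r.y,           r.w - r.w / 2, r.h / 2},
        {r.x,           r.y + r.h / 2, r.w / 2,       r.h - r.h / 2},
        {r.x + r.w / 2, r.y + r.h / 2, r.w - r.w / 2, r.h - r.h / 2}}};
    std::array<int, 4> t{};
    for (int i = 0; i < 4; ++i) t[i] = threshOf(kids[i]);
    for (int i = 0; i < 4; ++i) {
        int designated = t[i];
        if (designated == 0) {               // unimodal child section
            int sum = 0, nz = 0;
            for (int j = 0; j < 4; ++j)
                if (j != i && t[j] != 0) { sum += t[j]; ++nz; }
            designated = (nz == 3) ? sum / 3 : parentT;
        }
        if (depth == 0) blockT.push_back(designated);
        else designate(threshOf, kids[i], designated, depth - 1, blockT);
    }
}
```

A top-level call might be designate(threshOf, whole, threshOf(whole), levels - 1, blockT), seeding the recursion with the global image threshold as the parent value.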
  • a set of virtual outside pixels 79 is established outside of the boundary. These outside pixels are spaced from their nearest pixels by the same distance as the spacings between neighboring center pixels within the image boundary, and thus lie on an outer perimeter 87 spaced a half block out from the image boundary.
  • the outside pixels are given thresholds by extrapolation 77 from one or more neighboring pixels; for example, an outside pixel 79' may be given the same threshold as its nearest neighboring center pixel.
  • a linear extrapolation is made from two adjacent center pixels 67'.
  • the four corner outside pixels may be extrapolated diagonally.
  • the interpolations for the pixels P' lying outside of the inner perimeter utilize the virtual pixels.
  • FIG. 8 shows a more uniformly varying pattern of pixel thresholds interpolated from the block thresholds of FIG. 6.
  • the thresholds for the outer pixels P' may be determined directly by extrapolation.
  • interpolation includes such extrapolation near the boundary.
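A sketch of the interpolation step, assuming square B x B blocks whose designated thresholds hold at the block centers; for brevity the border handling clamps to the nearest block center, a crude stand-in for the patent's virtual outside pixels and linear extrapolation:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Derive a threshold for every pixel by bilinear interpolation between
// the thresholds of the four nearest block centers. 'blockT' holds the
// bw x bh grid of block thresholds for B x B blocks of a w x h image.
std::vector<float> pixelThresholds(const std::vector<int>& blockT,
                                   int bw, int bh, int B, int w, int h) {
    std::vector<float> pt(size_t(w) * h);
    auto T = [&](int bx, int by) {
        bx = std::clamp(bx, 0, bw - 1);   // clamping stands in for the
        by = std::clamp(by, 0, bh - 1);   // virtual-pixel extrapolation
        return float(blockT[size_t(by) * bw + bx]);
    };
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            // Pixel position measured in units of block centers.
            float gx = (x - B / 2.0f) / B, gy = (y - B / 2.0f) / B;
            int x0 = int(std::floor(gx)), y0 = int(std::floor(gy));
            float fx = gx - x0, fy = gy - y0;
            pt[size_t(y) * w + x] =
                (1 - fx) * (1 - fy) * T(x0, y0) +
                fx * (1 - fy) * T(x0 + 1, y0) +
                (1 - fx) * fy * T(x0, y0 + 1) +
                fx * fy * T(x0 + 1, y0 + 1);
        }
    return pt;
}
```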
  • For each pixel, identification 76 is made as to whether its actual intensity 78 is either object 80 or background 82.
  • a first value 84 (e.g. 1) is assigned 86 to each pixel having an object intensity
  • a second value 88 (e.g. 0) is assigned to each pixel having a background intensity, thereby converting image objects to binary objects.
  • the assigned values are arbitrary. In the case of light emitting samples (FIG. 2), the objects will be brighter than the background. More generally, for other types of images, either could be brighter.
  • the binary objects are then delineated 90, and this delineation establishes the boundaries 92 of the image objects.
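The conversion to binary objects is then a per-pixel comparison; a minimal sketch, assuming bright objects on a dark background as with the light-emitting samples:

```cpp
#include <vector>

// Binarization: pixels at or above their interpolated threshold become
// object (first value, 1), others background (second value, 0).
std::vector<unsigned char> binarize(const std::vector<unsigned char>& img,
                                    const std::vector<float>& pixelT) {
    std::vector<unsigned char> bin(img.size());
    for (size_t i = 0; i < img.size(); ++i)
        bin[i] = (img[i] >= pixelT[i]) ? 1 : 0;   // 1 = object, 0 = background
    return bin;
}
```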
  • the global image area should be bimodal for the foregoing procedures, although in some circumstances this may not be necessary.
  • the image intensities should have sufficiently smooth gradients across the blocks in the image to support the assumption that the center pixel in each block has a threshold equal to the block threshold. For these reasons, at least where the procedures are used for a fixed pattern of objects as with an instrument with light emitting areas, the edges of the fixed objects may be determined only once with a suitable image area. Once these are determined, all subsequent measurements may be made from the same delineated areas. Alternatively, if there is the possibility of area drift, a model pattern may be provided for periodic use in delineating the object areas.
  • an initial neighbor evaluation is made to label every object pixel having at least one neighboring background pixel.
  • the neighbors may be 4-connected boundary pixels of the object (4 neighbors excluding diagonals) or, preferably for reduced ambiguities, 8-connected (all 8 neighbors including diagonals).
  • each such object pixel is assigned a third value (e.g. 2), thereby creating associated tertiary pixels that identify boundary objects.
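A sketch of this 8-connected neighbor evaluation on the binary image (using the values 1 = object, 0 = background and 2 = tertiary edge label; pixels beyond the image edge are treated as background):

```cpp
#include <vector>

// Relabel every object pixel (1) that has at least one 8-connected
// background neighbor (0) with the tertiary value 2, marking it as a
// preliminary edge pixel.
void labelEdges(std::vector<unsigned char>& bin, int w, int h) {
    static const int dx[8] = {-1, 0, 1, -1, 1, -1, 0, 1};
    static const int dy[8] = {-1, -1, -1, 0, 0, 1, 1, 1};
    std::vector<unsigned char> out = bin;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            if (bin[size_t(y) * w + x] != 1) continue;
            for (int k = 0; k < 8; ++k) {
                int nx = x + dx[k], ny = y + dy[k];
                // Pixels outside the image count as background.
                bool bg = nx < 0 || ny < 0 || nx >= w || ny >= h ||
                          bin[size_t(ny) * w + nx] == 0;
                if (bg) { out[size_t(y) * w + x] = 2; break; }
            }
        }
    bin.swap(out);
}
```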
  • Another technique for delineation is a "pin-and-wheel" search described below.
  • Advantageously both are used consecutively, with the 8-connected neighbor evaluation 91 (FIG. 10) to preliminarily label boundary or edge objects 93 for a purpose of initial searching for starting points, followed by the pin-and-wheel to refine and record the boundaries.
  • an initial scanning search 104 (FIGS. 10 & 11) is made beginning in the upper right corner 106 of the image area, traversing leftward line by line until an initial object pixel 108 is located.
  • If a neighbor evaluation 91 has preliminarily labeled edge (tertiary) objects 93, only the latter are searched for. The reason is that such a search more readily allows subsequent searches for an initial object pixel for other objects, without sorting out pixels of objects already delineated by the pin-and-wheel technique.
  • each pixel is considered to be either a background pixel 102 or an object pixel 103.
  • An object (non-background) pixel may have either the ordinary binary value (e.g. 1) or may be an object pixel labeled as a preliminary edge pixel 93 having a tertiary value (e.g. 2).
  • a search entity is defined mathematically as a combination of a pin 94, a rod 96 pivotable on the pin, and a wheel 98 attached to the rod approximately one pixel space from the pin. The pin is constrained to move only among object pixels, and the wheel is constrained to move only among background pixels.
  • the pin is placed 110 on the initial pixel with the wheel in an adjacent background pixel 112, preferably above the pin for a right-to-left search, so as to ensure that the wheel is outside of the object.
  • the wheel is moved 116 in a clockwise direction until the wheel is constrained by an object pixel 118.
  • the pin is moved 120 counterclockwise to a next object pixel 122.
  • In a sort of walking motion (effected virtually by the computer), the steps of moving the pin across object pixels are repeated until the pin is constrained 124 by a background pixel, and then the wheel is moved 116 again.
  • the pin and the wheel may both be found to be constrained simultaneously 126 by an adjacent object pixel 128 and an adjacent background pixel 130 respectively. In such case, the wheel is skipped 132 over the adjacent object pixel to a next background pixel 130, and the alternating moving of the pin and the wheel is resumed 116, 120.
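The pin-and-wheel walk is closely related to classical Moore boundary following; the sketch below is in that style and does not reproduce the patent's exact constraint-and-skip bookkeeping. The caller supplies the initial pin from the scanning search and an adjacent background pixel for the wheel:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Boundary following in the spirit of the pin-and-wheel walk: the pin sits
// on an object pixel, the wheel on an adjacent background pixel, and the
// wheel rotates clockwise about the pin until constrained by an object
// pixel, which becomes the next pin position. This sketch records boundary
// coordinates rather than writing the fourth value, and uses the simple
// "returned to start" stopping rule with a safety bound.
std::vector<std::pair<int, int>> traceBoundary(
        const std::vector<unsigned char>& bin, int w, int h,
        int pinX, int pinY, int wheelX, int wheelY) {
    // The 8 neighbor offsets in clockwise order, starting at "north".
    static const int dx[8] = {0, 1, 1, 1, 0, -1, -1, -1};
    static const int dy[8] = {-1, -1, 0, 1, 1, 1, 0, -1};
    auto isObject = [&](int x, int y) {
        return x >= 0 && y >= 0 && x < w && y < h &&
               bin[std::size_t(y) * w + x] != 0;
    };
    std::vector<std::pair<int, int>> boundary{{pinX, pinY}};
    const int startX = pinX, startY = pinY;
    int dir = 0;  // wheel's position index relative to the pin
    for (int k = 0; k < 8; ++k)
        if (pinX + dx[k] == wheelX && pinY + dy[k] == wheelY) dir = k;
    std::size_t guard = std::size_t(4) * w * h;  // safety bound for the sketch
    do {
        int k = 0;
        for (; k < 8; ++k) {  // rotate the wheel clockwise around the pin
            int d = (dir + 1 + k) % 8;
            if (isObject(pinX + dx[d], pinY + dy[d])) {
                int prev = (d + 7) % 8;            // last background cell seen
                wheelX = pinX + dx[prev]; wheelY = pinY + dy[prev];
                pinX += dx[d]; pinY += dy[d];      // advance the pin
                break;
            }
        }
        if (k == 8) break;  // isolated single-pixel object: no object neighbor
        for (int j = 0; j < 8; ++j)  // re-derive the wheel index at the new pin
            if (pinX + dx[j] == wheelX && pinY + dy[j] == wheelY) dir = j;
        boundary.push_back({pinX, pinY});
    } while ((pinX != startX || pinY != startY) && --guard);
    return boundary;
}
```

In the patent's terms, the recorded pixels would then be given the fourth value (e.g. 3) to mark actual boundary pixels.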
  • These boundary pixels have their locations stored for future use in analyzing the objects, for example analyzing intensity of fluorescent samples.
  • the boundary pixels advantageously are given a fourth value (e.g. 3) replacing the original preliminary (tertiary) values which otherwise may be ignored or returned to the binary value (e.g. 1).
  • the boundary pixels thereby delineate a binary object and consequently an actual object.
  • the scanning search 104' is resumed to find an initial pixel for another object.
  • the search is made easier by searching only for preliminary (tertiary) edge pixels that have not already been identified as actual boundary pixels by the pin-and-wheel.
  • the steps of searching and applying pin and wheel are further repeated 136 to identify other binary objects until no further initial pixel is located.
  • the tertiary and fourth values may be returned to the binary value (e.g. 1). Alternatively the fourth values may be retained to allow display of the object boundaries.
  • the scanning search for an initial pixel for each object should be a grid search, preferably beginning in any corner and going from right to left, left to right, etc.
  • the wheel motion may be clockwise or counterclockwise.
  • the pin motion must be opposite from the wheel motion to achieve the "walk".
  • the wheel motion should be "opposite" from the search direction, meaning that if the search is in rows from right to left from the upper right (like an initial counterclockwise motion), then the wheel should be moved clockwise, and vice versa.
  • The term "first-hand direction" refers to the initial searching direction, which may be right to left in successive lines starting from the upper right, or left to right in successive lines starting from the lower left, or top to bottom in successive columns starting from the upper left, or bottom to top starting from the lower right; i.e. "counterclockwise" for the initial line or column search.
  • Alternatively, the first-hand direction may initially be "clockwise", e.g. left to right from the upper left.
  • the pin motion should always be this same "first-hand direction", i.e. counterclockwise if the search starts "counterclockwise", or clockwise if the search starts "clockwise".
  • The term "second-hand direction" for the wheel means opposite from the first-hand direction, i.e. clockwise if the initial search and wheel motion are counterclockwise, and vice versa.
  • Although it is preferable for the search to begin at a corner, other search patterns may be suitable, in which case the term "first-hand direction" should be adapted accordingly and deemed equivalent and interchangeable with the foregoing patterns.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The method of the invention determines object boundaries in a digital image area, such as those for light-emitting samples placed in an optical analytical instrument, by subdividing the image area into blocks of pixels, determining for each block thresholds between background and objects, and determining pixel thresholds by interpolation of the block thresholds; then assigning to each pixel, from its intensity and threshold, a bimodal intensity representing either object or background, and delineating the bimodal objects so as to determine object boundaries. The invention also concerns a "pin and wheel" process for the delineation.
PCT/US1999/015796 1998-07-14 1999-07-13 Automatic masking of objects in images WO2000004497A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CA002336885A CA2336885A1 (fr) 1998-07-14 1999-07-13 Automatic masking of objects in images
EP99933959A EP1095357B1 (fr) 1998-07-14 1999-07-13 Automatic masking of objects in images
AT99933959T ATE217995T1 (de) 1998-07-14 1999-07-13 Automatic masking of objects in images
JP2000560543A JP2002520746A (ja) 1998-07-14 1999-07-13 Automatic masking of objects in images
DE69901565T DE69901565T2 (de) 1998-07-14 1999-07-13 Automatic masking of objects in images
AU49898/99A AU754884B2 (en) 1998-07-14 1999-07-13 Automatic masking of objects in images

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US9278598P 1998-07-14 1998-07-14
US09/351,660 1999-07-13
US60/092,785 1999-07-13

Publications (1)

Publication Number Publication Date
WO2000004497A1 (fr) 2000-01-27

Family

ID=22235150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/015796 WO2000004497A1 (fr) 1998-07-14 1999-07-13 Automatic masking of objects in images

Country Status (1)

Country Link
WO (1) WO2000004497A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5119439A (en) * 1990-02-06 1992-06-02 Dainippon Screen Mfg. Co., Ltd. Method of and apparatus for extracting image contour data
WO1997043732A1 * 1996-05-10 1997-11-20 Oncometrics Imaging Corp. Method and apparatus for automatically detecting malignancy-related changes
US5901245A (en) * 1997-01-23 1999-05-04 Eastman Kodak Company Method and system for detection and characterization of open space in digital images

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1617207A3 * 1999-07-21 2006-05-31 Applera Corporation Luminescence detection workstation
WO2003012742A2 * 2001-07-27 2003-02-13 3M Innovative Properties Company Autothresholding of noisy images
WO2003012742A3 * 2001-07-27 2004-02-19 3M Innovative Properties Co Autothresholding of noisy images
US6961476B2 (en) 2001-07-27 2005-11-01 3M Innovative Properties Company Autothresholding of noisy images
WO2007105107A2 * 2006-03-14 2007-09-20 Agency For Science, Technology And Research Methods, apparatus and computer readable medium for image segmentation
WO2007105107A3 * 2006-03-14 2008-01-24 Agency Science Tech & Res Methods, apparatus and computer readable medium for image segmentation
WO2017158560A1 * 2016-03-18 2017-09-21 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image
CN109478230A (zh) 2016-03-18 2019-03-15 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image
US11599738B2 (en) 2016-03-18 2023-03-07 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image

Similar Documents

Publication Publication Date Title
US5566249A (en) Apparatus for detecting bubbles in coverslip adhesive
CN111445478A (zh) Automatic detection system and detection method for intracranial aneurysm regions in CTA images
Kittler et al. Threshold selection based on a simple image statistic
JP3296494B2 (ja) Adaptive display system
US5982916A (en) Method and apparatus for automatically locating a region of interest in a radiograph
JPH07504531A (ja) Method for identifying and characterizing valid objects by color
Spirkovska A summary of image segmentation techniques
JP3296493B2 (ja) Method for determining interior points of an object in a background
Naufal et al. Preprocessed mask RCNN for parking space detection in smart parking systems
JP3490482B2 (ja) Edge and contour extraction device
US7609887B2 (en) System and method for toboggan-based object segmentation using distance transform
EP0681722A1 (fr) Methods for determining the exterior points of an object in a background
US7565009B2 (en) System and method for dynamic fast tobogganing
CN111539980A (zh) Multi-target tracking method based on visible light
WO2001008098A1 (fr) Object extraction in images
EP1095357B1 (fr) Automatic masking of objects in images
WO2000004497A1 (fr) Automatic masking of objects in images
Wilkinson Automated and manual segmentation techniques in image analysis of microbes
Dinç et al. Super-thresholding: Supervised thresholding of protein crystal images
Bhardwaj et al. An imaging approach for the automatic thresholding of photo defects
CN113420636A (zh) Nematode identification method based on deep learning and threshold segmentation
Shridhar et al. License plate recognition using SKIPSM
Tsai A new approach for image thresholding under uneven lighting conditions
JP3539581B2 (ja) Particle image analysis method
Durak et al. Automated Coronal-Loop Detection based on Contour Extraction and Contour Classification from the SOHO/EIT Images

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2336885

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 49898/99

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 1999933959

Country of ref document: EP

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 560543

Kind code of ref document: A

Format of ref document f/p: F

WWP Wipo information: published in national office

Ref document number: 1999933959

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 1999933959

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 49898/99

Country of ref document: AU