US6504953B1 - Method for the automatic removal of image errors - Google Patents

Method for the automatic removal of image errors

Info

Publication number
US6504953B1
US6504953B1 (application US09/376,476)
Authority
US
United States
Prior art keywords
mask
contour
error
contours
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US09/376,476
Inventor
Rolf Behrends
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heidelberger Druckmaschinen AG
Original Assignee
Heidelberger Druckmaschinen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to DE19842572A priority Critical patent/DE19842572B4/en
Application filed by Heidelberger Druckmaschinen AG filed Critical Heidelberger Druckmaschinen AG
Priority to US09/376,476 priority patent/US6504953B1/en
Assigned to HEIDELBERGER DRUCKMASCHINEN AG reassignment HEIDELBERGER DRUCKMASCHINEN AG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BEHRENDS, ROLF
Application granted granted Critical
Publication of US6504953B1 publication Critical patent/US6504953B1/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

In a method for the automatic removal of image errors in digital image data, for example errors that have arisen due to scratches, hairs, etc., a contour mask is generated with a contour filter, and a color mask is additionally generated that covers the image areas having the typical color of an image error. By logically combining the contour mask and the color mask, an error mask arises that is automatically corrected and is manually edited as warranted. The remaining contours of the error mask are vectorized. An automatic bleeding retouch is then implemented along the vectorized contours.

Description

BACKGROUND OF THE INVENTION
The invention is in the field of electronic reproduction technology and is directed to a method for the automatic removal of errors in a digitally stored image.
In reproduction technology, print masters for printing pages are produced that contain all page elements such as texts, graphics and images to be printed. In the case of electronic production of the print masters, these elements are present in the form of digital data. For example, the data are generated for an image in that the image master is scanned point-by-point and line-by-line in a scanner, each picture element is resolved into color components, and the color values of these components are digitized. Dependent on the output process employed later, for example output on a color printer or printing in a conventional printing press, the data for the page elements are generated and stored in the color components red, green and blue (RGB) or in the inks of four-color printing cyan, magenta, yellow and black (CMYK).
During the course of further work, the digitized images together with the texts and graphics are electronically mounted at a computer work station under visual control, being mounted on a color monitor or automatically according to stored layout rules. The finished printing page is thereby converted into a data format suitable for the output and is stored. The printing page data for each of the inks (RGB or, respectively, CMYK) are referred to as color separation data. Printing plates for a conventional printing press are manufactured with the color separation data, or they are transmitted directly to a color printer or to a digital printing press and are printed out thereat.
It occurs that the image data of the scanned images exhibit faulty locations such as, for example, scratches in the original master or error locations that arise due to hairs, fibers, etc. that adhere to the image master during the scan process. Such errors must be corrected in the reproduction before the images can be printed. For that purpose, various electronic retouch methods are utilized in the prior art.
A known retouch method is copying retouch as disclosed by European Letters Patent 0 111 026. It is used in order to transfer information of an image area onto another image area picture element by picture element. Error locations in the image can thus be eliminated in that picture elements from a neighboring image area having similar color and structure are copied into the damaged image area. For implementing the retouch, the operator simultaneously moves a read mark and a write mark with a computer mouse at a computer workstation in which the image to be retouched is stored, said read mark and write mark being mixed on the picture screen. The read mark indicates a read area of the image, and the write mark indicates a write area in which the error location is located. The picture elements situated under the read mark in the stored image are thereby continuously transferred into the corresponding picture elements under the write mark.
Another method suitable for the elimination of error locations is known under the name of “bleeding retouch”. Two marks mixed into the picture screen are thereby likewise moved with the computer mouse by the operator. One mark is located at one side of the error location, for example one side of a scratch, and the other mark is located at the opposite side of the error location. Two picture elements are read from the stored image that lie at the locations identified by the marks. Intermediate values are interpolated from the color separation values of the picture elements that are being read for all picture elements that lie on the connecting line between the two marks, and these intermediate values are written into the picture elements on the connecting line. In this way, the error location is covered with color values that are adapted to the colors in the neighborhood of the error location.
European Letters Patent 0 768 621 discloses a partially automated method for eliminating error locations. First, the operator manually identifies the error locations as a bit map in a mask memory. The picture elements of the error locations identified in the bit map are then corrected with a specific window operator. The window operator places line segments over every picture element of the error location from different angles, whereby the outer ends of the line segments project into non-defective neighboring regions of the error location. The color values of the non-defective picture elements are compared along each line segment to color values that were acquired from an interpolation model. The line segment that exhibits the fewest interpolation errors is then selected. Finally, the defective pixel of the error location is interpolated along the selected line segment according to the interpolation model.
In European Letters Patent 0 614 154, the operator places a narrow limiting mask over an error location and the picture elements within the mask are colored white. The continuous color values in a processing area around the limiting mask are converted into a sequence of binary images, in that the color values are compared to thresholds that increase step-by-step. The contour of each binary image is diminished by an edge width that corresponds to the width of the limiting mask. The shape of the limiting mask is thereby eliminated. Subsequently, the modified shape of each binary image is again enlarged by the same edge width, and the sequence of binary images is converted back into an image having continuous color values. The error location is then eliminated from this image.
Methods known from the prior art for removing image errors have the disadvantage that an operator must first seek the position of the error locations in the image and identify them, this being extremely time-consuming. Moreover, error locations can thereby be overlooked by the operator. Subsequently, the operator must manually correct the error locations by interactive retouch in most methods, in that he moves marks mixed in on the picture screen over the error locations. This requires additional time expenditure.
SUMMARY OF THE INVENTION
It is therefore an object of the present invention to avoid the disadvantages of the known methods for removing image errors in digital image data and to specify a new, improved method that enables an automatic removal of error locations. Error locations can thus be removed from an image significantly faster than is possible with the known methods.
According to the method of the invention for automatic removal of image errors in digital image data, a contour mask is produced with a contour filter and a color mask is generated. An error mask is produced by a logical AND operation of the contour mask and the color mask. Contours in the error mask are pared down and vectorized. An automatic bleeding retouch of the digital image data is implemented over the vectorized contours of the error mask.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flow chart for the inventive sequence of processing steps;
FIG. 2 is an example of an image having error locations;
FIG. 3 is an example of a contour filter;
FIG. 4 shows an example of a contour mask;
FIG. 5 shows an example of a color mask;
FIG. 6 is an example of an error mask;
FIG. 7 illustrates the over-filling of a contour;
FIG. 8 shows the saving and vectorizing of a contour; and
FIG. 9 illustrates the automatic bleeding retouch over a vectorized contour.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the inventive method of removing image errors, the error locations are found and eliminated by a sequence of automatically executed processing steps, whereby the operator has essentially only a monitoring function.
FIG. 1 shows the sequence of the steps as a flow chart. In the first step 1, a contour mask is generated as bit map, i.e. an image having only two brightness values, black and white. In the contour mask, the black picture elements indicate the locations at which contours are present in the original image.
FIG. 2 shows an example of an image that contains two error locations (8). Since such error locations usually contrast clearly with the background, they are acquired by the contour mask together with the other contours of the image. The contour mask is generated in that a digital contour filter is applied to an image component that reproduces the contours of the image as clearly as possible. For that purpose, a luminance component is preferably employed that, for example, can be acquired by the transformation of the image data into the CIELAB color system (CIE = Commission Internationale de l'Éclairage). A luminance component, however, can also be acquired by a weighted addition of the color separation components (RGB or, respectively, CMYK). By way of substitute, an individual color component, for example the magenta component, can also be employed as a luminance component. A filter having a high-pass characteristic is employed as the contour filter, i.e. a filter that generates a high output value at contours and a low output value in image areas having little detail.
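For illustration, a minimal Python/NumPy sketch of the weighted-addition variant follows. The patent does not prescribe particular weights; the Rec. 601 luma weights and the function name are assumptions for the example, not part of the disclosure.

```python
import numpy as np

def luminance_from_rgb(rgb):
    """Weighted addition of the R, G and B separations into one luminance
    component.  The weights are illustrative (Rec. 601); the method only
    requires some component that reproduces the contours clearly."""
    rgb = rgb.astype(float)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
```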
As an example, FIG. 3 shows a simple digital contour filter (9) with a filter window that extends over 3×3 pixels. The circled point P references the position of the current pixel. The values h_ij at each position of the filter window are the filter coefficients. The filtering is implemented in that the point P of the filter window is placed over each pixel of the luminance component, and the pixel values L_ij lying under the respective window positions are multiplied by the coefficients h_ij and added up. The filter value F for each pixel thus derives as:
F = Σ (h_ij × L_ij)  (1)
For the present invention, the shape of the filter window and the exact values of the coefficients of the contour filter shown in FIG. 3 are not critical. Filter windows having more than 3×3 pixels and with different values for the coefficients can also be employed. The only important thing is that it is mainly the contours in the image that are emphasized by the filtering.
The contour mask is acquired as a bit map from the filtered luminance image in that the filter values F are compared to thresholds. Since the contour filter, dependent on the direction of the luminance discontinuity at a contour, generates positive and negative output values, it is expedient to define only an upper threshold S1 with which the luminance discontinuities from bright to dark in the contour mask are acquired. Since, moreover, the error locations are extremely dark, it is also advantageous to place the threshold S1 relatively high so that it is mainly the error locations that are acquired and less the natural contours of the image. The bit map of the contour mask is then produced in that filter values F that lie above the threshold S1 are converted into the binary value 1 and filter values that lie below the threshold S1 are converted into the binary value 0.
Bitmap = 1 for F > S1;  Bitmap = 0 for F ≤ S1  (2)
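A minimal Python/NumPy sketch of formulas (1) and (2) follows, assuming a generic 3×3 high-pass kernel (the exact window size and coefficients are left open by the text) and a caller-supplied threshold S1; the kernel values and function name are illustrative only.

```python
import numpy as np

# Illustrative 3x3 high-pass kernel; the exact coefficients are not critical.
H = np.array([[-1.0, -1.0, -1.0],
              [-1.0,  8.0, -1.0],
              [-1.0, -1.0, -1.0]])

def contour_mask(luminance, s1):
    """Evaluate formula (1) at every pixel and binarize per formula (2):
    bitmap = 1 where F > S1, else 0."""
    lum = luminance.astype(float)
    padded = np.pad(lum, 1, mode='edge')
    f = np.zeros_like(lum)
    for dy in range(3):                       # slide the 3x3 window
        for dx in range(3):
            f += H[dy, dx] * padded[dy:dy + lum.shape[0], dx:dx + lum.shape[1]]
    return (f > s1).astype(np.uint8)
```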
FIG. 4 shows the contour mask acquired in this way for the exemplary image of FIG. 2. One can see that the contour mask contains both a part of the natural image contours and the contours of the error locations 8.
In the next processing step (Step 2 in FIG. 1), a color mask is produced from the original image as a bit map in which dark colors are identified as black points. This is based on the observation that typical error locations in the image, such as scratches and hairs, are extremely dark. The color mask is acquired, for example, from the luminance component that already served the purpose of producing the contour mask, in that the picture elements of the luminance component are compared to a threshold. Alternatively, a separate threshold can be defined for each color separation component. Those picture elements are then identified in the color mask for which all color separation components are darker than the respective thresholds. Instead of being acquired with a simple threshold decision, the color mask, however, can also be acquired with a method for selective color recognition. For example, all picture elements are identified in the color mask that lie in a small volume around a typical color of an error location in the RGB system, CMYK system, LAB system or in some other color system. The shape of the volume can, for example, be a ball or an ellipsoid.
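Both variants can be sketched as follows (Python/NumPy, illustrative names). The threshold variant assumes RGB-like separations in which a smaller value means darker; for CMYK data the comparison would be inverted. The ellipsoidal volume of the selective-color variant is described by an assumed center color and semi-axes.

```python
import numpy as np

def color_mask_by_threshold(separations, thresholds):
    """Mark pixels whose color separations are all darker than their
    thresholds (assumes RGB-like data where a smaller value = darker)."""
    return np.all(separations < np.asarray(thresholds), axis=-1).astype(np.uint8)

def color_mask_selective(separations, center, radii):
    """Selective color recognition: mark pixels lying inside an ellipsoidal
    volume around a typical error color `center` with semi-axes `radii`."""
    d = (separations.astype(float) - np.asarray(center, dtype=float)) \
        / np.asarray(radii, dtype=float)
    return (np.sum(d * d, axis=-1) <= 1.0).astype(np.uint8)
```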
FIG. 5 shows the color mask for the exemplary image of FIG. 2. The normal case was thereby assumed that the error locations in the image are darker than the regions around most natural contours of the image, so that the color mask contains regions around the error locations 10 and only a few regions around the natural contours. In this example, there is a region 11 in the tennis racket that is additionally acquired by the color mask.
In the next processing step (Step 3 in FIG. 1), an error mask is generated in that the contour mask and the color mask are combined with one another by the logical AND function. As a result thereof, nearly all natural contours of the image that are still contained in the contour mask are suppressed. FIG. 6 shows the result, in which only the error locations 8 and a few contours in the tennis racket 12 are contained.
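The combination itself is a plain bitwise AND of the two bit maps; a one-function sketch (illustrative name):

```python
import numpy as np

def combine_masks(contour_bitmap, color_bitmap):
    """Error mask = logical AND of contour mask and color mask (0/1 bit maps)."""
    return (contour_bitmap.astype(bool) & color_bitmap.astype(bool)).astype(np.uint8)
```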
In the next processing step (Step 4 in FIG. 1), the areas remaining in the error mask are over-filled. For that purpose, a frame having a specific width is placed around the contours of the error mask, this width being dependent on the resolution of the image. The frame width preferably amounts to a few pixels. During the over-filling, a check is carried out to see whether the frame touches other existing pixels of the error mask. When this is the case, the parts of the over-filled error mask that touch are eliminated. The processing step 4 has the job of eliminating regions from the error mask wherein natural contours lie in close proximity to one another in a dark part of the image, for example in the region wherein real hair is imaged.
FIG. 7 schematically shows the over-filling of a contour 13. At an arbitrary starting point A, one begins to generate a frame 14 having a defined width around the contour 13. For this purpose, an arbitrary algorithm for frame formation that is known and described in the technical literature is employed. The exact algorithm for producing the frame is not critical for the inventive sequence of the processing steps. When the frame touches the contour 13 at any point B, the contour including the sub-frame already generated is deleted. When the frame touches another neighboring contour, both contours are deleted. Only when the frame generation is possible without touching around the entire contour back to the starting point A is the contour preserved. The contour 13 and the frame 14 have been shown with different hatching in FIG. 7 so that they can be distinguished from one another. Both are written as black pixels in the bit map of the error mask.
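Since the patent leaves the frame-formation algorithm open, the following sketch approximates Step 4 with connected-component labeling and pixel-wise dilation: the frame of each contour is taken as the ring obtained by growing the contour a few pixels, and a contour is deleted whenever its ring touches pixels of another contour. The function names, the 8-neighborhood and the default frame width are assumptions, not the frame-tracing procedure of FIG. 7.

```python
import numpy as np
from collections import deque

def _label(mask):
    """4-connected component labeling of a boolean mask via flood fill."""
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue
        n += 1
        labels[y, x] = n
        queue = deque([(y, x)])
        while queue:
            cy, cx = queue.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = n
                    queue.append((ny, nx))
    return labels, n

def _dilate(mask, iterations):
    """Grow a boolean mask by one pixel per iteration (8-neighborhood)."""
    out = mask.copy()
    for _ in range(iterations):
        grown = out.copy()
        grown[1:, :] |= out[:-1, :]
        grown[:-1, :] |= out[1:, :]
        grown[:, 1:] |= out[:, :-1]
        grown[:, :-1] |= out[:, 1:]
        grown[1:, 1:] |= out[:-1, :-1]
        grown[:-1, :-1] |= out[1:, 1:]
        grown[1:, :-1] |= out[:-1, 1:]
        grown[:-1, 1:] |= out[1:, :-1]
        out = grown
    return out

def overfill_and_prune(error_bitmap, frame_width=3):
    """Delete every contour whose frame (a ring frame_width pixels wide
    around it) touches pixels belonging to another contour."""
    mask = error_bitmap.astype(bool)
    labels, n = _label(mask)
    keep = np.zeros(mask.shape, dtype=bool)
    for i in range(1, n + 1):
        comp = labels == i
        ring = _dilate(comp, frame_width) & ~comp
        if not np.any(mask & ring & (labels != i)):  # frame stays clear of other contours
            keep |= comp
    return keep.astype(np.uint8)
```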
It is assumed in the example of FIG. 6 that the remaining natural contours in the tennis racket 12 can be deleted from the error mask with the over-filling and elimination of contours that touch. As a result, only the contours of the error locations 8 are then contained in the error mask.
For monitoring by the operator, the error mask is displayed on the picture screen in the next processing step (Step 5 in FIG. 1), for example in that it is superimposed on the image in a transparent color. When remaining contours are still contained in the error mask that do not belong to error locations, the operator can manually delete these with a brush retouch function at this point of the processing sequence. The operator can likewise add error locations that were possibly not recognized by the automatic processing with the brush retouch function in the error mask.
In the next processing step (Step 6 in FIG. 1), the contours in the error mask are pared down in that pixels are removed from the outer edge of the contours until a contour having a width of only one pixel remains. This pared down contour is then vectorized in that linear segments of the pared down contour are respectively converted into a vector. Any desired algorithm that is known and described in the technical literature can be employed for the sub-processes of paring and vectorizing. The exact algorithms are not critical for the inventive sequence of the processing steps.
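Because any published thinning algorithm may be used for the paring down, the sketch below starts from a contour that has already been reduced to a width of one pixel and only illustrates the vectorizing: the skeleton is traced from an end point and collinear runs of pixels are merged into vectors. It assumes a single, non-branching curve, as is typical for a scratch or hair; names are illustrative.

```python
import numpy as np

def vectorize_skeleton(skeleton):
    """Trace a one-pixel-wide contour (0/1 bit map) into straight vectors,
    each returned as a ((y0, x0), (y1, x1)) pair of end points."""
    pts = {tuple(p) for p in np.argwhere(skeleton)}
    if not pts:
        return []

    def neighbours(p):
        return [(p[0] + dy, p[1] + dx)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0) and (p[0] + dy, p[1] + dx) in pts]

    ends = [p for p in pts if len(neighbours(p)) == 1]
    start = ends[0] if ends else next(iter(pts))      # fall back: closed curve
    path, seen = [start], {start}
    while True:
        unvisited = [q for q in neighbours(path[-1]) if q not in seen]
        if not unvisited:
            break
        path.append(unvisited[0])
        seen.add(unvisited[0])

    vectors, seg_start = [], path[0]
    for i in range(1, len(path) - 1):
        d1 = (path[i][0] - seg_start[0], path[i][1] - seg_start[1])
        d2 = (path[i + 1][0] - path[i][0], path[i + 1][1] - path[i][1])
        if d1[0] * d2[1] - d1[1] * d2[0] != 0:        # direction changes: close vector
            vectors.append((seg_start, path[i]))
            seg_start = path[i]
    vectors.append((seg_start, path[-1]))
    return vectors
```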
FIG. 8 shows a magnified contour of an error location 15 from the error mask, the contour (16) pared down to a width of one pixel, and the vectorized contour 17.
In the last processing step (Step 7 in FIG. 1), a bleeding retouch is automatically implemented in the original image along the vectorized contours 17 of the error mask, and the error locations are thus removed.
FIG. 9 shows the bleeding retouch. A line segment 18 is formed approximately perpendicularly relative to the respective vector for each pixel that lies on the vectorized contour 17. Two picture elements that lie at the ends C and D of the line segment 18 are read from the stored image. Corrected color separation values are interpolated from the color separation values of the picture elements that have been read for all picture elements that lie on the line segment 18 between the two end points C and D and are written into the picture elements on the line segment. In this way, the error location is covered with color values that are matched to the colors in the proximity of the error location. The length of the line segment 18 is expediently selected such that the end points C and D lie outside the typical width of error locations such as scratches and hairs.
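A minimal sketch of the automatic bleeding retouch follows, driven by vectors such as those produced above. For each sampled point on a vector it lays a segment roughly perpendicular to the vector, reads the two end pixels C and D and overwrites the pixels in between with linearly interpolated values, separation by separation. The half-length parameter, the sampling step and the nearest-neighbor rounding are assumptions; segments that leave the image are skipped.

```python
import numpy as np

def bleeding_retouch(image, vectors, half_len=5):
    """Overwrite the pixels on a segment perpendicular to each vector with
    values interpolated between the segment's end pixels C and D."""
    h, w = image.shape[:2]
    out = image.astype(float).copy()
    for (y0, x0), (y1, x1) in vectors:
        vy, vx = y1 - y0, x1 - x0
        length = max(np.hypot(vy, vx), 1.0)
        ny, nx = -vx / length, vy / length            # unit normal to the vector
        steps = int(length) + 1
        for s in range(steps):                        # walk along the vector
            t = s / max(steps - 1, 1)
            py, px = y0 + t * vy, x0 + t * vx
            cy, cx = int(round(py - half_len * ny)), int(round(px - half_len * nx))
            dy, dx = int(round(py + half_len * ny)), int(round(px + half_len * nx))
            if not (0 <= cy < h and 0 <= cx < w and 0 <= dy < h and 0 <= dx < w):
                continue                              # segment leaves the image
            c_val = np.array(out[cy, cx], dtype=float)
            d_val = np.array(out[dy, dx], dtype=float)
            for k in range(2 * half_len + 1):         # write interpolated values
                a = k / (2.0 * half_len)
                yy = int(round(cy + a * (dy - cy)))
                xx = int(round(cx + a * (dx - cx)))
                out[yy, xx] = (1 - a) * c_val + a * d_val
    return out.astype(image.dtype)
```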
Although various minor modifications might be suggested by those skilled in the art, it should be understood that I wish to embody within the scope of the patent warranted hereon all such modifications as reasonably and properly come within the scope of my contribution to the art.

Claims (8)

I claim as my invention:
1. A method for automatic removal of image errors in digital image data, comprising the steps of:
producing a contour mask with a contour filter;
generating a color mask;
producing an error mask by a logical AND operation of the contour mask and the color mask;
paring down by removing pixels from outer edges of contours and vectorizing the contours in the error mask; and
implementing an automatic bleeding retouch of the digital image data over the vectorized contours of the error mask.
2. The method according to claim 1 wherein the contour filter is applied to a luminance component of the digital image data.
3. The method according to claim 1 wherein the color mask is produced by comparison of the digital image data to thresholds.
4. The method according to claim 1 wherein the automatic bleeding retouch is implemented in that a line segment is generated approximately perpendicularly relative to the respective vector for each picture element on a vectorized contour, and corrected image data are interpolated from the image data of the picture elements at ends of the line segment for the picture elements that lie on the line segment.
5. The method according to claim 1, wherein the error mask is automatically corrected in that the contours are over-filled, and contours that touch are then eliminated.
6. The method according to claim 1 wherein the error mask is displayed on a picture screen together with the image data and is edited by the operator.
7. A method for automatic removal of image errors in digital image data, comprising the steps of:
producing a contour mask with a contour filter;
generating a color mask;
producing an error mask by a logical AND operation of the contour mask and the color mask;
vectorizing contours in the error mask; and
implementing an automatic bleeding retouch of the digital image data over the vectorized contours of the error mask.
8. A method for automatic removal of image errors in digital image data, comprising the steps of:
generating a contour mask;
generating a color mask;
producing an error mask from the contour mask and the color mask;
over-filling the error mask and eliminating touching parts;
displaying and editing the error mask;
trimming by removing pixels at an outside of contour edges and vectorizing the contours of the error mask; and
implementing a bleeding retouch of the digital image data over the error vectors.
US09/376,476 1998-09-17 1999-08-18 Method for the automatic removal of image errors Expired - Fee Related US6504953B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE19842572A DE19842572B4 (en) 1998-09-17 1998-09-17 Method for the automatic removal of image defects
US09/376,476 US6504953B1 (en) 1998-09-17 1999-08-18 Method for the automatic removal of image errors

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE19842572A DE19842572B4 (en) 1998-09-17 1998-09-17 Method for the automatic removal of image defects
US09/376,476 US6504953B1 (en) 1998-09-17 1999-08-18 Method for the automatic removal of image errors

Publications (1)

Publication Number Publication Date
US6504953B1 2003-01-07

Family

ID=26048909

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/376,476 Expired - Fee Related US6504953B1 (en) 1998-09-17 1999-08-18 Method for the automatic removal of image errors

Country Status (2)

Country Link
US (1) US6504953B1 (en)
DE (1) DE19842572B4 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050135694A1 (en) * 2003-12-19 2005-06-23 Daly Scott J. Enhancing the quality of decoded quantized images
US20050141779A1 (en) * 2003-12-24 2005-06-30 Daly Scott J. Enhancing the quality of decoded quantized images
US20050147317A1 (en) * 2003-12-24 2005-07-07 Daly Scott J. Enhancing the quality of decoded quantized images
US20050152614A1 (en) * 2004-01-08 2005-07-14 Daly Scott J. Enhancing the quality of decoded quantized images
US20060023943A1 (en) * 2004-07-30 2006-02-02 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US20060119613A1 (en) * 2004-12-02 2006-06-08 Sharp Laboratories Of America, Inc. Methods and systems for display-mode-dependent brightness preservation
US20060119612A1 (en) * 2004-12-02 2006-06-08 Kerofsky Louis J Methods and systems for image-specific tone scale adjustment and light-source control
US20060209003A1 (en) * 2004-12-02 2006-09-21 Sharp Laboratories Of America, Inc. Methods and systems for determining a display light source adjustment
US20060262111A1 (en) * 2004-12-02 2006-11-23 Kerofsky Louis J Systems and Methods for Distortion-Related Source Light Management
US20060267923A1 (en) * 2004-12-02 2006-11-30 Kerofsky Louis J Methods and Systems for Generating and Applying Image Tone Scale Adjustments
US20060274026A1 (en) * 2004-12-02 2006-12-07 Kerofsky Louis J Systems and Methods for Selecting a Display Source Light Illumination Level
US20060284823A1 (en) * 2005-06-15 2006-12-21 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with frequency-specific gain
US20060284822A1 (en) * 2004-12-02 2006-12-21 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics
US20070058890A1 (en) * 2003-07-29 2007-03-15 Farid Al-Bender Novel foil bearing
US20070092139A1 (en) * 2004-12-02 2007-04-26 Daly Scott J Methods and Systems for Image Tonescale Adjustment to Compensate for a Reduced Source Light Power Level
US20070211049A1 (en) * 2006-03-08 2007-09-13 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with ambient illumination input
US20080024517A1 (en) * 2006-07-28 2008-01-31 Louis Joseph Kerofsky Systems and methods for color preservation with image tone scale corrections
US20080208551A1 (en) * 2007-02-28 2008-08-28 Louis Joseph Kerofsky Methods and Systems for Surround-Specific Display Modeling
US20080317376A1 (en) * 2007-06-20 2008-12-25 Microsoft Corporation Automatic image correction providing multiple user-selectable options
US20090109233A1 (en) * 2007-10-30 2009-04-30 Kerofsky Louis J Methods and Systems for Image Enhancement
US20090141178A1 (en) * 2007-11-30 2009-06-04 Kerofsky Louis J Methods and Systems for Backlight Modulation with Scene-Cut Detection
US20090140970A1 (en) * 2007-11-30 2009-06-04 Kerofsky Louis J Methods and Systems for Weighted-Error-Vector-Based Source Light Selection
US20090167789A1 (en) * 2007-12-26 2009-07-02 Kerofsky Louis J Methods and Systems for Backlight Modulation with Image Characteristic Mapping
US20090167673A1 (en) * 2007-12-26 2009-07-02 Kerofsky Louis J Methods and Systems for Display Source Light Management with Variable Delay
US20090167672A1 (en) * 2007-12-26 2009-07-02 Kerofsky Louis J Methods and Systems for Display Source Light Management with Histogram Manipulation
US20090185267A1 (en) * 2005-09-22 2009-07-23 Nikon Corporation Microscope and virtual slide forming system
US20090267876A1 (en) * 2008-04-28 2009-10-29 Kerofsky Louis J Methods and Systems for Image Compensation for Ambient Conditions
US20100021077A1 (en) * 2005-09-13 2010-01-28 Roscoe Atkinson Image quality
US20100053222A1 (en) * 2008-08-30 2010-03-04 Louis Joseph Kerofsky Methods and Systems for Display Source Light Management with Rate Change Control
US20100097452A1 (en) * 2005-06-14 2010-04-22 Torre-Bueno Jose De La Chromatic registration for biological sample imaging
US20100166265A1 (en) * 2006-08-15 2010-07-01 Donald Martin Monro Method of Eyelash Removal for Human Iris Recognition
US20100321574A1 (en) * 2009-06-17 2010-12-23 Louis Joseph Kerofsky Methods and Systems for Power-Controlling Display Devices
US20110001737A1 (en) * 2009-07-02 2011-01-06 Kerofsky Louis J Methods and Systems for Ambient-Adaptive Image Display
US20110074803A1 (en) * 2009-09-29 2011-03-31 Louis Joseph Kerofsky Methods and Systems for Ambient-Illumination-Selective Display Backlight Modification and Image Enhancement
US8120570B2 (en) 2004-12-02 2012-02-21 Sharp Laboratories Of America, Inc. Systems and methods for tone curve generation, selection and application
US8169431B2 (en) 2007-12-26 2012-05-01 Sharp Laboratories Of America, Inc. Methods and systems for image tonescale design
US8207932B2 (en) 2007-12-26 2012-06-26 Sharp Laboratories Of America, Inc. Methods and systems for display source light illumination level selection
US8416179B2 (en) 2008-07-10 2013-04-09 Sharp Laboratories Of America, Inc. Methods and systems for color preservation with a color-modulated backlight
US8922594B2 (en) 2005-06-15 2014-12-30 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with high frequency contrast enhancement
US9083969B2 (en) 2005-08-12 2015-07-14 Sharp Laboratories Of America, Inc. Methods and systems for independent view adjustment in multiple-view displays

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2829604B1 (en) * 2001-09-12 2004-01-16 Eastman Kodak Co INTERACTIVE DATA SELECTION IN DIGITAL IMAGES
DE10212919B4 (en) * 2002-03-22 2004-11-11 Heidelberger Druckmaschinen Ag Process for the automatic detection of image defects
DE102004005299B4 (en) * 2004-01-29 2006-09-07 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and device for detecting errors in film templates
US9846739B2 (en) 2006-10-23 2017-12-19 Fotonation Limited Fast database matching
US8577094B2 (en) 2010-04-09 2013-11-05 Donald Martin Monro Image template masking

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4868884A (en) * 1983-12-30 1989-09-19 Dainippon Screen Mfg. Co., Ltd. Image extraction mask
US5134668A (en) * 1990-02-08 1992-07-28 International Business Machines Corporation Masked combinations of video slices for computer display
US5619592A (en) * 1989-12-08 1997-04-08 Xerox Corporation Detection of highlighted regions
US5659490A (en) * 1994-06-23 1997-08-19 Dainippon Screen Mfg. Co., Ltd. Method and apparatus for generating color image mask
US5859929A (en) * 1995-12-01 1999-01-12 United Parcel Service Of America, Inc. System for character preserving guidelines removal in optically scanned text
US5887082A (en) * 1994-11-29 1999-03-23 Sony Corporation Image detecting apparatus
US6167167A (en) * 1996-07-05 2000-12-26 Canon Kabushiki Kaisha Image extractions apparatus and method
US6212287B1 (en) * 1996-10-17 2001-04-03 Sgs-Thomson Microelectronics S.R.L. Method for identifying marking stripes of road lanes
US6404936B1 (en) * 1996-12-20 2002-06-11 Canon Kabushiki Kaisha Subject image extraction method and apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5036405A (en) * 1986-11-19 1991-07-30 Canon Kabushiki Kaisha Image amending method
EP0624848A3 (en) * 1993-05-04 1994-11-30 Eastman Kodak Company A technique for the detection and removal of local defects in digital continuous-tone images
DE19527148C1 (en) * 1995-07-25 1997-01-09 Siemens Ag Method for operating a digital image system of an X-ray diagnostic device
US6104839A (en) * 1995-10-16 2000-08-15 Eastman Kodak Company Method and apparatus for correcting pixel values in a digital image
IL119221A0 (en) * 1996-09-08 1996-12-05 Scitex Corp Ltd Apparatus and method for retouching a digital representation of a color image

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4868884A (en) * 1983-12-30 1989-09-19 Dainippon Screen Mfg. Co., Ltd. Image extraction mask
US5619592A (en) * 1989-12-08 1997-04-08 Xerox Corporation Detection of highlighted regions
US5134668A (en) * 1990-02-08 1992-07-28 International Business Machines Corporation Masked combinations of video slices for computer display
US5659490A (en) * 1994-06-23 1997-08-19 Dainippon Screen Mfg. Co., Ltd. Method and apparatus for generating color image mask
US5887082A (en) * 1994-11-29 1999-03-23 Sony Corporation Image detecting apparatus
US5859929A (en) * 1995-12-01 1999-01-12 United Parcel Service Of America, Inc. System for character preserving guidelines removal in optically scanned text
US6167167A (en) * 1996-07-05 2000-12-26 Canon Kabushiki Kaisha Image extractions apparatus and method
US6212287B1 (en) * 1996-10-17 2001-04-03 Sgs-Thomson Microelectronics S.R.L. Method for identifying marking stripes of road lanes
US6404936B1 (en) * 1996-12-20 2002-06-11 Canon Kabushiki Kaisha Subject image extraction method and apparatus

Cited By (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070058890A1 (en) * 2003-07-29 2007-03-15 Farid Al-Bender Novel foil bearing
US20050135694A1 (en) * 2003-12-19 2005-06-23 Daly Scott J. Enhancing the quality of decoded quantized images
US7440633B2 (en) 2003-12-19 2008-10-21 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US20080298711A1 (en) * 2003-12-24 2008-12-04 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US20050141779A1 (en) * 2003-12-24 2005-06-30 Daly Scott J. Enhancing the quality of decoded quantized images
US20050147317A1 (en) * 2003-12-24 2005-07-07 Daly Scott J. Enhancing the quality of decoded quantized images
US7424166B2 (en) 2003-12-24 2008-09-09 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US7424168B2 (en) 2003-12-24 2008-09-09 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US7907787B2 (en) 2003-12-24 2011-03-15 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US20080279473A1 (en) * 2004-01-08 2008-11-13 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantizes images
US7400779B2 (en) 2004-01-08 2008-07-15 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US7787704B2 (en) 2004-01-08 2010-08-31 Sharp Laboratories Of America, Inc. Enhancing the quality of decoded quantized images
US20050152614A1 (en) * 2004-01-08 2005-07-14 Daly Scott J. Enhancing the quality of decoded quantized images
US7653239B2 (en) * 2004-07-30 2010-01-26 Canon Kabushiki Kaisha Image processing method and image processing apparatus for correcting for color shift occurring in color image data
US20100188673A1 (en) * 2004-07-30 2010-07-29 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US7899244B2 (en) * 2004-07-30 2011-03-01 Canon Kabushiki Kaisha Color shift correction image processing method and image processing apparatus
US20060023943A1 (en) * 2004-07-30 2006-02-02 Canon Kabushiki Kaisha Image processing method and image processing apparatus
US8004511B2 (en) 2004-12-02 2011-08-23 Sharp Laboratories Of America, Inc. Systems and methods for distortion-related source light management
US20060119613A1 (en) * 2004-12-02 2006-06-08 Sharp Laboratories Of America, Inc. Methods and systems for display-mode-dependent brightness preservation
US7782405B2 (en) 2004-12-02 2010-08-24 Sharp Laboratories Of America, Inc. Systems and methods for selecting a display source light illumination level
US20060119612A1 (en) * 2004-12-02 2006-06-08 Kerofsky Louis J Methods and systems for image-specific tone scale adjustment and light-source control
US20070092139A1 (en) * 2004-12-02 2007-04-26 Daly Scott J Methods and Systems for Image Tonescale Adjustment to Compensate for a Reduced Source Light Power Level
US20060284822A1 (en) * 2004-12-02 2006-12-21 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics
US7768496B2 (en) 2004-12-02 2010-08-03 Sharp Laboratories Of America, Inc. Methods and systems for image tonescale adjustment to compensate for a reduced source light power level
US8120570B2 (en) 2004-12-02 2012-02-21 Sharp Laboratories Of America, Inc. Systems and methods for tone curve generation, selection and application
US7800577B2 (en) 2004-12-02 2010-09-21 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics
US20060262111A1 (en) * 2004-12-02 2006-11-23 Kerofsky Louis J Systems and Methods for Distortion-Related Source Light Management
US7982707B2 (en) 2004-12-02 2011-07-19 Sharp Laboratories Of America, Inc. Methods and systems for generating and applying image tone scale adjustments
US7961199B2 (en) 2004-12-02 2011-06-14 Sharp Laboratories Of America, Inc. Methods and systems for image-specific tone scale adjustment and light-source control
US7924261B2 (en) 2004-12-02 2011-04-12 Sharp Laboratories Of America, Inc. Methods and systems for determining a display light source adjustment
US20060267923A1 (en) * 2004-12-02 2006-11-30 Kerofsky Louis J Methods and Systems for Generating and Applying Image Tone Scale Adjustments
US8947465B2 (en) 2004-12-02 2015-02-03 Sharp Laboratories Of America, Inc. Methods and systems for display-mode-dependent brightness preservation
US20060209003A1 (en) * 2004-12-02 2006-09-21 Sharp Laboratories Of America, Inc. Methods and systems for determining a display light source adjustment
US20060274026A1 (en) * 2004-12-02 2006-12-07 Kerofsky Louis J Systems and Methods for Selecting a Display Source Light Illumination Level
US8179575B2 (en) 2005-06-14 2012-05-15 Carl Zeiss Microimaging Gmbh Chromatic registration for biological sample imaging
US20100097452A1 (en) * 2005-06-14 2010-04-22 Torre-Bueno Jose De La Chromatic registration for biological sample imaging
US8913089B2 (en) 2005-06-15 2014-12-16 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with frequency-specific gain
US8922594B2 (en) 2005-06-15 2014-12-30 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with high frequency contrast enhancement
US20060284823A1 (en) * 2005-06-15 2006-12-21 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with frequency-specific gain
US9083969B2 (en) 2005-08-12 2015-07-14 Sharp Laboratories Of America, Inc. Methods and systems for independent view adjustment in multiple-view displays
US20100021077A1 (en) * 2005-09-13 2010-01-28 Roscoe Atkinson Image quality
US8817040B2 (en) 2005-09-13 2014-08-26 Carl Zeiss Microscopy Gmbh Methods for enhancing image quality
US20090185267A1 (en) * 2005-09-22 2009-07-23 Nikon Corporation Microscope and virtual slide forming system
US20070211049A1 (en) * 2006-03-08 2007-09-13 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with ambient illumination input
US7839406B2 (en) 2006-03-08 2010-11-23 Sharp Laboratories Of America, Inc. Methods and systems for enhancing display characteristics with ambient illumination input
US7515160B2 (en) 2006-07-28 2009-04-07 Sharp Laboratories Of America, Inc. Systems and methods for color preservation with image tone scale corrections
US20080024517A1 (en) * 2006-07-28 2008-01-31 Louis Joseph Kerofsky Systems and methods for color preservation with image tone scale corrections
US20100166265A1 (en) * 2006-08-15 2010-07-01 Donald Martin Monro Method of Eyelash Removal for Human Iris Recognition
US7826681B2 (en) 2007-02-28 2010-11-02 Sharp Laboratories Of America, Inc. Methods and systems for surround-specific display modeling
US20080208551A1 (en) * 2007-02-28 2008-08-28 Louis Joseph Kerofsky Methods and Systems for Surround-Specific Display Modeling
US8331721B2 (en) 2007-06-20 2012-12-11 Microsoft Corporation Automatic image correction providing multiple user-selectable options
US20080317376A1 (en) * 2007-06-20 2008-12-25 Microsoft Corporation Automatic image correction providing multiple user-selectable options
US20090109233A1 (en) * 2007-10-30 2009-04-30 Kerofsky Louis J Methods and Systems for Image Enhancement
US8155434B2 (en) 2007-10-30 2012-04-10 Sharp Laboratories Of America, Inc. Methods and systems for image enhancement
US8378956B2 (en) 2007-11-30 2013-02-19 Sharp Laboratories Of America, Inc. Methods and systems for weighted-error-vector-based source light selection
US9177509B2 (en) 2007-11-30 2015-11-03 Sharp Laboratories Of America, Inc. Methods and systems for backlight modulation with scene-cut detection
US20090140970A1 (en) * 2007-11-30 2009-06-04 Kerofsky Louis J Methods and Systems for Weighted-Error-Vector-Based Source Light Selection
US20090141178A1 (en) * 2007-11-30 2009-06-04 Kerofsky Louis J Methods and Systems for Backlight Modulation with Scene-Cut Detection
US8179363B2 (en) 2007-12-26 2012-05-15 Sharp Laboratories Of America, Inc. Methods and systems for display source light management with histogram manipulation
US20090167672A1 (en) * 2007-12-26 2009-07-02 Kerofsky Louis J Methods and Systems for Display Source Light Management with Histogram Manipulation
US20090167789A1 (en) * 2007-12-26 2009-07-02 Kerofsky Louis J Methods and Systems for Backlight Modulation with Image Characteristic Mapping
US8203579B2 (en) 2007-12-26 2012-06-19 Sharp Laboratories Of America, Inc. Methods and systems for backlight modulation with image characteristic mapping
US8207932B2 (en) 2007-12-26 2012-06-26 Sharp Laboratories Of America, Inc. Methods and systems for display source light illumination level selection
US8223113B2 (en) 2007-12-26 2012-07-17 Sharp Laboratories Of America, Inc. Methods and systems for display source light management with variable delay
US20090167673A1 (en) * 2007-12-26 2009-07-02 Kerofsky Louis J Methods and Systems for Display Source Light Management with Variable Delay
US8169431B2 (en) 2007-12-26 2012-05-01 Sharp Laboratories Of America, Inc. Methods and systems for image tonescale design
US8531379B2 (en) 2008-04-28 2013-09-10 Sharp Laboratories Of America, Inc. Methods and systems for image compensation for ambient conditions
US20090267876A1 (en) * 2008-04-28 2009-10-29 Kerofsky Louis J Methods and Systems for Image Compensation for Ambient Conditions
US8416179B2 (en) 2008-07-10 2013-04-09 Sharp Laboratories Of America, Inc. Methods and systems for color preservation with a color-modulated backlight
US20100053222A1 (en) * 2008-08-30 2010-03-04 Louis Joseph Kerofsky Methods and Systems for Display Source Light Management with Rate Change Control
US9330630B2 (en) 2008-08-30 2016-05-03 Sharp Laboratories Of America, Inc. Methods and systems for display source light management with rate change control
US20100321574A1 (en) * 2009-06-17 2010-12-23 Louis Joseph Kerofsky Methods and Systems for Power-Controlling Display Devices
US8165724B2 (en) 2009-06-17 2012-04-24 Sharp Laboratories Of America, Inc. Methods and systems for power-controlling display devices
US20110001737A1 (en) * 2009-07-02 2011-01-06 Kerofsky Louis J Methods and Systems for Ambient-Adaptive Image Display
US20110074803A1 (en) * 2009-09-29 2011-03-31 Louis Joseph Kerofsky Methods and Systems for Ambient-Illumination-Selective Display Backlight Modification and Image Enhancement

Also Published As

Publication number Publication date
DE19842572B4 (en) 2005-03-24
DE19842572A1 (en) 2000-03-23

Similar Documents

Publication Publication Date Title
US6504953B1 (en) Method for the automatic removal of image errors
US6233364B1 (en) Method and system for detecting and tagging dust and scratches in a digital image
EP0445066A2 (en) Method for preparing polychromatic printing plates
JP2005310068A (en) Method for correcting white of eye, and device for executing the method
JP2003228712A (en) Method for identifying text-like pixel from image
US20060115153A1 (en) Page background estimation using color, texture and edge features
JPH05268457A (en) Multi-color marker editing device
JPH08154172A (en) Image processing method, image file and file for image processing
JPH0372780A (en) Picture processor
JP4281244B2 (en) Image forming apparatus, image data processing method, and recording medium recording image data processing program
JP2008092447A (en) Image processing apparatus, image output device, and image processing method
JP4050220B2 (en) Image processing method, image processing apparatus, image forming apparatus, program, and recording medium
JP2003209704A (en) Image processing method, image processor, image forming device, image processing program, and recording medium
JP2006031245A (en) Image processing method and image processor by profile tracking of digital image
JP3190050B2 (en) Color image processing method
JP4333016B2 (en) Image forming apparatus, image data processing method, and recording medium recording image data processing program
JP2005063022A (en) Noise pixel map producing method, device performing the same, program, and photograph printing device
JP3048155B2 (en) Image processing device
US7295344B2 (en) Image processing method and image processing apparatus, program and recording medium, and image forming apparatus
JP3004996B2 (en) Image processing device
JP4073877B2 (en) Image processing method, image processing apparatus, image forming apparatus, and computer program
JP4086537B2 (en) Image processing method, image processing apparatus, image forming apparatus, computer program, and recording medium
JP2008079196A (en) Image correcting method, image correcting program and image correcting module
JP2774567B2 (en) Image processing device
JPH1091792A (en) Document segmentation method and device, color print system and computer program product

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEIDELBERGER DRUCKMASCHINEN AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BEHRENDS, ROLF;REEL/FRAME:010190/0832

Effective date: 19990814

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20150107