WO2004080057A1 - 画像処理装置 - Google Patents

画像処理装置 Download PDF

Info

Publication number
WO2004080057A1
Authority
WO
WIPO (PCT)
Prior art keywords
density
background
image data
conversion
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP1998/005494
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Hiroki Kanno
Takayuki Sawada
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US09/367,694 priority Critical patent/US6567544B1/en
Anticipated expiration legal-status Critical
Publication of WO2004080057A1 publication Critical patent/WO2004080057A1/ja
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/58Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration using histogram techniques
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/92Dynamic range modification of images or parts thereof based on global image properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10008Still image; Photographic image from scanner, fax or copier
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Definitions

  • The present invention relates to an image processing apparatus that processes a color image read from a document, for example in an image forming apparatus such as a digital color copying machine that forms a copy of a color image, and to an image forming apparatus that forms a color image using the image processing apparatus.
  • Conventionally, the copy density is manually adjusted to a lighter setting so that the copy is made with less background and show-through.
  • Alternatively, the setting may be made automatically.
  • In that case, a density distribution characteristic of the entire image is detected and a density conversion characteristic using that density distribution characteristic as a parameter is set, so that the density adjustment characteristic is set automatically. The user then does not need to set the density in consideration of the background density for each document, and can easily perform the copying operation.
  • Unevenness in the density of the background area, which occupies a large part of the entire original, tends to be conspicuous owing to the recording characteristics of a general recording apparatus.
  • If the background area is detected and the background density is output at a constant level, this background density unevenness can be suppressed.
  • Disclosure of the Invention
  • Accordingly, it is an object of the present invention to provide an image processing device that reduces the background density while retaining the character density when copying an original having a background density, and that reduces the show-through image while retaining the front-side image density when copying a document with show-through.
  • It is another object of the present invention to provide an image processing apparatus and an image forming apparatus that, even when the original contains a mixture of photographs, convert the background density of the character areas to another value while faithfully preserving the density of the photograph portions so that their color and density do not change.
  • It is a further object of the present invention to provide an image processing apparatus and an image forming apparatus that can suppress show-through while preserving the background color even when the document has a colored background, and that at the same time reduce unevenness in the density of the background.
  • To achieve the above objects, an image processing apparatus according to the present invention comprises: density distribution calculating means for calculating a density distribution of a document image based on input document image density data; density range calculating means for calculating a density range corresponding to the background density of the document image based on the density distribution; and conversion means for converting the density of the document image included in the background density range calculated by the density range calculating means into another density value and outputting the converted value.
  • the density distribution calculating means includes a histogram creating means for creating a density histogram representing the color characteristics of the document based on the input image data.
  • The density range calculating means includes means for determining the density having the highest frequency in the low-density region of the histogram as the background density level of the document, and for calculating the background density range based on that background density level.
  • The conversion means has means for converting input image data equal to or lower than the background density level calculated by the density range calculating means into the value “0”.
  • The conversion means may also have means for converting input image data equal to or lower than the background density level calculated by the density range calculating means into the value “0”, and for converting input image data larger than the background density level according to a predetermined function.
  • The conversion means may also have means for converting input image data equal to or lower than the background density level calculated by the density range calculating means into a constant value.
  • The conversion means may also have means for converting only input image data included in a predetermined density range including the background density level calculated by the density range calculating means into a constant value, and outputting other input image data as it is.
  • Further, the density range calculating means may have means for determining, as the background density range, the density range near the background density level whose frequencies lie between the frequency of the background density level and a frequency smaller than that frequency by a certain value.
  • The apparatus according to the present invention further comprises setting means for setting whether the original is a color original or a monochrome original, wherein the conversion means performs a first background density conversion for an original set to color by the setting means, and a second background density conversion, different from the first, for an original set to monochrome.
  • FIG. 1 is a side view schematically showing an internal structure of an image forming apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an electrical configuration of the image forming apparatus shown in FIG.
  • FIGS. 3A and 3B are block diagrams schematically showing the configuration of the image processing apparatus of FIG. 2.
  • FIG. 4 is a block diagram showing a configuration of a main part of the image processing apparatus according to the first embodiment.
  • FIG. 5 is a block diagram illustrating a configuration of a multi-value conversion unit included in the density distribution calculation unit.
  • FIG. 6 is a block diagram illustrating a configuration of a histogram creation unit that forms the density distribution calculation unit.
  • FIGS. 7A to 7C show examples of histograms.
  • FIG. 8 is a diagram showing an example of a histogram.
  • FIG. 9 is a block diagram showing the configuration of the background density level calculation unit.
  • FIGS. 10A and 10B are diagrams showing examples of a density conversion table.
  • FIG. 11 is a block diagram illustrating a configuration of a main part of an image processing apparatus according to a second embodiment.
  • FIG. 12 is a block diagram showing the configuration of the background density distribution calculation unit.
  • FIG. 13 is a block diagram showing a configuration of a main part of the image processing apparatus according to the third embodiment.
  • FIGS. 14A and 14B are block diagrams showing the configuration of the background density conversion unit.
  • FIG. 15 is a block diagram showing a configuration of a main part of the image processing apparatus according to the fourth embodiment.
  • FIG. 16 is a block diagram illustrating a first configuration example of the background density unevenness suppression unit.
  • FIG. 17 is a diagram illustrating the density conversion performed by the background density unevenness suppression unit of FIG.
  • FIG. 18 is a block diagram showing a second configuration example of the background density unevenness suppression unit.
  • FIG. 19 is a diagram illustrating the density conversion performed by the background density unevenness suppression unit of FIG.
  • FIG. 20 is a block diagram showing a configuration of a main part of an image processing apparatus according to a fifth embodiment.
  • FIG. 21 shows an example of the density distribution.
  • FIG. 22 is a diagram for explaining density conversion.
  • FIGS. 23A and 23B show examples of the density distribution.
  • FIG. 24 is a block diagram schematically showing a configuration of a main part of the image processing apparatus according to the sixth embodiment.
  • FIG. 25 is a block diagram showing a configuration of a background presence / absence determination unit.
  • FIG. 26 is a block diagram showing a configuration of a main part of the image processing apparatus according to the seventh embodiment.
  • FIG. 27 is a block diagram illustrating a configuration of a main part of an image processing apparatus according to an eighth embodiment.
  • FIG. 28 is a block diagram illustrating a configuration of a main part of an image processing apparatus according to a ninth embodiment.
  • FIG. 29 is a block diagram showing a configuration of a main part of the image processing apparatus according to the tenth embodiment.
  • FIG. 30 is a block diagram showing a configuration of a main part of the image processing apparatus according to the eleventh embodiment.
  • FIG. 31 is a block diagram showing a configuration of a main part of the image processing apparatus according to the 12th embodiment.
  • FIG. 1 schematically shows the internal configuration of an image forming apparatus, such as a digital color copying machine, that forms a duplicate image according to the present invention.
  • This image forming apparatus is roughly divided into a color scanner unit 1 as image reading means for reading a color image on a document, and a four-tandem color printer unit 2 as image forming means for forming a duplicate image of the read color image.
  • The color scanner unit 1 has a platen cover 3 on its upper part and, disposed opposite the platen cover 3 in its closed state, a platen 4 made of transparent glass on which a document is set. Below the platen 4 are provided an exposure lamp 5 for illuminating the original placed on the platen 4, a reflector 6 for condensing the light from the exposure lamp 5 onto the original, and a first mirror 7 that bends the reflected light from the original leftward in the drawing. The exposure lamp 5, the reflector 6, and the first mirror 7 are fixed to a first carriage 8. The first carriage 8 is driven by a pulse motor (not shown) via a toothed belt (not shown) or the like so as to be moved in parallel along the lower surface of the platen 4.
  • a second carriage 9 movably provided in parallel with the platen 4 is provided.
  • On the second carriage 9, a second mirror 11 that folds the reflected light from the document guided by the first mirror 7 downward in the figure, and a third mirror 12 that folds the reflected light from the second mirror 11 rightward in the figure, are arranged at right angles to each other.
  • the second carriage 9 is driven by the first carriage 8 and is translated along the document table 4 at a half speed with respect to the first carriage 8.
  • An imaging lens 13 for forming an image of the reflected light from the third mirror 12 at a predetermined magnification is arranged in a plane including the optical axis of the light turned back by the second and third mirrors 11 and 12.
  • Also provided is a CCD-type color image sensor 15 that converts the reflected light converged by the imaging lens 13 into electric signals.
  • When the light from the exposure lamp 5 is condensed onto the document on the platen 4 by the reflector 6, the reflected light from the document enters the color image sensor 15 via the first mirror 7, the second mirror 11, the third mirror 12, and the imaging lens 13, where the incident light is converted into electric signals corresponding to the three primary colors of light, R (red), G (green), and B (blue).
  • The color printer unit 2 includes first to fourth image forming units 10y, 10m, 10c, and 10k, which respectively form images of the four color components obtained by color separation based on the well-known subtractive color mixing method, namely yellow (Y), magenta (M), cyan (C), and black (K).
  • For the image forming units 10y, 10m, 10c, and 10k, a transfer mechanism 20 including a conveyor belt 21 is provided as conveying means for conveying the images of the respective colors formed by the image forming units in the direction of arrow a in the figure.
  • The conveyor belt 21 is wound and stretched between a driving roller 91, which is rotated in the direction of arrow a by a motor (not shown), and a driven roller 92 separated from the driving roller 91 by a predetermined distance, and runs endlessly at a constant speed in the direction of arrow a.
  • The image forming units 10y, 10m, 10c, and 10k are arranged in series along the conveying direction of the conveyor belt 21.
  • Each of the image forming units 10y, 10m, 10c, and 10k includes a photosensitive drum 61y, 61m, 61c, or 61k as an image carrier whose outer peripheral surface is rotatable in the same direction at a position in contact with the conveyor belt 21. Each of the photosensitive drums 61y, 61m, 61c, and 61k is rotated at a predetermined peripheral speed by a motor (not shown).
  • The photosensitive drums 61y, 61m, 61c, and 61k are arranged such that their axes are equally spaced from one another and orthogonal to the direction in which the conveyor belt 21 conveys.
  • In the following description, the axial direction of the photosensitive drums 61y, 61m, 61c, and 61k is defined as the main scanning direction (second direction), and the rotation direction of the photosensitive drums 61y, 61m, 61c, and 61k, that is, the rotation direction of the conveyor belt 21 (the direction of arrow a in the figure), is defined as the sub-scanning direction (first direction).
  • Toner recovery screws 66y, 66m, 66c, and 66k are arranged along the rotation direction.
  • Each transfer device 93y, 93m, 93c, 93k is arranged at the position where the conveyor belt 21 is held against the corresponding photosensitive drum 61y, 61m, 61c, 61k, that is, inside the conveyor belt 21.
  • By means of the exposure device 50 described later, the units 62y, 62m, 62c, and 62k, and the developing rollers 64y, 64m, 64c, and 64k, images are formed on the outer peripheral surfaces of the photosensitive drums 61y, 61m, 61c, and 61k.
  • Paper cassettes 22a and 22b are arranged, which accommodate a plurality of sheets P as an image forming medium onto which the images formed by the image forming units 10y, 10m, 10c, and 10k are transferred.
  • Pickup rollers 23a and 23b are arranged to take out the sheets P stored in the paper cassettes 22a and 22b one by one from the top.
  • Between the pickup rollers 23a, 23b and the driven roller 92, a registration roller 24 is arranged for aligning the leading end of the sheet P taken out of the paper cassette 22a, 22b with the leading end of the Y toner image formed on the photosensitive drum 61y of the image forming unit 10y.
  • The toner images formed on the other photosensitive drums 61m, 61c, and 61k are supplied to their respective transfer positions in accordance with the conveying timing of the sheet P conveyed on the conveyor belt 21.
  • An attraction roller 26 is provided for electrostatically attracting the sheet P conveyed at a predetermined timing via the registration roller 24.
  • The axis of the attraction roller 26 and the axis of the driven roller 92 are set parallel to each other.
  • At one end of the conveyor belt 21, near the drive roller 91, that is, substantially on the outer periphery of the drive roller 91 with the conveyor belt 21 interposed therebetween, a misregistration sensor 96 is provided for detecting the position of the image formed on the conveyor belt 21.
  • The misregistration sensor 96 is formed of, for example, a transmission-type or reflection-type optical sensor.
  • A conveyor belt cleaning device 95 is disposed on the conveyor belt 21 at the outer periphery of the drive roller 91, downstream of the misregistration sensor 96.
  • Further along the conveying direction, a fixing device 80 is provided which heats, to a predetermined temperature, the sheet P that has been conveyed by the conveyor belt 21 and separated from the drive roller 91, thereby melting the toner image transferred to the sheet P and fixing it to the sheet P.
  • The fixing device 80 includes a pair of heat rollers 81, oil applying rollers 82 and 83, a web winding roller 84, a web roller 85, and a web pressing roller 86.
  • The toner image fixed on the sheet P is then discharged by the discharge roller pair 87.
  • The exposure device 50, which forms color-separated electrostatic latent images on the outer peripheral surfaces of the photosensitive drums 61y, 61m, 61c, and 61k, includes a semiconductor laser oscillator 60 whose emission is controlled based on the image data (Y, M, C, K) of each color generated by the image processing device 36 described later.
  • In the subsequent optical path are provided, in order, a polygon mirror 51, rotated by a polygon motor 54, which reflects and scans the laser beam, and fθ lenses 52 and 53, which correct the focal point of the laser beam reflected by the polygon mirror 51 and form an image.
  • The laser beam of each color passing through the fθ lens 53 is bent toward the exposure position of the corresponding photosensitive drum 61y, 61m, 61c, or 61k by first fold mirrors 55y, 55m, 55c, and 55k, and the laser beams bent by the first fold mirrors 55y, 55m, and 55c are further folded by second and third fold mirrors 56y, 56m, 56c, 57y, 57m, and 57c.
  • FIG. 2 is a block diagram schematically showing a signal flow for electrical connection and control of the digital copying machine shown in FIG. 1.
  • The control system is composed of three CPUs: a main CPU (central processing unit) 91 in the main control unit 30, a scanner CPU 100 in the color scanner unit 1, and a printer CPU 110 in the color printer unit 2.
  • the main CPU 91 performs bidirectional communication with the printer CPU 110 via a shared RAM (random access memory) 35.
  • The main CPU 91 issues operation instructions, and the printer CPU 110 returns its status.
  • The printer CPU 110 and the scanner CPU 100 perform serial communication; the printer CPU 110 issues operation instructions, and the scanner CPU 100 returns its status.
  • the operation panel 40 has a liquid crystal display section 42, various operation keys 43, and a panel CPU 41 to which these are connected, and is connected to a main CPU 91.
  • The main control unit 30 is composed of the main CPU 91, a ROM (read only memory) 32, a RAM 33, an NVRAM 34, the shared RAM 35, the image processing unit 36, a page memory control unit 37, a page memory 38, a printer controller 39, and a printer font ROM 121.
  • the main CPU 91 is responsible for overall control.
  • the ROM 32 stores a control program and the like.
  • the RAM 33 temporarily stores data.
  • The NVRAM 34 is a non-volatile RAM.
  • the shared RAM 35 is used for performing bidirectional communication between the main CPU 91 and the printer CPU 110.
  • the page memory control unit 37 stores and reads image information in and from the page memory 38.
  • the page memory 38 has an area capable of storing image information for a plurality of pages, and is configured to be able to store data obtained by compressing image information from the color scanner unit 1 for each page.
  • the printer font ROM 121 stores font data corresponding to the print data.
  • The printer controller 39 develops print data from an external memory 122 such as a personal computer into image data, using font data stored in the printer font ROM 121, at a resolution corresponding to the resolution data assigned to the print data.
  • The color scanner unit 1 is composed of a scanner CPU 100 that controls the entire unit, a ROM 101 storing a control program and the like, a RAM 102 for storing data, a CCD driver 103 that drives the color image sensor 15, a scanning motor driver 104 that controls the rotation of the scanning motor which moves the first carriage 8 and the like, and an image correction unit 105.
  • the image correction unit 105 includes an A / D conversion circuit that converts the analog signals of R, G, and B output from the color image sensor 15 into digital signals.
  • The color printer unit 2 is composed of a printer CPU 110 that controls the entire unit, a ROM 111 that stores a control program and the like, a RAM 112 for storing data, and a laser driver 113 that drives the semiconductor laser oscillator 60.
  • The image processing device 36, the page memory 38, the printer controller 39, the image correction unit 105, and the laser driver 113 are connected by an image data bus 120.
  • FIG. 3 schematically shows the configuration of the image processing device 36.
  • The color image data R, G, and B output from the color scanner unit 1 are each sent to a registration interpolation unit 151.
  • The registration interpolation unit 151 performs registration interpolation on the color image data R, G, B. In general, when an image read from a document is enlarged or reduced, digital processing is applied to the image read in the main scanning direction, while for the image read in the sub-scanning direction this is done by changing the moving speed of the scanner carriage. However, when an RGB 3-line CCD sensor (8-line pitch) is used as the color image sensor 15, there is no problem at 1x or 1/integer magnifications, but at other magnifications a displacement occurs in the sub-scanning direction between R, G, and B. The registration interpolation unit 151 compensates for this displacement by interpolating pixel values based on the amount of the shift.
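As an illustrative sketch only (the patent does not give the interpolation formula), the registration interpolation described above can be pictured as shifting a colour channel along the sub-scanning direction by a fractional number of lines with linear interpolation between neighbouring lines; the function below and its edge handling are assumptions:

```python
import numpy as np

def compensate_subscan_shift(channel: np.ndarray, shift: float) -> np.ndarray:
    """Shift one colour channel along the sub-scanning (row) axis by a
    fractional number of lines, blending the two nearest lines linearly.
    channel: H x W array of 8-bit data; shift: measured displacement in lines.
    Edge rows simply wrap here for brevity."""
    base = int(np.floor(shift))
    frac = shift - base
    a = np.roll(channel, -base, axis=0).astype(np.float32)
    b = np.roll(channel, -(base + 1), axis=0).astype(np.float32)
    out = (1.0 - frac) * a + frac * b          # linear interpolation between lines
    return np.clip(out, 0, 255).astype(channel.dtype)
```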
  • The color image data R, G, and B output from the registration interpolation unit 151 are sent individually to the ACS 152, the monochrome generation unit 153, the image processing unit 154, and the macro identification unit 155.
  • ACS 152 determines whether the original to be read is a color original or a monochrome original. The above-mentioned determination is made at the time of pre-scan, and at the time of the main scan, it is switched to either color processing or monochrome processing.
  • the monochrome generator 153 generates monochrome image data from the R, G, B color image data in the monochrome copy mode.
  • the image processing unit 154 performs a background removal process on a document with a background, and will be described later in detail.
  • The macro identification unit 155 determines photograph areas and character areas in the document to be read. That is, the original is pre-scanned, and an overall judgment is made based on the image input to the page memory 38. The result of the determination by the macro identification unit 155 is stored in the identification memory 156 and is output to the micro identification unit 160 during the main scan.
  • the output of the image processing unit 154 is sent to the color conversion unit 157.
  • The input signals from the color scanner unit 1 are R, G, and B, but the signals handled by the color printer unit 2 are C, M, Y, and K, so color signal conversion is required. The color conversion unit 157 therefore converts the R, G, B image data into C, M, Y image data, and performs color adjustment according to the user's preference by switching the color conversion parameters.
  • The outputs (color image data C, M, Y) of the color conversion unit 157 are sent to a low-pass filter (LPF) 158, a high-frequency emphasis filter (HEF) 159, and the micro identification unit 160, respectively.
  • The low-pass filter 158 and the high-frequency emphasis filter 159 perform filtering of the original image, such as noise elimination, moiré elimination, and edge emphasis.
  • The output of the low-pass filter 158 is sent to the synthesis unit 161, and the output of the high-frequency emphasis filter 159 is sent to the character emphasis unit 162.
  • The micro identification unit 160 determines photograph areas and character areas in the document. Here, the determination is made with reference to a local area of, for example, about 3 × 3 pixels. Based on this judgment result, the operation of the synthesis unit 161, the character emphasis unit 162, the inking unit 169, the black character generation unit 170, the selector 171, the recording processing unit 173, and the screen processing unit 175 is switched.
  • the character emphasizing section 162 performs emphasis processing on the character section, and sends the processing result to the synthesizing section 161.
  • the synthesizing unit 161 synthesizes the output of the low-pass filter 158 and the output of the character emphasizing unit 162, and sends the synthesized result to the enlargement / reduction unit 163.
  • The enlargement/reduction unit 163 performs enlargement/reduction processing in the main scanning direction.
  • image data is temporarily stored in a page memory 38, and each processing unit reads a necessary part to be processed from a page memory 38 as needed and executes the processing. Therefore, it is necessary to read out an arbitrary area of the image at a constant rate. Therefore, when image data is stored in the page memory 38, first, fixed-length compression / decompression processing is performed by the YIQ conversion unit 164 and the error diffusion unit 165.
  • The YIQ conversion unit 164 converts the C, M, Y image data into Y, I, Q data to eliminate the redundancy of the color components and save bits.
  • The CMY conversion unit 166 expands the image data and converts the Y, I, Q data back into C, M, Y data.
  • the image data is stored in the hard disk drive (HDD) 167.
  • the variable length compression unit 168 performs variable length compression processing with the best possible compression efficiency.
  • the output of the enlargement / reduction unit 163 is sent to the inking unit 169 and the black character generation unit 170, respectively.
  • The inking unit 169 generates the black signal K from the image data C, M, Y and adds the black signal K to the image data C, M, Y.
  • the black character generating section 170 generates the black signal K by superimposing the image data C, M, and Y.
  • Black characters have higher image quality, in terms of both color and resolution, when recorded in black alone rather than by superimposing the image data C, M, and Y. Therefore, in the selector 171, the output of the inking unit 169 and the output of the black character generation unit 170 are switched based on the identification signal output from the micro identification unit 160, and the selected signal is output to the γ correction unit 172.
  • The γ correction unit 172 corrects the γ characteristic of the printer unit 2. This correction is performed by referring to the γ table set for each of the image data C, M, Y, and K.
  • The output of the γ correction unit 172 is sent to the recording processing unit 173.
  • the recording processing unit 173 performs gradation processing such as error diffusion, and converts, for example, input 8-bit image data into data of about 4 bits without deteriorating the gradation.
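The patent names error diffusion only as an example and does not specify the algorithm of the recording processing unit 173; as a hedged illustration, the sketch below reduces an 8-bit channel to 4-bit data with Floyd-Steinberg error diffusion so that the average tone is preserved:

```python
import numpy as np

def error_diffuse_to_4bit(channel: np.ndarray) -> np.ndarray:
    """Illustrative gradation processing: quantize 8-bit data to 16 levels
    while diffusing the quantization error to neighbouring pixels
    (Floyd-Steinberg weights 7/16, 3/16, 5/16, 1/16)."""
    img = channel.astype(np.float32)
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = np.clip(round(old / 17.0), 0, 15)   # nearest of 16 levels
            out[y, x] = new
            err = old - new * 17.0                    # quantization error
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```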
  • The delay memory 174 applies a delay corresponding to the phase to each image signal.
  • The output of the screen processing unit 175 is sent to the pulse width conversion unit 176.
  • Since the relationship between the signal level after the above processing and the recording density of the printer unit 2 is not linear, the pulse width conversion unit 176 converts the signal into a pulse width that controls the laser drive time of the laser modulation unit of the printer unit 2, and sends it to the printer unit 2.
  • FIG. 4 shows a configuration of a main part of the image processing device 36 according to the first embodiment.
  • In FIG. 4, illustration of parts other than the image processing unit 154 is omitted.
  • color image data R, G, and B output from the color scanner unit 1 are sent to the density distribution calculation unit 201 via the registration interpolation unit 151.
  • The density distribution calculation unit 201 calculates the density distribution of the colors of the document as a color characteristic of the document, based on the input color image data R, G, and B, and sends the calculation result to the background density level calculation unit 202.
  • the background density level calculation unit 202 calculates the background density level of the document based on the density distribution calculated by the density distribution calculation unit 201, and sends the calculation result to the density conversion table creation unit 203. .
  • The density conversion table creation unit 203 creates, based on the background density level calculated by the background density level calculation unit 202, a density conversion table to be used for the conversion performed by the image conversion unit 204.
  • the image conversion unit 204 converts the image densities of the input color image data R, G, and B based on the density conversion table created by the density conversion table creation unit 203.
  • The density distribution calculation unit 201, which calculates the density distribution of the colors of the document, comprises a multi-value conversion unit 181 as multi-value conversion means as shown in FIG. 5, and a histogram creation unit 182 as histogram creation means as shown in FIG. 6.
  • The multi-value conversion unit 181 performs multi-value conversion processing by comparing the input image data R, G, and B with predetermined thresholds Th1 to Thn-1 and outputs multi-value image signals Rg, Gg, and Bg. As shown in FIG. 5, it is composed of a threshold memory 183 that stores the n-1 thresholds Th1 to Thn-1, comparators 184-1 to 184-(n-1) that each compare the input image data R (G, B) with a threshold from the threshold memory, and an encoder 185 that encodes the comparison results.
  • The operation of the multi-value conversion unit 181 will be described with the number of multi-value levels being n.
  • The input image data R (values from 0 to 255) is compared with the thresholds Th1 to Thn-1 in the threshold memory 183 by the comparators 184-1 to 184-(n-1). Each comparator outputs "0" when the input image data is smaller than the threshold, and outputs "1" otherwise.
  • The encoder 185 encodes the comparison results and outputs the multi-value image signal Rg.
  • In other words, the multi-value conversion unit 181 converts the input image data R into multiple values as follows and outputs the multi-value image signal Rg:
    Rg = 1: Th1 ≤ R < Th2
    Rg = 2: Th2 ≤ R < Th3
    Rg = 3: Th3 ≤ R < Th4
    ...
    Rg = n-2: Thn-2 ≤ R < Thn-1
  • The same calculation is performed on the image data G and B as for the image data R described above, and the multi-value image signals Gg and Bg are obtained.
  • The histogram creation unit 182 creates the histogram information based on the multi-value image signals Rg, Gg, and Bg output from the multi-value conversion unit 181.
  • As shown in FIG. 6, the histogram creation unit 182 is composed of a decoder 186 that decodes the input multi-value image signal Rg (Gg, Bg), n adders 187-0, 187-1, ..., 187-(n-1), and n registers 188-0, 188-1, ..., 188-(n-1).
  • Although FIG. 6 shows only the circuit for the multi-value image signal Rg, the same circuits are actually provided for the multi-value image signals Gg and Bg; their illustration is omitted.
  • Each register 188-0 to 188-(n-1) requires, for example, 25 bits when an A3-size, 400 dpi image is input. The registers 188-0 to 188-(n-1) are cleared to "0" in advance. If the multi-value image signal Rg is "0", the adder 187-0 adds "1", and the register 188-0 holds and outputs the result of the adder 187-0; that is, the adder 187-0 adds the output of the register 188-0 and the output of the decoder 186.
  • Likewise, if the multi-value image signal Rg is "1", "1" is added by the adder 187-1, and if it is "2", "1" is added by the adder 187-2. As a result, histogram information is accumulated in the registers 188-0 to 188-(n-1). These processes are performed independently for each of the multi-value image signals Rg, Gg, and Bg.
  • The frequencies (histogram information) accumulated from the register 188-0 (low-density portion) to the register 188-(n-1) (high-density portion) are denoted RH(0), RH(1), ..., RH(n-1) for the image data R, and GH(0), GH(1), ..., GH(n-1) and BH(0), BH(1), ..., BH(n-1) for the image data G and B.
  • When color features are extracted, they are in principle calculated from the combination of the R, G, and B values (that is, not independently for R, G, and B), and a large number of registers is then required, for example one for every class such as the following:
    g1: Th1 ≤ R < Th2 and G < Th1 and B < Th1
  • In the present invention, however, a color distribution sufficient for its purpose can be extracted by obtaining histogram information independently for each of the image data R, G, and B in the density distribution calculation unit 201, which greatly reduces the required memory.
  • In this case, the number of registers may be n × 3.
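A minimal Python sketch of this per-channel histogram extraction, assuming 8-bit RGB input and n = 16 levels (the function name and array layout are illustrative):

```python
import numpy as np

def channel_histograms(rgb: np.ndarray, n_bins: int = 16) -> np.ndarray:
    """Quantize each 8-bit channel into n_bins levels (the multi-value
    conversion) and count the frequency of each level independently for
    R, G and B (the histogram creation): n_bins x 3 counters instead of
    n_bins**3 joint ones. rgb is an H x W x 3 array of scanned data."""
    levels = (rgb.astype(np.uint16) * n_bins) // 256      # 0 .. n_bins-1
    hist = np.zeros((n_bins, 3), dtype=np.int64)
    for c in range(3):
        hist[:, c] = np.bincount(levels[..., c].ravel(), minlength=n_bins)
    return hist
```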
  • the background density level calculation unit 202 calculates the background density level (or the level of each color) of the read document based on the density distribution information calculated by the density distribution calculation unit 201. .
  • The background density level calculation unit 202 will be described using, as an example, the histogram of FIG. 8 obtained when a monochrome original is read.
  • In the histogram, the horizontal axis represents density and the vertical axis represents frequency.
  • the background density level calculation unit 202 determines the background density level using, for example, the following determination formula.
  • Hmax = max(H(0), H(1), ..., H(Bmax))   (3)
  • BL = the density level at which Hmax occurs   (4)
  • Here, Hmax indicates the maximum density distribution value, BL indicates the calculated background density level, and Bmax indicates the range of the background area.
  • The density distribution values H(0) and H(1) are input to the comparator 301, which outputs the larger value and a selection signal SL1 indicating which is larger.
  • The selector 304 receives the density levels LV0 and LV1 of the density distribution values H(0) and H(1) and, according to the selection signal SL1 output from the comparator 301, selects and outputs the density level of the larger density distribution value. In the example of the density distribution in FIG. 8, H(1) is the larger value, so "1" is output as its density level.
  • The comparator 302 and the selector 305 operate in the same manner; in the example of FIG. 8, H(2) is output as the density distribution value and "2" as the density level.
  • the comparator 303 and the selector 306 receive the outputs of the comparators 301 and 302 and the outputs of the selectors 304 and 305, respectively, and operate in the same manner as the comparators 301 and 302 and the selectors 304 and 305.
  • As a result, H(1) is output as the maximum density distribution value Hmax, and "1" is output as the background density level BL.
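The peak search of equations (3) and (4) can be sketched as follows; hist_channel is one channel's histogram and b_max bounds the low-density region treated as background. With the histogram of FIG. 8 and Bmax = 3, this returns BL = 1, matching the example above:

```python
import numpy as np

def background_density_level(hist_channel: np.ndarray, b_max: int) -> tuple[int, int]:
    """Equations (3) and (4): among the low-density bins 0..Bmax, find the
    maximum frequency Hmax; the bin holding it is the background level BL.
    Returns (BL, Hmax)."""
    low = hist_channel[: b_max + 1]
    bl = int(np.argmax(low))        # density level with the highest frequency
    h_max = int(low[bl])
    return bl, h_max
```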
  • the density conversion table creation unit 203 creates a density conversion table based on the background density level created by the background density level calculation unit 202.
  • FIGS. 10A and 10B show examples of the density conversion table.
  • A table (256 bytes; 256 × 3 bytes for color RGB) is created to convert the 8-bit (256-level) input signal.
  • The background density level BL, calculated in 16 levels by the background density level calculation unit 202, is converted into one of the 256 density levels as described later.
  • the image conversion unit 204 converts the image density based on the density conversion table created by the density conversion table creation unit 203.
  • the background density can be removed.
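A sketch of the table-based conversion, assuming one particular table shape in which densities at or below the background level map to 0 and the remaining range is stretched linearly; the actual curves are those of FIGS. 10A and 10B, and one table would be built per RGB channel:

```python
import numpy as np

def make_density_conversion_table(bl_coarse: int, n_bins: int = 16) -> np.ndarray:
    """Rescale the coarse (16-level) background level to the 256-level axis,
    then build a 256-entry table: densities at or below BL become 0 and the
    rest are stretched linearly to 0..255 (assumed table shape)."""
    bl = (bl_coarse * 256) // n_bins                 # 16-level BL -> 256-level BL
    table = np.zeros(256, dtype=np.uint8)
    for d in range(bl + 1, 256):
        table[d] = (d - bl) * 255 // (255 - bl)
    return table

def convert_channel(channel: np.ndarray, table: np.ndarray) -> np.ndarray:
    """Image conversion unit 204: a simple per-pixel table lookup."""
    return table[channel]
```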
  • In the first embodiment described above, only the background density level is calculated.
  • In the second embodiment, the background density level is calculated in consideration of the spread of the background density distribution.
  • FIG. 11 shows a configuration of a main part of an image processing device 36 according to the second embodiment.
  • The difference from the first embodiment is that a background density distribution calculation unit 205 is used instead of the background density level calculation unit 202; the other points are the same as those of the first embodiment, so the same portions are given the same reference numerals and their description is omitted.
  • FIG. 12 shows a specific configuration example of the background density distribution calculation unit 205, which is composed of three comparators 301, 302, 303, three selectors 304, 305, 306, and an adder 307.
  • The comparator 301 receives the density distribution values H(0) and H(1) and outputs the larger value and a selection signal SL1 indicating which is larger.
  • The selector 304 receives the density levels LV0 and LV1 of the density distribution values H(0) and H(1) and, according to the selection signal SL1 output from the comparator 301, selects and outputs the density level of the larger density distribution value. In the example of the density distribution in FIG. 8, H(1) is the larger value, so "1" is output as its density level.
  • The comparator 302 and the selector 305 operate in the same manner; the density distribution value H(2) and the density level "2" are output.
  • the comparator 303 and the selector 306 are supplied with the outputs of the comparators 301 and 302 and the outputs of the selectors 304 and 305, respectively, and perform the same operations as the comparators 301 and 302 and the selectors 304 and 305.
  • In this way, the background can be removed satisfactorily.
  • In the third embodiment, this processing is performed entirely by hardware operation.
  • FIG. 13 shows a configuration of a main part of an image processing device 36 according to the third embodiment.
  • the difference from the second embodiment lies in that the density conversion table creation unit 203 is deleted, and the background density conversion unit 206 is used instead of the image conversion unit 204. Therefore, the same portions are denoted by the same reference numerals and description thereof will be omitted.
  • FIGS. 14A and 14B show a specific configuration example of the background density conversion unit 206.
  • As shown in FIG. 14A, the unit includes a subtractor 308 that performs subtraction between the input density level Di and the background density level BL, and a divider that performs a division involving the output of the subtractor 308 and the predetermined value "255".
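One plausible reading of the FIG. 14A circuit (subtraction followed by a rescale using the value 255), expressed in Python; the exact arithmetic of the divider is not fully specified in the text, so the scaling used here is an assumption:

```python
import numpy as np

def convert_background_density(d_in: np.ndarray, bl: int) -> np.ndarray:
    """Subtract the background density level BL from the input density Di
    and rescale the remainder to the full 0..255 range, so that densities
    at or below BL become 0 and the background is removed."""
    d = np.clip(d_in.astype(np.int32) - bl, 0, None)   # subtractor 308: Di - BL
    out = d * 255 // max(255 - bl, 1)                  # assumed rescaling step
    return np.clip(out, 0, 255).astype(np.uint8)
```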
  • The fourth embodiment is a configuration that, for an original with a colored background rather than a blank one, is effective for suppressing density unevenness and show-through of the background instead of removing the background.
  • FIG. 15 schematically illustrates a configuration of a main part of an image processing device 36 according to the fourth embodiment.
  • the difference from the third embodiment is that a background density unevenness suppression unit 207 is used instead of the background density conversion unit 206, and the other points are the same as those of the third embodiment.
  • the same reference numerals are given to the same parts, and the description is omitted.
  • Fig. 16 shows a first specific example of the background density unevenness suppression unit 207.
  • As shown in FIG. 16, it is composed of a comparator 312 that compares the background density level BL with the input density level Di, and a selector 313 that selects either the background density level BL or the input density level Di based on the comparison result of the comparator 312.
  • the background density unevenness suppression unit 207 performs the density conversion shown in FIG. 17 based on the background density level BL output from the background density distribution calculation unit 205.
  • the image density equal to or lower than the background density is uniformly replaced with the background density level BL, and unevenness of the background density and show-through can be suppressed.
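A one-line sketch of this first conversion (FIGS. 16 and 17):

```python
import numpy as np

def suppress_background_unevenness(d_in: np.ndarray, bl: int) -> np.ndarray:
    """Replace every input density at or below the background level BL with
    BL itself, so the whole background comes out at one uniform level;
    densities above BL pass through unchanged."""
    return np.where(d_in <= bl, bl, d_in)
```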
  • FIG. 18 shows a second specific example of the configuration of the background density unevenness suppression unit 207.
  • As shown in FIG. 18, it is composed of a subtractor 314 that subtracts a constant level l from the background density level BL, an adder 315 that adds the constant level l to the background density level BL, a comparator 316 that compares the output of the subtractor 314 with the input density level Di, a comparator 317 that compares the output of the adder 315 with the input density level Di, an AND circuit 318 that takes the logical product of the outputs of the comparators 316 and 317, and a selector 319 that selects either the background density level BL or the input density level Di according to the output of the AND circuit 318.
  • The background density unevenness suppression unit 207 is an effective configuration for suppressing density unevenness or show-through of the background in a color original with a colored background, instead of removing the background. It is particularly effective when there is an image lighter than the background (for example, white characters, or the white area of the platen cover outside the original).
  • FIG. 18 shows the second configuration of the background density unevenness suppression unit 207, which performs the density conversion shown in FIG. 19.
  • Based on the background density level BL output from the background density distribution calculation unit 205, the second background density unevenness suppression unit 207 performs the following conversion:
    output level Do = Di, when input density level Di < BL - l or Di > BL + l
    output level Do = BL, when BL - l ≤ input density level Di ≤ BL + l
  • In this way, image densities near the background density are uniformly replaced with the background density level BL, so that density unevenness and show-through of the background can be suppressed, while image densities clearly lower than the background density are preserved.
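A sketch of this second conversion (FIGS. 18 and 19), with l as the constant level around BL:

```python
import numpy as np

def suppress_near_background(d_in: np.ndarray, bl: int, l: int) -> np.ndarray:
    """Replace densities within +/- l of the background level BL by BL; all
    other densities, including those clearly lighter than the background
    (e.g. white characters), pass through unchanged."""
    near_background = (d_in >= bl - l) & (d_in <= bl + l)   # AND of the two comparators
    return np.where(near_background, bl, d_in)
```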
  • The fifth embodiment detects the peak position of the background density in the image density distribution and the skirt positions of the density distribution that indicate its spread, and converts the background density based on the peak position and the skirt positions.
  • FIG. 20 schematically shows a configuration of a main part of an image processing device 36 according to the fifth embodiment.
  • This embodiment differs from the third embodiment in that a background position detection unit 208 and a skirt position detection unit 209 are used instead of the background density distribution calculation unit 205; the other points are the same as those of the third embodiment, so the same portions are denoted by the same reference numerals and their description is omitted.
  • The skirt position detection unit 209 will be described with reference to the density distribution shown in FIG. 21.
  • The skirt positions are the minimum and maximum density levels H(n) whose frequency values (density distribution values) decrease continuously and monotonically from the peak position (Hmax) of the background density BL.
  • In the example of FIG. 21, the minimum density level BLmin is "0" and the maximum density level BLmax is "3".
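A sketch of the skirt search, walking outward from the peak bin while the frequency keeps decreasing monotonically:

```python
import numpy as np

def skirt_positions(hist: np.ndarray, bl: int) -> tuple[int, int]:
    """Find BLmin and BLmax: step away from the peak bin bl in both
    directions for as long as the frequency keeps decreasing."""
    lo = bl
    while lo > 0 and hist[lo - 1] < hist[lo]:
        lo -= 1
    hi = bl
    while hi < len(hist) - 1 and hist[hi + 1] < hist[hi]:
        hi += 1
    return lo, hi
```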
  • FIG. 22 shows an example of density conversion based on the skirt positions BLmin and BLmax; density conversion such as that indicated by the solid line or the dashed line can be performed.
  • Here, the density conversion is applied to input image densities included in a density region of a predetermined width centered on the background density BL.
  • the density conversion is performed according to the width of the background density unevenness, so that more accurate image density conversion can be performed.
  • The positional accuracy of the background density is determined by the number of density divisions of the density distribution calculation unit 201; that is, if the number of divisions is n, the position is only accurate to 256/n levels.
  • Here, p is the density position having the peak frequency on the histogram, and is one of 0 to 15 when the number of density divisions is 16.
  • BL is the background density; if the resolution of the scanner unit 1 is 8 bits, it is one of 0 to 255.
  • To improve the accuracy, the frequency H(p) at the peak position p is weighted using the frequencies H(p-1) and H(p+1) before and after the peak position.
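The weighting formula itself is not reproduced in this excerpt; the sketch below therefore uses a simple centre-of-gravity of the peak bin and its two neighbours as an assumed refinement from the coarse bin index p to an 8-bit BL:

```python
import numpy as np

def refined_background_level(hist: np.ndarray, p: int, n_bins: int = 16) -> int:
    """Refine the coarse peak bin p (0..n_bins-1) to an 8-bit background
    density BL (0..255) by weighting H(p) with H(p-1) and H(p+1)."""
    bin_width = 256 // n_bins
    h_prev = hist[p - 1] if p > 0 else 0
    h_next = hist[p + 1] if p < n_bins - 1 else 0
    total = h_prev + hist[p] + h_next
    if total == 0:
        return p * bin_width + bin_width // 2
    offset = (h_next - h_prev) / total        # shift toward the heavier neighbour
    centre = (p + 0.5 + offset) * bin_width   # bin centre plus weighted shift
    return int(np.clip(round(centre), 0, 255))
```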
  • The sixth embodiment has a configuration in which it is determined whether or not the background should be removed, and the background is removed only when the document is one whose background should be removed.
  • FIG. 24 schematically shows a configuration of a main part of an image processing device 36 according to the sixth embodiment.
  • the difference from the third embodiment resides in that a background presence / absence determination unit 210 for determining whether to remove the background is added, and the other points are the same as in the third embodiment. Are denoted by the same reference numerals and description thereof is omitted.
  • FIG. 25 shows the configuration of the background presence / absence determining unit 210.
  • As shown in FIG. 25, the background presence/absence determination unit 210 is composed of comparators 321, 322, and 323 that compare the background density levels BLr, BLg, and BLb, calculated for each of the color image data R, G, and B, with a predetermined threshold th1; a subtractor 324 that takes the difference between the background density levels BLr and BLg, a subtractor 325 that takes the difference between the background density levels BLg and BLb, and a subtractor 326 that takes the difference between the background density levels BLb and BLr; comparators 327, 328, and 329 that compare the respective outputs of the subtractors 324, 325, and 326 with a predetermined threshold th2; an AND circuit 330 that takes the logical product of the outputs of the comparators 321, 322, and 323; an AND circuit 331 that takes the logical product of the outputs of the comparators 327, 328, and 329; and an OR circuit 332 that receives the outputs of the AND circuits 330 and 331.
  • That is, when the background density levels BLr, BLg, and BLb of the image data R, G, and B are below a predetermined level and the differences between the levels of the three channels are small, the background is removed.
  • In this way, whether a given original should have its background removed is determined, and the background removal process is performed appropriately according to the type of original.
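A sketch of the decision just described; how the OR circuit 332 combines the two AND outputs is not fully specified here, so the conjunction below is an assumption:

```python
def should_remove_background(bl_r: int, bl_g: int, bl_b: int,
                             th1: int, th2: int) -> bool:
    """Background presence/absence decision: the background is treated as
    removable when the per-channel background levels are all below th1 and
    the three channels differ from one another by less than th2 (i.e. the
    background is light and nearly achromatic)."""
    light_enough = bl_r < th1 and bl_g < th1 and bl_b < th1   # comparators 321-323, AND 330
    nearly_gray = (abs(bl_r - bl_g) < th2 and
                   abs(bl_g - bl_b) < th2 and
                   abs(bl_b - bl_r) < th2)                    # subtractors 324-326, comparators 327-329, AND 331
    return light_enough and nearly_gray
```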
  • a character / background area is determined from a target document, and the background density is converted based on the determination result.
  • FIG. 26 illustrates a configuration of a main part of an image processing device 36 according to the seventh embodiment.
  • The seventh embodiment is composed of the density distribution calculation unit 201, the background density distribution calculation unit 205, the background density conversion unit 206, and a character/background determination unit 211 that determines character/background areas as area determination means.
  • Since the density distribution calculation unit 201, the background density distribution calculation unit 205, and the background density conversion unit 206 are the same as those in the third embodiment described above, the same reference numerals are given and detailed description is omitted.
  • The character/background determination unit 211 determines character and background areas, that is, areas where characters are written on a background, from the input image data R, G, and B.
  • the character / background determination unit 211 determines, for example, an area where the density gradient changes rapidly as a character / background area.
  • The density distribution calculation unit 201 calculates the density distribution of the input image data R, G, B for the areas determined as character/background areas by the character/background determination unit 211.
  • The background density distribution calculation unit 205 calculates the background density distribution of the document based on the density distribution calculated by the density distribution calculation unit 201.
  • the background density conversion unit 206 converts the background density of the input image data R, G, B based on the background density distribution calculated by the background density distribution calculation unit 205.
  • a non-photographic area is determined from a target document, and the background density is converted based on the determination result.
  • FIG. 27 schematically shows a configuration of a main part of an image processing device 36 according to the eighth embodiment.
  • The eighth embodiment is composed of the density distribution calculation unit 201, the background density distribution calculation unit 205, the background density conversion unit 206, and a non-photo determination unit 212 that determines non-photograph areas as area determination means. Since the density distribution calculation unit 201, the background density distribution calculation unit 205, and the background density conversion unit 206 are the same as those in the third embodiment, the same reference numerals are given and detailed description is omitted.
  • The non-photo determination unit 212 determines non-photograph areas from the input image data R, G, and B.
  • The density distribution calculation unit 201 calculates the density distribution of the input image data R, G, B.
  • the background density distribution calculation unit 205 calculates the background density distribution of the document based on the density distribution calculated by the density distribution calculation unit 201. Based on the background density distribution calculated by the background density distribution calculation unit 205, the background density conversion unit 206 determines the non-photographic area of the input image data R, G, and B by the non-photo determination unit 212. The background density is converted for the area determined to be.
  • a character / background area is determined from a target document, and the background density is converted based on the determination result.
  • FIG. 28 schematically illustrates a configuration of a main part of the image processing device 36 according to the ninth embodiment.
  • the ninth embodiment includes a density distribution calculation unit 201, a background density distribution calculation unit 205, a background density conversion unit 206, and a character / base determination unit 211.
  • Since the density distribution calculation unit 201, the background density distribution calculation unit 205, the background density conversion unit 206, and the character/background determination unit 211 are the same as those in the seventh embodiment described above, the same reference numerals are given and detailed description is omitted.
  • The character/background determination unit 211 determines character/background areas from the input image data R, G, and B.
  • The density distribution calculation unit 201 calculates the density distribution of the input image data R, G, B.
  • The background density distribution calculation unit 205 calculates the background density distribution of the original based on the density distribution calculated by the density distribution calculation unit 201.
  • The background density conversion unit 206 converts the background density for the areas of the input image data R, G, and B that are determined as character/background areas by the character/background determination unit 211.
  • FIG. 29 schematically shows a configuration of a main part of the image processing device 36 according to the tenth embodiment.
  • the density distribution calculation unit 201 and the background density distribution calculation unit 205 are the same as those in the third embodiment described above, and therefore, are denoted by the same reference numerals and detailed description thereof will be omitted.
  • The character/background/photograph determination unit 213 determines character/background areas and photograph areas from the input image data R, G, and B.
  • the density distribution calculator 201 calculates the density distribution of the input image data R, G, B.
  • the background density distribution calculation unit 205 calculates the background density distribution of the document based on the density distribution calculated by the density distribution calculation unit 201.
  • the background density conversion unit 206 a converts the background density of the input image data R, G, and B based on the background density distribution calculated by the background density distribution calculation unit 205.
  • The background density conversion unit 206b converts the background density of the input image data R, G, and B based on the background density distribution calculated by the background density distribution calculation unit 205, in a manner different from that of the background density conversion unit 206a.
  • The selector 214 selects the output of the background density conversion unit 206a when the character/background/photo determination unit 213 determines that the area is a character/background area, and selects the output of the background density conversion unit 206b when the character/background/photo determination unit 213 determines that the area is a photograph area.
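A sketch of the region-dependent selection performed by the selector 214, using two 256-entry lookup tables as stand-ins for the conversion units 206a and 206b (their exact shapes are assumptions):

```python
import numpy as np

def convert_by_region(channel: np.ndarray,
                      is_text_region: np.ndarray,
                      table_text: np.ndarray,
                      table_photo: np.ndarray) -> np.ndarray:
    """Apply one background density conversion to pixels judged to belong to
    character/background areas and a different one to photograph areas,
    then merge the results pixel by pixel as the selector does."""
    out_text = table_text[channel]
    out_photo = table_photo[channel]
    return np.where(is_text_region, out_text, out_photo)
```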
  • it is determined whether a target document is a color document or a monochrome document, and the background density is converted by a different method for a color document than for a monochrome document.
  • FIG. 30 illustrates a configuration of a main part of the image processing device 36 according to the eleventh embodiment.
  • the eleventh embodiment is composed of a density distribution calculation unit 201, a background density distribution calculation unit 205, background density conversion units 206a and 206b, a selector 214, and a color/monochrome setting unit 215 for manually setting whether the target document is a color document or a monochrome document.
  • the density distribution calculation unit 201, the background density distribution calculation unit 205, the background density conversion units 206a and 206b, and the selector 214 are the same as those in the tenth embodiment described above; therefore, they are denoted by the same reference numerals and detailed description is omitted.
  • the color/monochrome setting unit 215 sets whether the target document is a color document or a monochrome document.
  • the density distribution calculator 201 calculates the density distribution of the input image data R, G, B.
  • the background density distribution calculation unit 205 calculates the background density distribution of the document based on the density distribution calculated by the density distribution calculation unit 201.
  • the background density conversion unit 206a converts the background density of the input image data R, G, and B based on the background density distribution calculated by the background density distribution calculation unit 205, for example as shown in FIG.
  • the background density conversion unit 206b converts the background density of the input image data R, G, and B based on the background density distribution calculated by the background density distribution calculation unit 205, in a manner different from the background density conversion unit 206a, for example as shown in FIG. 10B (two illustrative conversion curves are sketched after this list).
  • the selector 214 selects the output of the background density conversion unit 206a when the document is set to color by the color/monochrome setting unit 215, and selects the output of the background density conversion unit 206b when the document is set to monochrome by the color/monochrome setting unit 215.
  • FIG. 31 shows a configuration of a main part of the image processing device 36 according to the twelfth embodiment.
  • the twelfth embodiment is composed of a density distribution calculation unit 201, a background density distribution calculation unit 205, background density conversion units 206a and 206b, a selector 214, and a color/monochrome document determination unit 216 that automatically determines whether the target document is a color document or a monochrome document.
  • the density distribution calculation unit 201, the background density distribution calculation unit 205, the background density conversion units 206a and 206b, and the selector 214 are the same as those in the eleventh embodiment described above; therefore, they are denoted by the same reference numerals and detailed description is omitted.
  • the color/monochrome document determination unit 216 determines whether the target document is a color document or a monochrome document based on the density differences among the input image data R, G, and B, for example as in equation (14) (a simplified version of such a test is sketched after this list).
  • the density distribution calculation unit 201 calculates the density distribution of the input image data R, G, B.
  • the background density distribution calculation unit 205 calculates the background density distribution of the document based on the density distribution calculated by the density distribution calculation unit 201.
  • the background density conversion unit 206a converts the background density of the input image data R, G, and B based on the background density distribution calculated by the background density distribution calculation unit 205.
  • the background density conversion unit 206b converts the background density of the input image data R, G, and B based on the background density distribution calculated by the background density distribution calculation unit 205, in a manner different from the background density conversion unit 206a.
  • the selector 214 selects the output of the background density conversion unit 206a when the color/monochrome document determination unit 216 determines that the target document is a color document, and selects the output of the background density conversion unit 206b when the determination unit 216 determines that the document is a monochrome document.
  • the background density is reduced and the character density is maintained.
  • the show-through image from the back side of the document is thinned out while the density of the front-side image is maintained.
  • the background density of the character area is converted to another value, while the density of the photograph portion is faithfully preserved so that its color and density do not change.
  • show-through can be suppressed while the background color is preserved, and the density unevenness of the background can be reduced at the same time.
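
The bullets above describe the density distribution calculation unit 201 and the background density distribution calculation unit 205 only in functional terms. The following is a minimal, hypothetical sketch of how a density histogram and a background density could be obtained from 8-bit channel data; the function names, the light-side search range, and the use of NumPy are assumptions introduced here for illustration and are not taken from the specification. Higher pixel values are treated as lighter.

```python
import numpy as np

def density_distribution(channel: np.ndarray, bins: int = 256) -> np.ndarray:
    """Histogram of pixel values for one 8-bit channel (the role of unit 201)."""
    hist, _ = np.histogram(channel.ravel(), bins=bins, range=(0, 256))
    return hist

def background_density(hist: np.ndarray, search_from: int = 128) -> int:
    """Take the dominant peak on the light side of the histogram as the
    background density (a crude stand-in for the role of unit 205)."""
    return search_from + int(np.argmax(hist[search_from:]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic page: light grey background (~230) with sparse dark "characters".
    page = np.full((64, 64), 230, dtype=np.uint8)
    page[rng.random(page.shape) < 0.05] = 40
    print("estimated background density:", background_density(density_distribution(page)))
```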
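
Units 206a and 206b are described only as converting the background density in different ways (cf. FIG. 10). The sketch below shows two plausible conversion curves, continuing the assumptions of the previous sketch: an aggressive curve that also lifts the band just below the background level to white (suitable for character/background areas, where show-through and unevenness should disappear), and a conservative curve that whitens only pixels at or above the background level (so photograph densities are largely preserved). The margin and the curve shapes are invented for illustration and are not the curves of FIG. 10.

```python
import numpy as np

def convert_background_aggressive(channel: np.ndarray, bg: int, margin: int = 20) -> np.ndarray:
    """Whiten the background and the band just below it (show-through, unevenness)."""
    out = channel.astype(np.float64)
    lo = max(bg - margin, 1)
    band = (out >= lo) & (out < bg)
    # Stretch the [lo, bg) band toward white so faint show-through disappears.
    out[band] = lo + (out[band] - lo) * (255.0 - lo) / max(bg - lo, 1)
    out[out >= bg] = 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

def convert_background_conservative(channel: np.ndarray, bg: int) -> np.ndarray:
    """Whiten only pixels at or above the background level; keep everything else."""
    out = channel.copy()
    out[out >= bg] = 255
    return out
```

With the estimate from the previous sketch, convert_background_aggressive(page, 230) turns the grey background pure white while the dark character pixels are left untouched.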
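
The selector 214 of the tenth embodiment merely routes each area to one of the two conversions according to the decision of the determination unit 213. This excerpt does not state how unit 213 separates character/background areas from photograph areas, so the block-variance test below is a purely illustrative assumption; the two conversion callables are meant to be, for example, the functions of the previous sketch.

```python
import numpy as np
from typing import Callable

Conversion = Callable[[np.ndarray, int], np.ndarray]

def looks_like_photo(tile: np.ndarray, var_threshold: float = 900.0) -> bool:
    """Crude stand-in for unit 213: photograph-like tiles contain many intermediate
    densities and therefore a higher variance than flat background with sparse text."""
    return float(tile.var()) > var_threshold

def convert_with_area_selector(channel: np.ndarray, bg: int,
                               conv_text: Conversion, conv_photo: Conversion,
                               block: int = 16) -> np.ndarray:
    """Apply conv_text (206a) to character/background blocks and conv_photo (206b)
    to photograph blocks, mirroring the role of selector 214."""
    out = np.empty_like(channel)
    h, w = channel.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = channel[y:y + block, x:x + block]
            conv = conv_photo if looks_like_photo(tile) else conv_text
            out[y:y + block, x:x + block] = conv(tile, bg)
    return out
```

For example, convert_with_area_selector(page, 230, convert_background_aggressive, convert_background_conservative) cleans the background of text regions while leaving photograph-like regions essentially as scanned.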
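
Equation (14), by which the color/monochrome document determination unit 216 decides from the density differences among R, G, and B whether the document is color or monochrome, is not reproduced in this excerpt. The sketch below is therefore only a hedged approximation of the idea: a document is treated as color when a sufficient fraction of its pixels shows a large spread between the three channel values. Both thresholds are invented for illustration, and the assignment of 206a to color and 206b to monochrome follows the selector description above.

```python
import numpy as np
from typing import Callable

Conversion = Callable[[np.ndarray, int], np.ndarray]

def is_color_document(r: np.ndarray, g: np.ndarray, b: np.ndarray,
                      diff_threshold: int = 30, pixel_ratio: float = 0.01) -> bool:
    """Approximation of unit 216: a large R/G/B spread on enough pixels => color."""
    stack = np.stack([r, g, b]).astype(np.int16)
    spread = stack.max(axis=0) - stack.min(axis=0)   # per-pixel channel spread
    return float(np.mean(spread > diff_threshold)) > pixel_ratio

def convert_by_document_type(r, g, b, bg: int,
                             conv_color: Conversion, conv_mono: Conversion):
    """Selector 214 of the twelfth embodiment: 206a for color, 206b for monochrome."""
    conv = conv_color if is_color_document(r, g, b) else conv_mono
    return tuple(conv(ch, bg) for ch in (r, g, b))
```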

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Color Image Communication Systems (AREA)
  • Control Or Security For Electrophotography (AREA)
  • Color Electrophotography (AREA)
  • Image Processing (AREA)
PCT/JP1998/005494 1997-12-19 1998-12-04 画像処理装置 Ceased WO2004080057A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/367,694 US6567544B1 (en) 1997-12-19 1998-12-04 Image processing apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP9/350884 1997-12-19
JP35088497A JP3845509B2 (ja) 1997-12-19 1997-12-19 画像処理装置および画像形成装置

Publications (1)

Publication Number Publication Date
WO2004080057A1 true WO2004080057A1 (ja) 2004-09-16

Family

ID=18413554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1998/005494 Ceased WO2004080057A1 (ja) 1997-12-19 1998-12-04 画像処理装置

Country Status (3)

Country Link
US (1) US6567544B1 (en)
JP (1) JP3845509B2 (en)
WO (1) WO2004080057A1 (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001094810A (ja) * 1999-09-22 2001-04-06 Toshiba Tec Corp 画像処理方法及び画像処理装置並びに画像形成装置
JP3768052B2 (ja) 1999-12-14 2006-04-19 株式会社リコー カラー画像処理方法、カラー画像処理装置、及びそのための記録媒体
US6920245B1 (en) 2000-11-22 2005-07-19 Kabushiki Kaisha Toshiba Image processing apparatus and image forming apparatus
JP4509415B2 (ja) * 2001-04-12 2010-07-21 株式会社リコー 画像処理装置
US7167261B2 (en) * 2001-12-28 2007-01-23 Kabushiki Kaisha Toshiba Image forming apparatus with predetermined copy quality set by user or operator
US7277596B2 (en) 2002-04-10 2007-10-02 Ricoh Company, Ltd. Apparatus configured to eliminate image data show-through
US7466445B2 (en) * 2003-07-14 2008-12-16 Toshiba Corporation Color and density calibration of color printers
US20050078867A1 (en) * 2003-10-14 2005-04-14 Kabushiki Kaisha Toshiba System and method for generating black and white reproductions of color documents
JP4553297B2 (ja) * 2004-08-02 2010-09-29 株式会社リコー 画像処理装置
US7362470B2 (en) * 2005-03-17 2008-04-22 Kabushiki Kaisha Toshiba Color image processing apparatus
US20060274376A1 (en) * 2005-06-06 2006-12-07 Lexmark International, Inc. Method for image background detection and removal
JP4504294B2 (ja) * 2005-09-30 2010-07-14 株式会社東芝 映像信号処理装置及び映像信号処理方法
JP4282081B2 (ja) * 2005-10-31 2009-06-17 キヤノン株式会社 画像処理装置およびその方法
JP2007166562A (ja) * 2005-12-17 2007-06-28 Fuji Xerox Co Ltd 色変換装置および色変換方法、色変換プログラム、記憶媒体
US8018494B2 (en) * 2006-01-10 2011-09-13 Panasonic Corporation Color correction device, color correction method, dynamic camera color correction device, and video search device using the same
JP4227627B2 (ja) * 2006-04-24 2009-02-18 キヤノン株式会社 インクジェット記録装置および画像処理方法
US7679796B2 (en) 2007-02-02 2010-03-16 Kabushiki Kaisha Toshiba Image processing apparatus and image processing method
US8049908B2 (en) 2007-02-02 2011-11-01 Kabushiki Kaisha Toshiba Image processing apparatus and image processing method
US7826112B2 (en) * 2007-10-24 2010-11-02 Kabushiki Kaisha Toshiba Color conversion apparatus and color conversion method
JP4721077B2 (ja) * 2008-10-23 2011-07-13 ソニー株式会社 画像処理装置、画像処理方法、及び、プログラム
JP5293444B2 (ja) * 2009-06-18 2013-09-18 船井電機株式会社 スケーリング処理システム、映像出力装置及び再生装置
JP5321275B2 (ja) * 2009-06-18 2013-10-23 船井電機株式会社 映像表示システム、映像出力装置、及び映像再生装置
JP2011155620A (ja) * 2010-01-28 2011-08-11 Kyocera Mita Corp 画像処理装置,画像形成装置
JP5322980B2 (ja) * 2010-03-12 2013-10-23 京セラドキュメントソリューションズ株式会社 画像処理装置,画像形成装置
JP5216799B2 (ja) 2010-03-17 2013-06-19 京セラドキュメントソリューションズ株式会社 画像処理装置,画像形成装置
JP2012034004A (ja) * 2010-07-28 2012-02-16 Kyocera Mita Corp 画像処理装置,画像形成装置
JP5478428B2 (ja) * 2010-08-31 2014-04-23 京セラドキュメントソリューションズ株式会社 画像処理装置
JP5361830B2 (ja) * 2010-08-31 2013-12-04 京セラドキュメントソリューションズ株式会社 画像処理装置
JP5241796B2 (ja) * 2010-10-29 2013-07-17 京セラドキュメントソリューションズ株式会社 画像形成装置
CN102455628B (zh) 2010-10-29 2014-08-13 京瓷办公信息系统株式会社 图像形成装置
JP5736241B2 (ja) * 2011-06-08 2015-06-17 京セラドキュメントソリューションズ株式会社 画像形成装置
JP5942546B2 (ja) * 2012-03-30 2016-06-29 ブラザー工業株式会社 画像処理装置
JP5949399B2 (ja) 2012-09-28 2016-07-06 ブラザー工業株式会社 画像処理装置およびプログラム
JP6756171B2 (ja) * 2016-07-11 2020-09-16 コニカミノルタ株式会社 画像処理装置、画像形成装置、画像形成システム、及び画像処理プログラム

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0368270A (ja) * 1989-08-08 1991-03-25 Fuji Xerox Co Ltd 画像処理装置
US4982128A (en) * 1990-02-01 1991-01-01 Mcdonald Maurice F Double air gap alternator
JPH05183749A (ja) * 1991-02-26 1993-07-23 Sharp Corp 閾値決定方法

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0344268A (ja) * 1989-07-12 1991-02-26 Matsushita Electric Ind Co Ltd 下地除去装置
JPH0437259A (ja) * 1990-05-31 1992-02-07 Fuji Xerox Co Ltd 画像処理装置
JPH0437258A (ja) * 1990-05-31 1992-02-07 Fuji Xerox Co Ltd 画像処理装置
JPH0490258A (ja) * 1990-08-02 1992-03-24 Canon Inc 画像処理装置
JPH04313744A (ja) * 1991-01-25 1992-11-05 Fuji Xerox Co Ltd 画像記録装置の地肌除去方式
JPH0730757A (ja) * 1993-07-06 1995-01-31 Sharp Corp 画像データ処理装置
JPH1013681A (ja) * 1996-06-20 1998-01-16 Toshiba Corp 画像形成装置
JPH1093835A (ja) * 1997-07-31 1998-04-10 Canon Inc 画像処理装置及び方法

Also Published As

Publication number Publication date
JP3845509B2 (ja) 2006-11-15
US6567544B1 (en) 2003-05-20
JPH11187266A (ja) 1999-07-09

Similar Documents

Publication Publication Date Title
WO2004080057A1 (ja) 画像処理装置
JP3891654B2 (ja) 画像形成装置
JP3777785B2 (ja) 画像処理装置
JP3700381B2 (ja) 画像処理装置
US5329385A (en) Color correction device having a color conversion function
JPH1169164A (ja) 画像符号化方法および画像符号化装置および画像復号化装置および画像形成装置
JPH11266372A (ja) カラー画像処理装置
US7324244B2 (en) Image forming apparatus and image forming method
JPH11187255A (ja) 画像形成装置
JP2008271488A (ja) 画像処理方法、画像処理装置、画像形成装置、画像読取装置、コンピュータプログラム、及び記録媒体
US7221480B2 (en) Image processing apparatus and image forming apparatus
JP3637001B2 (ja) 画像処理装置及び画像処理システム
JP4101983B2 (ja) 画像処理装置
JPH1141473A (ja) 画像処理装置と画像記録装置と画像形成装置
JP4085932B2 (ja) 画像形成装置
JP5760426B2 (ja) 画像形成装置、画像処理方法及びプログラム
JP2002223369A (ja) 画像処理方法および画像処理装置ならびに画像形成装置
US7551320B2 (en) Image processing apparatus, image forming apparatus, control program, and computer-readable recording medium
JP3961074B2 (ja) 画像処理装置と画像形成装置
JPH05236277A (ja) 画像処理装置
JP2004102551A (ja) 画像処理装置および画像処理方法並びにそれを備えた画像読取装置、画像形成装置、プログラム、記録媒体
JP4043982B2 (ja) 画像処理装置、画像形成装置、画像処理方法、画像処理プログラム、並びにそれを記録したコンピュータ読み取り可能な記録媒体
JP4549227B2 (ja) 画像処理装置、画像形成装置、画像処理方法、コンピュータプログラム及び記録媒体
JP3816298B2 (ja) 画像処理装置および画像形成装置
JPH11266360A (ja) 画像処理装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 09367694

Country of ref document: US

AK Designated states

Kind code of ref document: A1

Designated state(s): US