CN101465939B - Image processing apparatus, image processing method


Info

Publication number
CN101465939B
CN101465939B, CN2008101864570A, CN200810186457A
Authority
CN
China
Prior art keywords
reading
original image
unit
image data
area sensor
Prior art date
Legal status: Expired - Fee Related
Application number
CN2008101864570A
Other languages
Chinese (zh)
Other versions
CN101465939A (en)
Inventor
村松瑞纪
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc
Publication of CN101465939A
Application granted
Publication of CN101465939B

Landscapes

  • Editing Of Facsimile Originals (AREA)
  • Image Input (AREA)
  • Image Processing (AREA)
  • Facsimile Scanning Arrangements (AREA)

Abstract

An image processing apparatus is provided that includes: an area sensor unit that reads image data items corresponding to frames from an original image; a correction unit that corrects the inclination of the image data items; a high-resolution conversion unit that obtains, through interpolation, image data with a resolution higher than the pixel sensors' resolution; a maximum frame number storage unit that stores data of the maximum number of frames of the obtained image data items; a resolution setting unit that sets a resolution for outputting the original image; a necessary frame number acquisition unit that obtains, based on the setting result, the number of frames necessary to perform the high-resolution conversion; a read count calculation unit that calculates the number of times the original image must be read, using the necessary frame number acquisition unit and the maximum frame number storage unit; and a read count control unit that reads the original image the determined number of times.

Description

Image processing apparatus and image processing method
Technical field
The present invention relates to an image processing apparatus, a method of controlling the image processing apparatus, a program for executing the image processing method, and a storage medium storing the program.
Background art
There is a known technique, called super-resolution processing or super-resolution conversion, that improves resolution by using image data of multiple frames, each having a predetermined resolution. Using this technique, low-resolution images can be converted into a high-definition image by a known device ("Super-resolution processing performed based on a plurality of digital image-data items", Ricoh Technical Report No. 24, November 1998).
To apply the super-resolution technique, original-image data items corresponding to a plurality of frames must be prepared, in which the positions at which the original image is read differ slightly from one another on the order of a sub-pixel (less than one pixel). Super-resolution techniques are therefore widely used in fields such as video processing.
However, to perform super-resolution processing, image data of a plurality of frames must be prepared so as to generate each pixel of the high-resolution image, which increases both the amount of data and the amount of computation.
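To make the technique concrete, here is a minimal shift-and-add sketch of super-resolution from sub-pixel-shifted low-resolution frames. It is illustrative only: the patent does not specify the interpolation at this level of detail, the names are ours, and the per-frame offsets are assumed to be known.

    import numpy as np

    def shift_and_add(frames, offsets, scale):
        # frames: list of HxW grayscale arrays; offsets: per-frame (dy, dx)
        # in low-resolution pixels, each less than one pixel; scale: integer
        # magnification factor of the output grid.
        h, w = frames[0].shape
        acc = np.zeros((h * scale, w * scale))
        hits = np.zeros_like(acc)
        for frame, (dy, dx) in zip(frames, offsets):
            for y in range(h):
                for x in range(w):
                    # Place each low-resolution sample on the high-resolution grid.
                    hy = min(int(round((y + dy) * scale)), h * scale - 1)
                    hx = min(int(round((x + dx) * scale)), w * scale - 1)
                    acc[hy, hx] += frame[y, x]
                    hits[hy, hx] += 1.0
        hits[hits == 0] = 1.0   # cells no frame landed on stay zero
        return acc / hits       # average the contributing samples per cell

With more frames, more high-resolution cells receive at least one sample, which is why the number of required frames grows with the output resolution, as the embodiments below quantify.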
Therefore, in the past, as described in Japanese Patent Laid-Open No. 2006-092450, the number of frames used for super-resolution processing has been determined based on the size of an image region of interest, so as to reduce the amount of computation.
According to that known technique, however, the number of images used for super-resolution processing is determined only for the region of interest. The number of frame image data items to be prepared for performing super-resolution processing over the entire image region must therefore be obtained in advance.
Furthermore, when super-resolution processing is applied to a multi-function peripheral (MFP) serving as an image processing apparatus, a line sensor is usually used as the reader provided in the MFP, a scanner, or the like.
That is, one read frame is obtained by one read operation. In such a reader, the original image is read by groups of pixel sensors arranged horizontally in the main scanning direction at pitches whose values correspond to integral multiples of a pixel. Such a reader therefore has difficulty reading an original image with offsets as small as a sub-pixel between reading positions in the main scanning direction.
Accordingly, when the pixel positions to be read are to be offset slightly from one another in the main scanning direction and/or the sub-scanning direction, an area sensor is installed in the image processing apparatus so as to be inclined with respect to a reference installation position, whereby frame image data can be obtained by one read operation. In this case, however, the low-resolution frame image data corresponding to the frames that are read is used regardless of the conditions required of the output image data. It is therefore often difficult to reproduce an output image with the required quality based on the low-resolution frame image data that is read.
Summary of the invention
Accordingly, the present invention provides an image processing apparatus, and an image processing method for the apparatus, that realize super-resolution processing in response to a demand on output image quality.
An image processing apparatus according to a first aspect of the present invention comprises: an area sensor unit for reading, from an original image, image data items corresponding to a plurality of frames, the image data items having at least one offset of less than one pixel between them; a correction unit for correcting the inclination of each image data item obtained by the area sensor unit; a high-resolution conversion unit for obtaining image data with a resolution higher than the resolution used during data reading, by performing interpolation processing using the corrected image data items; a maximum frame number storage unit for storing data of the maximum number of frames of image data items obtainable by the area sensor unit; a resolution setting unit for setting a resolution for outputting the original image; a necessary frame number acquisition unit for acquiring, based on the result set by the resolution setting unit, the number of frames of image data items necessary to perform the high-resolution conversion; a read count calculation unit for calculating the number of times the original image is read, using the necessary frame number acquired by the necessary frame number acquisition unit and the maximum frame number stored in the maximum frame number storage unit; and a read count control unit for reading the original image the number of times determined by the read count calculation unit.
According to a second aspect of the present invention, an image processing method is provided for an image processing apparatus comprising: an area sensor unit for reading, from an original image, image data items corresponding to a plurality of frames, the image data items having at least one offset of less than one pixel between them; a correction unit for correcting the inclination of each image data item obtained by the area sensor unit; and a high-resolution conversion unit for obtaining image data with a resolution higher than the resolution used during data reading, by performing interpolation processing using the corrected image data items. The image processing method comprises: a maximum frame number storage step of storing data of the maximum number of frames of image data items obtainable by the area sensor unit; a resolution setting step of setting a resolution for outputting the original image; a necessary frame number acquisition step of acquiring, based on the result set in the resolution setting step, the number of frames of image data items necessary to perform the high-resolution conversion; a read count calculation step of calculating the number of times the original image is read, using the necessary frame number acquired in the necessary frame number acquisition step and the maximum frame number stored in the maximum frame number storage step; and a read count control step of reading the original image the number of times determined in the read count calculation step.
An image processing apparatus according to a third aspect of the present invention comprises: an area sensor unit including a plurality of sensors, the plurality of sensors including a first sensor and a second sensor adjacent to the first sensor, wherein the first sensor and the second sensor are arranged so that a first reading position at which the first sensor reads the original image and a second reading position at which the second sensor reads the original image are offset by less than one pixel; a high-resolution conversion unit for obtaining image data with a resolution higher than the resolution used during reading, by performing interpolation processing using the image data items corresponding to a plurality of frames read by the area sensor unit; a maximum frame number storage unit for storing data of the maximum number of frames of image data items obtainable by the area sensor unit; a resolution setting unit for setting a resolution for outputting the original image; a necessary frame number acquisition unit for acquiring, based on the result set by the resolution setting unit, the number of frames of image data items necessary to perform the high-resolution conversion; a read count calculation unit for calculating the number of times the original image is read, using the necessary frame number acquired by the necessary frame number acquisition unit and the maximum frame number stored in the maximum frame number storage unit; and a read count control unit for reading the original image the number of times determined by the read count calculation unit.
According to a fourth aspect of the present invention, an image processing method is provided for an image processing apparatus comprising: an area sensor unit including a plurality of sensors, the plurality of sensors including a first sensor and a second sensor adjacent to the first sensor, wherein the first sensor and the second sensor are arranged so that a first reading position at which the first sensor reads the original image and a second reading position at which the second sensor reads the original image are offset by less than one pixel; and a high-resolution conversion unit for obtaining image data with a resolution higher than the resolution used during reading, by performing interpolation processing using the image data items corresponding to a plurality of frames read by the area sensor unit. The image processing method comprises: a maximum frame number storage step of storing data of the maximum number of frames of image data items obtainable by the area sensor unit; a resolution setting step of setting a resolution for outputting the original image; a necessary frame number acquisition step of acquiring, based on the result set in the resolution setting step, the number of frames of image data items necessary to perform the high-resolution conversion; a read count calculation step of calculating the number of times the original image is read, using the necessary frame number acquired in the necessary frame number acquisition step and the maximum frame number stored in the maximum frame number storage step; and a read count control step of reading the original image the number of times determined in the read count calculation step.
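The four aspects share one control flow: set an output resolution, obtain the number of frames necessary for it, and derive the number of read operations from the maximum number of frames one scan can deliver. A minimal sketch of that flow, with illustrative names not taken from the patent:

    import math

    def reads_needed(necessary_frames: int, max_frames_per_scan: int) -> int:
        # One scan of the area sensor yields at most max_frames_per_scan
        # low-resolution frames, so the original must be read enough times
        # to collect all the necessary frames.
        return math.ceil(necessary_frames / max_frames_per_scan)

Concrete numbers for both quantities appear in the embodiments below.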
The present invention allows the low-resolution frame image data required for super-resolution processing to be obtained in response to a demand on output image quality, and allows output of a high-resolution image that is difficult to reproduce with a single read operation.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Brief description of the drawings
Fig. 1 is an exterior view of the image processing apparatus.
Fig. 2 illustrates the structure of the reading unit of the image processing apparatus.
Fig. 3 is a block diagram illustrating the structure of a controller provided in the image processing apparatus.
Fig. 4 is a block diagram showing the internal structure of a scanner image processing unit.
Fig. 5 illustrates example images obtained by a scanner unit.
Fig. 6 is a block diagram showing the internal structure of a printer image processing unit.
Fig. 7 illustrates the structure of an area sensor.
Fig. 8 illustrates an original image read by the area sensor.
Fig. 9 illustrates a method of obtaining line image data.
Fig. 10 illustrates another method of obtaining line image data.
Fig. 11 illustrates another method of obtaining line image data.
Fig. 12 illustrates another method of obtaining line image data.
Fig. 13 illustrates image data read by line sensors arranged in the area sensor.
Fig. 14A is a structural diagram showing an area sensor installed in an inclined direction.
Fig. 14B is a structural diagram showing an area sensor installed in another inclined direction.
Fig. 15 illustrates a method of obtaining line image data with a tilted area sensor.
Fig. 16 illustrates another method of obtaining line image data with a tilted area sensor.
Fig. 17 illustrates another method of obtaining line image data with a tilted area sensor.
Fig. 18 illustrates image data read by the line sensors provided in the tilted area sensor.
Fig. 19 is a flow chart schematically illustrating operations performed to set the super-resolution processing mode according to the first embodiment of the present invention.
Fig. 20 illustrates a graphical user interface (GUI) according to the first embodiment, showing an exemplary operation unit used to determine the output resolution shown in Fig. 19.
Fig. 21 shows a table, provided according to the first embodiment, giving the correspondence between the output resolution shown in Fig. 19 and the number of low-resolution images.
Fig. 22 provides a detailed illustration of super-resolution processing.
Fig. 23 provides another detailed illustration of super-resolution processing.
Fig. 24 illustrates a method of dividing the area sensor according to the first embodiment.
Fig. 25 illustrates the structure of an auto document feeder (ADF) provided in the scanner unit shown in Fig. 1 according to the second embodiment of the present invention.
Fig. 26 illustrates the structure of the scanner main body provided in the scanner unit shown in Fig. 1 according to the second embodiment.
Fig. 27 is a flow chart schematically illustrating operations of the document reading process shown in Fig. 19 performed according to the second embodiment.
Embodiment
A first embodiment of the present invention will now be described. The first embodiment describes a method of generating a high-resolution image using an area sensor, applied to an image processing apparatus including a color scanner.
Exterior of the image processing apparatus
Fig. 1 shows the exterior of the image processing apparatus 1. The image processing apparatus 1 is divided into a scanner unit 11 for reading an original image, a printer unit 12 for reproducing the read image data, and an operation unit 13 for specifying various operation settings of the image processing apparatus 1. The scanner unit 11 converts information about the original image into an electrical signal by exposing and scanning the original image and directing the light reflected from the image onto a charge-coupled device (CCD). The scanner unit 11 further converts the electrical signal into red (R), green (G), and blue (B) luminance signals and sends these luminance signals as image data to a controller 20, which will be described later with reference to Fig. 3.
Original documents are placed in a tray 14 of a document feeder 15. When the user instructs the start of reading from the operation unit 13, the controller 20 sends an original-image read instruction to the scanner unit 11. On receiving the read instruction, the scanner unit 11 feeds the original documents one by one from the tray 14 of the document feeder 15 and performs the read operation on each original image. Instead of the automatic feeding performed by the document feeder 15, the original image may be read by placing it on the surface of a contact glass (not shown) and scanning it with a moving exposure unit.
The printer unit 12 is a device that forms the image data sent from the controller 20 on a sheet. Although the image forming system used in the first embodiment is an electrophotographic system using a photosensitive drum and/or a photosensitive belt, the present invention can also be implemented with other systems, for example an ink-jet system that prints data on a sheet by ejecting ink from a fine nozzle array. The printer unit 12 is provided with a plurality of sheet cassettes 17, 18, and 19 that allow selection among different sheet sizes and/or different sheet orientations. Printed sheets are discharged to a discharge tray 16.
Structure of the reading unit of the image processing apparatus
Fig. 2 shows an example image reading unit of an MFP to which the above embodiment is applied. Fig. 2 shows a reader body 201 and an auto document feeder 202 that holds the original image 203 during reading and conveys the original to the document reading position when reading a moving original. Fig. 2 also shows the contact glass, that is, the platen 204 on which the original image 203 is placed when it is read on the platen.
Fig. 2 also shows a reading unit 205, which includes the reading device for capturing the original image 203, a light source 206 comprising a white light source such as a xenon tube, and mirrors 207, 208, 209, 210, and 211. When the original image 203 is irradiated with light emitted from the light source 206, the light reflected from the original image is transmitted to an imaging element 213 by the mirrors 207 to 211. Fig. 2 also shows a lens 212 that converges the light reflected from the original image via the mirror 211 onto the width of the imaging element 213.
Detailed description of the controller
Fig. 3 is a block diagram showing in greater detail the structure of the controller 20 provided in the image processing apparatus 1.
The controller 20 is electrically connected to the scanner unit 11 and the printer unit 12. The controller 20 is also connected to external devices and the like through a local area network (LAN) 21 and/or a wide area network (WAN) 22, so that image data and/or device information can be input and output.
A CPU 301 controls access to the various devices currently connected to the controller 20 based on a control program or the like stored in a read-only memory (ROM) 303. The CPU 301 also controls the various processes executed in the controller 20.
A random-access memory (RAM) 302 is a system working memory that allows the CPU 301 to operate, and is also a memory for temporarily storing image data. The RAM 302 includes a static RAM (SRAM), which retains the stored data after the power is turned off, and a dynamic RAM (DRAM), in which the stored data is erased after the power is turned off.
The ROM 303 stores the boot program of the image processing apparatus 1 and the like. A hard disk drive (HDD) 304 can store system software and/or image data.
An operation unit interface (I/F) 305 connects a system bus 310 to the operation unit 13. The operation unit I/F 305 receives, from the system bus 310, image data to be displayed on the operation unit 13 and sends it to the operation unit 13, and sends information entered at the operation unit 13 to the system bus 310.
A network I/F 306 connects the LAN 21 and the system bus 310 to input and output information. A modem 307 connects the WAN 22 and the system bus 310 to input and output information. A binary image rotation unit 308 changes the orientation of image data before transmission. A binary image compression/expansion unit 309 changes the resolution of image data before transmission to a predetermined resolution and/or a resolution suited to the capability of the transmission destination. For compressing and/or expanding image data, systems such as Joint Bi-level Image Experts Group (JBIG), Modified Modified Read (MMR), Modified Read (MR), and Modified Huffman (MH) are used. An image bus 330 is a transmission path for sending and/or receiving image data, and comprises a Peripheral Component Interconnect (PCI) bus and/or IEEE 1394.
A scanner image processing unit 312 corrects, processes, and edits the image data sent from the scanner unit 11 through a scanner I/F 311. The scanner image processing unit 312 judges whether the sent image data corresponds to a color document, a monochrome document, a text document, a photograph document, or the like, and attaches information about the judgment result to the image data. This attached information is referred to as attribute data. Details of the processing performed in the scanner image processing unit 312 will be described below.
A compression unit 313 receives the image data and divides it into blocks of 32 x 32 pixels. Image data of 32 x 32 pixels is referred to as tile data. A region of the document (the paper medium before being read) corresponding to this tile data is referred to as a tile image. Information about the average brightness of the 32 x 32 pixel block and the coordinate position of the tile image on the document is added to the tile data as a header. The compression unit 313 then compresses image data comprising a plurality of tile data items. An expansion unit 316 expands image data comprising a plurality of tile data items, performs raster development of the image data, and sends the resulting data to a printer image processing unit 315.
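As a hedged illustration of the tiling just described (a sketch; the field names are ours, not the patent's), each 32 x 32 block can be paired with a header holding its average brightness and its coordinates on the document:

    import numpy as np

    def make_tiles(page: np.ndarray, size: int = 32):
        # Split a grayscale page into size x size tiles, each with a header.
        tiles = []
        for y in range(0, page.shape[0], size):
            for x in range(0, page.shape[1], size):
                block = page[y:y + size, x:x + size]
                tiles.append({
                    "x": x, "y": y,                          # coordinate on the document
                    "mean_brightness": float(block.mean()),  # header information
                    "data": block,
                })
        return tiles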
The printer image processing unit 315 receives the image data sent from the expansion unit 316 and performs image processing on it with reference to the attribute data attached to the image data. The image data that has undergone the image processing is sent to the printer unit 12 through a printer I/F 314. Details of the processing performed in the printer image processing unit 315 will be described below.
An image modification unit 317 performs predetermined modification processing on image data. The image modification unit 317 includes the units described below.
An expansion unit 318 expands image data sent to it. A compression unit 319 compresses image data sent to it. A rotation unit 320 rotates image data sent to it. A scaling unit 321 performs resolution conversion processing (for example, from 600 dpi to 200 dpi) on image data sent to it. A color space conversion unit 322 converts the color space of image data sent to it; using matrices and/or tables, it can perform known background elimination processing, known logarithmic (LOG) conversion processing (RGB to CMY), known output color correction processing (CMY to CMYK), and the like. A binary-to-multilevel conversion unit 323 converts binary image data sent to it into 256-level image data. A multilevel-to-binary conversion unit 324 converts 256-level image data sent to it into binary image data using a system such as error diffusion processing.
A synthesis unit 327 combines two image data items sent to it into one image data item. When two image data items are combined, at least one of the following methods is used: a method that takes the average of the brightness values of the pixels to be combined as the combined brightness value; a method that takes the value of the pixel with the higher brightness level as the brightness value of the combined pixel; and a method that takes the value of the pixel with the lower brightness level as the brightness value of the combined pixel. It is also possible to determine the brightness value after combination by performing an OR operation, an AND operation, an exclusive-OR operation, or the like on the pixels to be combined. All of these combining methods are well known. A thinning unit 326 performs resolution conversion by thinning out pixels of the image data sent to it, generating image data of one-half, one-quarter, one-eighth, and so on, of the original. A shift unit 325 adds and/or removes margins of the image data sent to it.
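The combining methods listed above can be sketched as follows for 8-bit grayscale images of equal size; this is a simplified illustration, not the synthesis unit's actual implementation:

    import numpy as np

    def composite(a: np.ndarray, b: np.ndarray, mode: str) -> np.ndarray:
        # a, b: uint8 arrays of the same shape.
        ops = {
            "average": lambda: ((a.astype(np.uint16) + b) // 2).astype(np.uint8),
            "lighter": lambda: np.maximum(a, b),  # keep the higher brightness level
            "darker":  lambda: np.minimum(a, b),  # keep the lower brightness level
            "or":      lambda: a | b,
            "and":     lambda: a & b,
            "xor":     lambda: a ^ b,
        }
        return ops[mode]()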
A raster image processor (RIP) 328 receives intermediate data generated from page description language (PDL) code data sent from a print server (not shown) or the like, and generates bitmap data (multilevel), which is compressed by a compression unit 329.
Detailed description of the scanner image processing unit 312
Fig. 4 shows the internal structure of the scanner image processing unit 312. The scanner image processing unit 312 receives, through the scanner I/F 311, image data comprising the RGB luminance signals sent from the scanner unit 11, each signal being an 8-bit signal. A masking processing unit 401 converts these luminance signals into standard luminance signals that do not depend on the filter colors of the CCD.
A filter processing unit 403 corrects the spatial frequency of the image data sent to it as desired. For example, the filter processing unit 403 performs an arithmetic operation on the image data using a 7 x 7 pixel matrix. In a copier, the user can select a text mode, a photograph mode, or a text/photograph mode as the copy mode by operating the operation unit 13. If the user selects the text mode, the filter processing unit 403 applies a text filter to the entire image data; if the photograph mode is selected, a photograph filter is applied to the entire image data. If the text/photograph mode is selected, the filter processing unit 403 switches the filter adaptively for each pixel according to a text/photograph judgment signal (part of the attribute data) described below; that is, it judges for each pixel whether the photograph filter or the text filter should be applied. The photograph filter is given coefficients that smooth only the high-frequency components, so as to reduce the roughness of the image, whereas the text filter is given coefficients that apply strong edge enhancement, so as to sharpen characters.
A histogram generation unit 404 samples the brightness data of each pixel included in the image data sent to it. More specifically, brightness data falling within a rectangular region delimited by a start point and an end point specified in the main scanning direction and the sub-scanning direction is sampled at regular pitches along the main scanning direction and the sub-scanning direction. The histogram generation unit 404 then generates histogram data based on the sampling results. The generated histogram data is used to estimate the background level when background elimination processing is performed. An input-side gamma correction unit 405 converts the data into brightness data having nonlinear characteristics using a table or the like.
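A minimal sketch of this sampling, assuming an 8-bit luminance plane and an illustrative pitch (neither value is specified above):

    import numpy as np

    def sample_histogram(luma: np.ndarray, y0: int, x0: int,
                         y1: int, x1: int, pitch: int = 8):
        # Sample brightness at a regular pitch inside the start/end rectangle;
        # the histogram is later used to estimate the background level.
        samples = luma[y0:y1:pitch, x0:x1:pitch]
        hist, _ = np.histogram(samples, bins=256, range=(0, 256))
        return hist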
A color/monochrome judgment unit 406 judges whether each pixel included in the image data sent to it is chromatic or achromatic, and attaches the judgment result to the image data as a color/monochrome judgment signal (part of the attribute data).
A text/photograph judgment unit 407 judges, based on the value of each pixel and the values of its surrounding pixels, whether each pixel included in the image data constitutes a character, a halftone dot, a character in halftone dots, or a solid image. A pixel that constitutes none of these belongs to a white area. The judgment result is attached to the image data as a text/photograph judgment signal (part of the attribute data).
A super-resolution processing unit 402 performs super-resolution processing on the image data sent to it. In a copier, the user can select the super-resolution processing mode by operating the operation unit 13.
Although details will be described later, super-resolution processing in the super-resolution processing mode is performed under specific conditions.
First, image data of the original image must be prepared for a plurality of frames in which the positions at which the reader reads the original image in the main scanning direction and/or the sub-scanning direction are slightly offset relative to the sensor resolution.
That is, image data items corresponding to successive frames must be prepared in which the document position read by the sensor is slightly offset, in the main scanning direction and/or the sub-scanning direction, from the document position of reference image data.
Furthermore, when the image data items corresponding to the frames are read, the offset of the reading position on the original image, that is, the offset between image data items obtained by adjacent sensors, must be less than one pixel (a sub-pixel) in the main scanning direction and/or the sub-scanning direction. The offset of the reading position may be an offset of less than one pixel obtained by correcting an offset of an integral multiple of a pixel. Hereinafter, the data read when an original image containing a single picture image (frame) is scanned, constituting the original image corresponding to that single picture image (frame), is referred to as "frame image data". The position of a pixel read on the original image is referred to as its "phase".
The shifting of this phase is referred to as a "phase shift", and the offset of the read pixel positions is referred to as the "phase offset".
The low-resolution value used in the above embodiments is not limited to 300 dpi; low resolution here means the resolution of images output by the image processing apparatus 1 when performing ordinary printing.
The main scanning direction is the direction perpendicular to the direction in which the reading unit 205 moves relative to the original image placed on the platen when the scanner reads the document. As indicated by arrow A in Fig. 8, the lateral direction of the read original image is called the "main scanning direction".
Similarly, the sub-scanning direction is the direction parallel to the movement of the reading unit 205. As indicated by arrow B in Fig. 8, the longitudinal direction of the read original image is called the "sub-scanning direction".
According to the above embodiment, the area sensor is placed in an inclined direction, which makes it possible to obtain, for each RGB channel, a plurality of images whose phases are offset with respect to the main scanning direction and the sub-scanning direction.
Fig. 5 shows example images obtained in this way. The phases of the obtained images 501, 502, 503, 504, and 505 are each offset with respect to the main scanning direction and the sub-scanning direction.
Area sensor
In the above embodiments, the sensor that reads the image data is an area sensor. An area sensor is an imaging element used in digital cameras and the like. Unlike a line sensor, in which sensors are arranged per line, the pixel sensors used for reading data are arranged two-dimensionally.
Fig. 7 shows the structure of such an area sensor 701.
The area sensor 701 includes pixel sensors 702, arranged as H pixels in the long-side direction and L pixels in the short-side direction. Each pixel may be divided into four parts to realize pixel sensors for RGB color imaging. The number of pixels H may equal the number of pixels L (long side = short side). The resolution of the area sensor is determined by the distance N between the pixel sensors.
An area sensor used in a high-resolution digital camera includes a large number of pixels, that is, large numbers of pixel sensors arranged in the long-side direction and in the short-side direction. For example, some digital cameras with about ten million pixels have 3800 pixel sensors arranged in the long-side direction and 2800 pixel sensors arranged in the short-side direction.
Generally, when an area sensor is used in a camera or the like, the area sensor captures the projected image as data of a two-dimensional region.
That is, in one shot, the area sensor captures image data using the two-dimensionally arranged pixel sensors. When an area sensor is installed in a camera, the pixel sensors are arranged without inclination so that the captured image data will not be distorted in the horizontal and vertical directions.
The arrangement thus ensures that the reproduced image has no offset at all in an oblique direction when the captured image is reproduced.
For example, when an area sensor is installed in an ordinary camera, the image data read by the pixel sensors in the row indicated by black frame 703 becomes the image data constituting the head portion of the captured subject image.
The read image data is not inclined in the direction along the row.
Similarly, the image data read by the pixel sensors in the row indicated by black frame 704 is image data at a position different from that read in black frame 703: it lies, in the vertical direction, below the position of the subject image read in black frame 703. Likewise, the image data read by the pixel sensors in the row indicated by black frame 705 lies, in the vertical direction, four pixel positions below the position of the subject image read by the pixel sensors in black frame 703.
When the area sensor of a digital camera is used in this manner, the image data is captured as a two-dimensional region, and all the pixel sensors constituting the area sensor capture the subject image from different positions. However, the area sensor in the apparatus used in the above embodiments is used differently from the area sensor in such a digital camera.
First, consider the case in which the area sensor shown in Fig. 7 is installed at the reference installation position defined on the reader.
When the original image is placed at the specified position on the platen 204 shown in Fig. 1, the light source moves parallel to the original, along the direction of its longitudinal axis, and the light that is applied to the original image from the light source and reflected from the original image is converged onto the sensor. The reflected light is captured so that it reaches the sensor without distortion. As the light source performs its parallel scan, the reflected light corresponding to one line of image data is converged parallel to the lateral (long-side) direction of the sensor shown in Fig. 7.
The sensor is thus installed at a position defined so that it captures the original image with almost no inclination.
This installation position, at which the sensor realizes output of the original image, is called the "reference installation position" of the sensor.
In the following description, for simplicity, the sensor comprises pixel sensors of 20 pixels arranged in the long-side direction and 10 pixels arranged in the short-side direction. Of course, the number of pixel sensors arranged in the long-side direction may equal the number arranged in the short-side direction. These numbers are chosen to explain the use and structure of the area sensor in the above embodiments; the number of pixel sensors is not limited to the number shown in Fig. 7.
In practice, the number of pixel sensors may equal the number of pixel sensors used in a digital camera.
The reading unit 205 including the imaging element 213, that is, the area sensor installed on the reader, is driven in the direction of the arrow shown in Fig. 2 so as to read the original image 203 placed on the platen 204.
That is, the read operation is performed by treating reading line sensors 704 and 705, each of which is a group of pixel sensors, in the same way as the line sensors described above.
Next, how the image data read by the reading line sensors 704 and 705 is handled will be described. Fig. 8 shows the original image to be read in the following description.
One cell of the grid shown in Fig. 8 corresponds to the resolution of the pixel sensors included in reading line sensor 704 and/or reading line sensor 705.
While the reading unit 205 is driven and moved under the platen 204 in the sub-scanning direction, the image data items sent to the reading line sensors 704 and 705 are read sequentially.
That is, at each moment, a line-width portion of the document data corresponding to the position of the reading unit 205 is read.
The process of reading the original image will now be described. When the reading unit 205 moves under the platen 204 in the sub-scanning direction, the diagonally hatched regions of the original image shown in part (a) of Fig. 9, part (a) of Fig. 10, part (a) of Fig. 11, and part (a) of Fig. 12 are irradiated with the light emitted from the light source.
First, at a certain moment, the diagonally hatched region shown in part (a) of Fig. 9 is irradiated with the light emitted from the light source. The area sensor then detects the light, that is, the line-width portion of the original image irradiated with the light.
At that moment, line sensor 704 detects, for example, the image data item shown in part (b) of Fig. 9, while line sensor 705 detects the image data item shown in part (c) of Fig. 9.
Because the two line sensors are installed with a physical separation in the sub-scanning direction, the reading positions of the two image data items are offset from each other.
The read original image is then treated as image data items that differ according to the reading line sensor, and the image data items are stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 9, respectively.
Next, as the sensor unit 205 and the light source move, the position of the original image detected by the line sensors changes, as shown in part (a) of Fig. 10. Line sensor 704 then detects the image shown in part (b) of Fig. 10, and line sensor 705 detects the image shown in part (c) of Fig. 10.
The read original image is again treated as image data items that differ according to the reading line sensor, and these image data items are stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 10.
Similarly, when the reading position corresponds to the portion of the original image shown in part (a) of Fig. 11, the image data items shown in parts (b) and (c) of Fig. 11 are stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 11.
When the reading position corresponds to the portion of the original image shown in part (a) of Fig. 12, the image data items shown in parts (b) and (c) of Fig. 12 are stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 12.
Finally, the entire original image is irradiated with the light emitted from the light source, and each line sensor reads the data of the original image at its own position. The read image data items are stored in the memory one after another, so that a plurality of image data items are obtained that are offset from each other by one pixel in the sub-scanning direction, as shown in parts (a) and (b) of Fig. 13.
The image data items offset in the sub-scanning direction constitute image data of as many frames as there are line sensors, where each line sensor is one row of the area sensor. When pixel sensors arranged two-dimensionally are used as an area sensor for reading image data in this manner, successive frame image data items corresponding to a plurality of frames, with phases offset in the sub-scanning direction, can be obtained by one read operation.
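A minimal sketch of this behavior (array shapes are illustrative, not the sensor's actual interface): stacking the successive outputs of sensor row r reconstructs frame r, and frames from different rows are offset from each other by whole pixels in the sub-scanning direction.

    import numpy as np

    def frames_from_scan(strips, rows_used):
        # strips: list of (rows x width) area-sensor captures taken as the
        # reading unit moves; returns one full frame per sensor row used.
        return {r: np.vstack([strip[r, :] for strip in strips])
                for r in rows_used}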
Next, the use of the area sensor in the apparatus used in the above embodiments will be described. First, the area sensor shown in Fig. 7 is installed on the reader in an inclined position.
Figs. 14A and 14B each show an exemplary installation of the area sensor used in the above embodiments. Fig. 14A shows an area sensor device 1401 and pixel sensors 1402. In the following description, the pixel sensors 1402 comprise 20 pixels arranged in the long-side direction and 10 pixels arranged in the short-side direction.
The area sensor is installed so that it is inclined with respect to the reference installation position. The position of each pixel sensor in the area sensor is expressed by defining the upper-left end of the area sensor as the origin, the long-side direction as the x direction, and the short-side direction as the y direction. That is, the coordinates of the upper-left end are (x, y) = (0, 0), and the coordinates of the upper-right end are (x, y) = (19, 0).
Similarly, the coordinates of the lower-left end are (x, y) = (0, 9), and the coordinates of the lower-right end are (x, y) = (19, 9).
Black frame 1403 indicates the group of pixel sensors corresponding to one row of the area sensor device 1401; more specifically, it indicates the 20 pixel sensors arranged in the long-side direction.
That is, black frame 1403 indicates the pixel sensors at the positions corresponding to coordinates (0, 4), (1, 4), (2, 4), ..., (19, 4).
In the following description, the pixel sensors enclosed in black frame 1403 are called reading line sensor 1403. Similarly, black frame 1404 indicates the pixel sensors at the positions corresponding to coordinates (0, 5), (1, 5), (2, 5), ..., (19, 5), and the pixel sensors enclosed in black frame 1404 are called reading line sensor 1404.
In the above embodiments, the reading unit 205 including the area sensor 213 installed on the reader is driven in the direction of the arrow shown in Fig. 2 so as to read the original image placed on the platen 204.
That is, the read operation is performed by treating reading line sensors 1403 and 1404, each of which is a group of pixel sensors, as line sensors.
Next, how the image data read by the reading line sensors 1403 and 1404 is handled will be described. Fig. 8 shows the original image to be read, which corresponds to the original image 203 shown in Fig. 2.
The grid shown in Fig. 8 corresponds to the resolution of the pixel sensors included in reading line sensors 1403 and 1404. The original image is read as described above with reference to Figs. 9 to 13; however, because the area sensor is inclined at an angle of θ degrees, image data inclined at θ degrees is obtained.
If the area sensor were not inclined, the image at the position indicated by the diagonal hatching in part (a) of Fig. 15, for example, would be read. However, because the area sensor is inclined, line sensors 1403 and 1404 detect the image data items shown in parts (b) and (c) of Fig. 15.
These image data items are then stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 15, with their inclination unchanged. Similarly, as the sensor unit 205 and the light source move, the image at the position indicated by the diagonal hatching in part (a) of Fig. 16 is read. At this time, line sensors 1403 and 1404 detect the image data items shown in parts (b) and (c) of Fig. 16.
These image data items are stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 16.
Further movement of the reading unit and the light source in the sub-scanning direction causes the image at the position indicated by the diagonal hatching in part (a) of Fig. 17 to be read. At this time, line sensors 1403 and 1404 obtain the image data items shown in parts (b) and (c) of Fig. 17.
These image data items are stored in storage media such as memories, as shown in parts (d) and (e) of Fig. 17.
Finally, the image data items detected and read by line sensors 1403 and 1404 are the data items shown in parts (a) and (b) of Fig. 18; each data item is read as image data inclined at an angle of θ degrees. Here, the directions indicated by arrows (A) and (B) in Fig. 18 are called the main scanning direction and the sub-scanning direction, respectively, the direction indicated by arrow (C) is called the lateral direction of the read image data, and the direction indicated by arrow (D) is called the longitudinal direction of the read image data.
As shown in Fig. 14A, reading line sensor 1403 and reading line sensor 1404 are physically offset from each other by one pixel sensor in the short-side direction. Therefore, the phases of the pixel sensors included in reading line sensor 1403 and those included in reading line sensor 1404 are offset in the long-side direction.
For example, on the y axis, which is the short-side direction, the pixel sensor of reading line sensor 1403 at the position corresponding to coordinates (x, y) = (15, 4) is offset by y = 1 from the pixel sensor of reading line sensor 1404 at the position corresponding to coordinates (x, y) = (15, 5). This offset causes the pixel sensors of reading line sensors 1403 and 1404 to be offset from each other by Δβ in the sub-scanning direction.
On the other hand, on the x axis, which is the long-side direction, the pixel sensor of reading line sensor 1403 is at position x = 15, the same as the corresponding pixel sensor of reading line sensor 1404. Nevertheless, because of the inclination angle θ, the phases of these pixel sensors are offset from each other by a very small amount Δα, falling within a sub-pixel, in the horizontal direction defined at the reference installation position. That is, even though the pixel sensor of reading line sensor 1403 is at the same x-axis position as the pixel sensor of reading line sensor 1404, a phase offset of a sub-pixel amount occurs when the area sensor is inclined. This phase offset depends on the inclination angle.
Therefore, the phase offset of the image data read by each reading line sensor defined in the area sensor 213 differs from one reading line sensor to another.
More specifically, the phase of the image data shown in part (a) of Fig. 18 and the phase of the image data shown in part (b) of Fig. 18 are offset not only by Δβ in the sub-scanning direction but also by Δα in the main scanning direction.
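As a hedged geometric reading of the above (the patent does not state these formulas explicitly), if p is the pixel pitch and θ the inclination angle, the offsets between corresponding pixels of adjacent reading line sensors, projected onto the document axes, are approximately

    Δβ = p·cosθ (sub-scanning direction),    Δα = p·sinθ (main scanning direction),

so for a small inclination angle Δα is smaller than one pixel, which is exactly the sub-pixel phase offset in the main scanning direction that super-resolution processing requires.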
Although the number of reading line sensors (reading line sensors 1403 and 1404) is two in the above embodiments, a different number of reading line sensors may be used; the invention is not limited to the above embodiments.
The number of reading line sensors can be increased by increasing the number of pixel sensors included in the area sensor 213; as many reading line sensors can be provided as there are rows of pixels in the area sensor 213.
The number of reading line sensors equals the number of image data items obtained by one read operation. That is, if reading line sensors corresponding to 30 rows are provided in the area sensor 213, 30 read images, each with an appropriate phase offset, can be obtained by one read operation.
Inclining the area sensor thus makes it possible to read a plurality of frame image data items at a time, in which the positions of the original image read by pixel sensors adjacent in the long-side direction of the scan are offset from each other by less than one pixel.
The area sensor 213 may also be installed as shown in Fig. 14B, where the long-side direction coincides with the horizontal direction determined at the reference installation position, but the area sensor 213 is inclined with respect to the reference installation position in the short-side direction.
In this case as well, frame image data items can be obtained by one scan of the original image, just as in the case shown in Fig. 14A: the positions of the original image read by pixel sensors adjacent in the short-side direction are offset from each other by less than one pixel in the main scanning direction and/or the sub-scanning direction.
That is, any area sensor comprising a plurality of sensors can be used, as long as frame image data is obtained by scanning positions that move parallel to the original image and the positions of the original image read by sensors adjacent in the short-side direction are offset from each other by less than one pixel in the main scanning direction and/or the sub-scanning direction.
The inclination angle θ shown in Fig. 14A and the inclination angle θ' shown in Fig. 14B may be any angles, as long as frame image data items can be obtained by one scan of the original image in which the positions of the original image read by sensors adjacent in the short-side direction are offset from each other by less than one pixel in the main scanning direction and/or the sub-scanning direction.
In addition, the number of frame image data items obtained by the sensor in the short-side direction can be increased by increasing the number of times the original image is read in the sub-scanning direction and the number of samples per unit time.
Detailed description of the printer image processing unit 315
Fig. 6 shows the internal structure of the printer image processing unit 315. A background elimination processing unit 601 eliminates (removes) the background color of the image data based on the histogram data generated by the scanner image processing unit 312.
A monochrome generation unit 602 converts color data into monochrome data. A logarithmic conversion unit 603 performs brightness-to-density conversion; for example, it converts RGB input image data into CMY image data.
An output color correction unit 604 performs output color correction; for example, it converts CMY input image data into CMYK image data using tables and/or matrices.
An output-side gamma correction unit 605 performs correction so that the signal value data sent to it is proportional to the reflection density values obtained after copying, and outputs the data.
A halftone correction unit 606 performs halftone processing according to the number of gray levels of the printer unit 12 that outputs the data; for example, it converts high-gray-level image data sent to it into 2-level and/or 32-level image data.
Each processing unit provided in the scanner image processing unit 312 and/or the printer image processing unit 315 can also output the image data sent to it without processing it. Passing data through a particular processing unit without processing is expressed as "making the data pass through the processing unit".
Super-resolution processing settings
Hereinafter, the super-resolution processing settings made in the above embodiment will be described in detail. The area sensor shown in Fig. 14A is used in the above embodiment. In addition, the following image processing apparatus is used: when reading an original image with the area sensor, the apparatus can obtain 50 frames of low-resolution frame image data of about 100 dpi in a single scanning operation. Furthermore, through high-resolution processing, the apparatus can generate a high-resolution image of 200 dpi by using low-resolution frame image data corresponding to 4 frames.
Similarly, the apparatus can generate a high-resolution image of 300 dpi by using 10 frames of low-resolution frame image data, a 600 dpi image by using 40 frames, a 1200 dpi image by using 100 frames, and a 2400 dpi image by using 400 frames.
Here, the number of low-resolution frame image data items required to obtain a desired resolution is not limited to the above frame counts. For example, an image processing apparatus might generate a 1000 dpi high-resolution image from 50 frames of low-resolution frame image data, or a 500 dpi image from 50 frames. The required frame count depends on the capability of the image processing apparatus.
In addition, not all of the obtained low-resolution frame image data can necessarily be used. For example, when the pixel positions read by adjacent sensors for some frame image data are mutually offset by one pixel or more, that is, when the phase shift amounts to one pixel or more, it is difficult to use that frame image data for super-resolution processing. Such low-resolution frame image data is not counted toward the number of obtained frames.
Fig. 19 illustrates the operation for setting the super-resolution processing mode. As described above, the control program for realizing the process shown in Fig. 19 is stored in the ROM 303 and executed by the CPU 301.
First, in step S1901, an instruction to obtain SICNT is sent from the user via the user interface (UI); SICNT is the maximum number of low-resolution frame image data items that can be obtained in a single scanning operation.
Upon receiving this instruction, the UI sends it to the CPU 301 via the operating unit I/F 305, and the CPU 301 obtains the SICNT data and stores the data of the maximum number of frames of frame image data. This maximum frame number is determined based on the number of rows arranged in the short-side direction of the sensor, which will be described below.
In addition, the number of frame image data items obtained in the short-side direction of the sensor can be increased by increasing the number of times the original image is read in the sub-scanning direction and the number of data samples taken per unit time. Since 50 frames of low-resolution frame image data of about 100 dpi can be obtained in the above embodiment, the value of the SICNT data is determined to be 50.
Then, in step S1902, data of the output resolution determined based on the SICNT data obtained in step S1901 is acquired.
Fig. 20 is a schematic diagram illustrating an exemplary operating unit used for determining the output resolution. The user can select the output resolution on the output resolution selection menu 2001.
Then, in step S1903, the DICNT data is calculated; DICNT indicates the number of low-resolution frame image data frames required for the output resolution acquired in step S1902. That is, the data of the required number of frames of image data is obtained.
In the above embodiment, a table storing information about the correspondence between output resolutions and the numbers of low-resolution frame image data frames to be used is provided in the ROM 303.
Fig. 21 illustrates the table showing the correspondence between output resolutions and the numbers of low-resolution frame image data frames to be used.
Then, in step S1904, the number of scanning operations N is calculated according to the following expression:
N = DICNT / SICNT
For example, when the output resolution is 2400 dpi, DICNT = 400 holds. Therefore, the number of scanning operations N is calculated as:
N = 400 / 50 = 8
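Using the figures of this embodiment (SICNT = 50 and the frame counts listed earlier, which the table of Fig. 21 would hold), steps S1902 to S1904 can be sketched as below. Rounding up when DICNT does not divide evenly by SICNT is our assumption; the patent's example, 400/50, divides exactly.

```python
import math

# Output resolution (dpi) -> required low-resolution frames (DICNT),
# mirroring the correspondence table held in ROM 303 in this embodiment.
FRAMES_REQUIRED = {200: 4, 300: 10, 600: 40, 1200: 100, 2400: 400}

SICNT = 50  # max low-resolution frames obtainable in one scanning operation

def scan_count(output_dpi: int) -> int:
    """Step S1904: N = DICNT / SICNT, rounded up for non-exact divisions."""
    return math.ceil(FRAMES_REQUIRED[output_dpi] / SICNT)

print(scan_count(2400))  # 8, matching N = 400 / 50
print(scan_count(300))   # 1: a single scanning operation already suffices
```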
Then, in step S1905, the value of the scan counter CNT is reset to 0, and in step S1906 the value of the scan counter CNT is compared with the number of scanning operations N.
If the value of the scan counter CNT is determined to be less than the number of scanning operations N ("Yes" in step S1906), the original data is read in step S1907, the value of the scan counter CNT is incremented by 1 in step S1908, and the process returns to step S1906. Thus, the number of readings is controlled based on the calculated result.
On the other hand, if the value of the scan counter CNT is determined to be not less than the number of scanning operations N ("No" in step S1906), reading of the original data is determined to be complete. Then, the inclination of the obtained frame image data is corrected in step S1909, super-resolution processing is performed in step S1910, and the process ends.
The inclination of the obtained frame image data is corrected as follows. As described above, the inclination of the obtained frame image data equals the tilt angle θ of the area sensor.
The tilt angle θ of the area sensor 213 is a value that can be obtained when the area sensor 213 is installed on the reading unit 205 in the process of assembling an MFP including the area sensor 213.
The tilt angle θ is stored in a storage area provided in the MFP as data representing a value intrinsic to the installed device.
An affine transformation is performed using this angle information so as to rotate the obtained, inclined frame image data. At this time, the frame image data is rotated by its gradient relative to the reference installation position, thereby correcting its inclination. If the coordinates of the frame image data before and after the affine transformation are denoted (X, Y) and (X', Y') respectively, and the rotation angle (the tilt angle of the area sensor in the above embodiment) is denoted θ, then frame image data whose inclination has been corrected is obtained by the affine transformation shown in Equation 1.
Equation 1

$$\begin{bmatrix} X' & Y' \end{bmatrix} = \begin{bmatrix} X & Y \end{bmatrix} \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}$$

X', Y': the coordinate position obtained after the transformation
X, Y: the coordinate position obtained before the transformation
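A minimal sketch of the coordinate rotation of Equation 1 follows (pure coordinate arithmetic; resampling the rotated pixels back onto a regular grid is omitted):

```python
import math

def rotate_coord(x: float, y: float, theta_rad: float):
    """Evaluates Equation 1: [X', Y'] = [X, Y][[cos t, sin t], [-sin t, cos t]],
    i.e. X' = X cos t - Y sin t and Y' = X sin t + Y cos t."""
    c, s = math.cos(theta_rad), math.sin(theta_rad)
    return (x * c - y * s, x * s + y * c)

# A point one pixel along the x axis, rotated by a 2-degree sensor tilt,
# maps to roughly (0.999, 0.035).
print(rotate_coord(1.0, 0.0, math.radians(2.0)))
```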
The frame image data obtained by the affine transformation becomes low-resolution frame image data whose inclination has been corrected.
Here, the method used for correcting the inclination is not limited to the affine transformation; any method may be used as long as it can correct the inclination of the frame image data. As shown in Fig. 14B, if the sensor can obtain frame image data without inclination — that is, if the positions read by sensors adjacent in the short-side direction are mutually offset by less than one pixel in the main scanning direction and/or the sub-scanning direction — the above processing is unnecessary.
Then, super-resolution processing is performed in step S1910 by using the tilt-corrected frame image data corresponding to a plurality of frames, whereupon the process ends. Here, steps S1909 and S1910 have no particular order: super-resolution processing may be performed first, and the inclination of the high-resolution image data obtained by the super-resolution processing may be corrected afterwards.
As shown in Fig. 5, super-resolution processing is performed for each RGB channel so that high-resolution image data is generated by using a plurality of frame image data items whose phases are mutually offset by less than one pixel in the main scanning direction and the sub-scanning direction.
For example, the frame image data items 501, 502, 503, and 504 shown in Fig. 5 are mutually offset by half a pixel. Therefore, by using these four image data items, an image whose pixel density is four times that of the original image data 505 can be obtained.
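To illustrate the idea, the sketch below (our construction, assuming ideal half-pixel offsets rather than the arbitrary sub-pixel shifts handled by the actual processing) interleaves four such frames into a grid with twice the rows and twice the columns, i.e. four times the pixel density:

```python
def interleave_quad(f00, f10, f01, f11):
    """Combine four equally sized frames whose sampling grids are offset by
    half a pixel: f00 at (0, 0), f10 at (+1/2, 0), f01 at (0, +1/2), and
    f11 at (+1/2, +1/2). Frames are lists of rows of pixel values."""
    out = []
    for r00, r10, r01, r11 in zip(f00, f10, f01, f11):
        top, bottom = [], []
        for a, b, c, d in zip(r00, r10, r01, r11):
            top += [a, b]      # samples on the original scan line
            bottom += [c, d]   # samples half a pixel below it
        out += [top, bottom]
    return out

print(interleave_quad([[1, 2]], [[3, 4]], [[5, 6]], [[7, 8]]))
# [[1, 3, 2, 4], [5, 7, 6, 8]]
```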
The high-resolution processing performed at this point will be described more specifically with reference to Figs. 22 and 23. Fig. 22 illustrates the low-resolution frame image data used for the high-resolution processing and the image data that has undergone the high-resolution processing: it shows the original, together with the reference low-resolution image data F0 and the target low-resolution image data items F1 to F3 obtained by reading the original with the area sensor. The dashed rectangle around the original represents the region from which the area sensor reads the reference low-resolution image data F0, and the solid rectangles represent the regions from which the area sensor reads the target low-resolution image data items F1 to F3.
In the above embodiment, the offset in the main scanning direction is expressed as "um" and the offset in the sub-scanning direction as "vm". In addition, the offsets of a target low-resolution image data item Fn (n = 1 to 3) are expressed as "umn" and "vmn". For example, as shown in Fig. 22, the target low-resolution image data item F1 deviates from the reference low-resolution image data F0 in the sub-scanning direction, and its offsets are expressed as um1 and vm1. Similarly, the offsets of the target low-resolution image data items F2 and F3 are expressed as um2, vm2 and um3, vm3, respectively.
The correction amounts un, vn of a target low-resolution image data item Fn are calculated based on the image data of the reference low-resolution image data F0 and the image data of the target low-resolution image data items F1 to F3. The calculation is performed according to a predetermined calculation method, stored in advance in the ROM 303, that uses the information about the inclination of the area sensor.
Fig. 22 schematically illustrates the offset of each target low-resolution image data item in units of whole pixels. However, when data is read by the area sensor of the above embodiment, phase shifts of less than one pixel occur in the main scanning direction and the sub-scanning direction. A high-resolution image can be made by exploiting these small shifts.
Therefore, among the pixels included in the generated super-resolution image, there are pixels that are included in neither the reference low-resolution image data nor the target low-resolution image data.
For such pixels, a predetermined interpolation process is performed during synthesis by using pixel data representing the values of the pixels arranged around the pixel to be generated, thereby generating the high-resolution image. As the interpolation process, the bilinear interpolation method, the bicubic method, the nearest neighbor method, or the like can be used.
For example, Fig. 23 illustrates the case where interpolation is performed according to the bilinear interpolation method. First, the data of the nearest pixel 1802, closest to the position of the pixel 1801 to be generated, is extracted from the reference low-resolution image data and the target low-resolution image data. Then, the four pixels surrounding the generated pixel position are determined as the surrounding pixels 1802, 1803, 1804, and 1805. The data value of the generated pixel is obtained by averaging values obtained by assigning predetermined weights to the data values of the surrounding pixels, as shown in the following equation:
f(x, y) = [|x1 - x|{|y1 - y|f(x0, y0) + |y - y0|f(x0, y1)} + |x - x0|{|y1 - y|f(x1, y0) + |y - y0|f(x1, y1)}] / (|x1 - x0||y1 - y0|)
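A small sketch of this weighted average (hypothetical helper; it evaluates the equation above for a single generated pixel):

```python
def bilinear(x, y, x0, y0, x1, y1, f00, f01, f10, f11):
    """Weighted average of the four surrounding pixels, where
    f00 = f(x0, y0), f01 = f(x0, y1), f10 = f(x1, y0), f11 = f(x1, y1)."""
    num = (abs(x1 - x) * (abs(y1 - y) * f00 + abs(y - y0) * f01)
           + abs(x - x0) * (abs(y1 - y) * f10 + abs(y - y0) * f11))
    return num / (abs(x1 - x0) * abs(y1 - y0))

# A pixel generated at the exact center of the unit square is the plain
# average of its four neighbors.
print(bilinear(0.5, 0.5, 0, 0, 1, 1, 10, 20, 30, 40))  # 25.0
```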
As shown in Fig. 23, this process is performed for each generated pixel position, thereby obtaining a super-resolution image whose resolution is twice that of the original image. Here, the resolution need not be twice that of the original image; a different magnification may be used. In addition, since the values of the low-resolution image data items are used in the interpolation process, the sharpness of the super-resolution image obtained by the interpolation is improved.
Original reading process
The original reading process shown in step S1907 of Fig. 19 will now be described specifically. In the above embodiment, the original is read by the area sensor.
Fig. 24 illustrates an exemplary area sensor provided in the scanner unit 11 used in the above embodiment. In this case, as indicated by reference numeral 2401, the area sensor is divided into 50 parts in the sub-scanning direction. In addition, up to 50 frames of frame image data with a resolution of 100 dpi can be obtained in units of bands.
Control is performed so that data is read by treating each of these bands as one line sensor. Moreover, the number of parts into which the area is divided — that is, how many line sensors are prepared — can be determined arbitrarily. In the above embodiment, the installed area sensor is divided into 50 parts and used as 50 line sensors.
Therefore, if the value of the number of scanning operations calculated in step S1904 of Fig. 19 is at least two, the band data items 0 to 49 obtained by the first scanning operation are stored as frame image data items ID0 to ID49, respectively.
Then, the band data items 50 to 99 obtained by the second scanning operation are stored as frame image data items ID50 to ID99, respectively. The frame image data of each scanning operation is thus managed based on the frame image data IDs, which makes it possible to obtain the low-resolution frame image data needed to reproduce the finally specified output resolution.
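A sketch of this ID bookkeeping (our illustration; the patent describes only the numbering scheme itself):

```python
def frame_id(scan_index: int, band_index: int, bands_per_scan: int = 50) -> int:
    """Band b of scanning operation s is stored under ID s * 50 + b, so the
    first scan yields IDs 0-49 and the second scan yields IDs 50-99."""
    return scan_index * bands_per_scan + band_index

print(frame_id(0, 0), frame_id(0, 49), frame_id(1, 0), frame_id(1, 49))
# 0 49 50 99
```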
In addition, in the above embodiment, scanning operations are performed successively for one page of the original image. At this time, the original image is fixed on the platen and the light source is moved, whereby the original image is read. This configuration is defined as the first area sensing system.
When at least two scanning operations are performed using the first area sensing system, the original image placed on the platen must not be moved arbitrarily. Even though the position of the original image is not changed, the signals obtained by the first and second scanning operations differ from each other owing to the optical characteristics of the light source and the precision of the control performed by the device during scanning.
If the signal difference falls within a sub-pixel, the corresponding data can be adopted as low-resolution frame image data and used for subsequent processing. Therefore, when such scanning operations are performed, an alert message instructing the user not to move the original may be displayed on the UI. In addition, the number of remaining scanning operations may be displayed on the UI.
In addition, an original image is often placed in an auto document feeder (hereinafter referred to as "ADF") to scan a set of originals comprising multiple pages.
At this time, the light source is fixed and the original image is moved, whereby the original image is read. This configuration is defined as the second area sensing system.
In this case, after a single scanning operation has been performed for a set of originals, the user can be prompted to place the set of originals in the ADF again. In this case, the number of remaining scanning operations can be displayed on the UI.
When the first reading operation is performed on the platen by the first area sensing system and the second reading operation is performed using the ADF by the second area sensing system in the above manner, the values of the signals obtained in the two reading operations differ from each other owing to the precision with which the device controls the data-reading position and the optical characteristics of the light source.
If the signal difference falls within a sub-pixel, the corresponding data can be adopted as low-resolution frame image data and used for subsequent processing.
In addition, if the user stops the scanning operations partway through, high-resolution image data can be generated by using the plurality of low-resolution frame image data items obtained by the scanning operations performed up to that point.
Through the above processing, the number of scanning operations is determined according to the set output resolution. Therefore, high-resolution image data that is difficult to reproduce with a single scanning operation can be output.
In addition, the term "output" above means converting the scanned image into a high-resolution image and actually printing the image data on a sheet. The term "output" also covers storing the image data that has undergone the high-resolution conversion in the image processing apparatus without printing it on a sheet.
According to the first embodiment, the number of scanning operations is determined based on the set output resolution, and that number of scanning operations is performed so as to output high-resolution image data.
The second embodiment of the present invention describes the case where an image processing apparatus equipped with an ADF performs a plurality of scanning operations. In this case, the scan performed while reading the moving original fed from the ADF serves as the first scanning operation. In addition, the same reference numerals denote the same processes as those performed in the first embodiment, and redundant description thereof is omitted.
Fig. 25 illustrates the ADF 2501 as a part of the scanner unit 11 illustrated in Fig. 1, and Fig. 26 illustrates the structure of the scanner main body 2601. The scanner unit 11 is provided with a platen reading mode, in which the original image is placed on the platen glass and the optical system is moved to read the original image, and a moving-original reading mode, in which the optical system is stopped and the original image is moved to read the original image.
Fig. 27 schematically shows an example of the operations that can be performed during the original image reading of step S1907 shown in Fig. 19 according to the second embodiment, in order to carry out moving-original reading scanning and platen reading scanning. As described above, the control program for realizing the process shown in Fig. 27 is stored in the ROM 303 and executed by the CPU 301.
First, in step S2701, the number of scanning operations N calculated in step S1904 is obtained.
Then, in step S2702, it is determined whether the value of the number of scanning operations N obtained in step S2701 is at least two.
If the value of N is determined to be at least two ("Yes" in step S2702), processing continues at step S2703; otherwise ("No" in step S2702), processing continues at step S2705. In step S2703, the first scanning operation is set to the moving-original reading mode. The ADF 2501 then conveys the original image placed in it onto the contact glass of the scanner main body 2601 with conveying rollers.
In this case, in step S2704, the original image is read as a plurality of low-resolution image data items by the optical scanning of the scanner main body 2601 in the moving-original reading mode.
After this, the original image is conveyed onto the contact glass of the scanner main body 2601. Then, in step S2705, the remaining scanning operations are determined to be performed in the platen reading mode.
Then, the first mirror unit 2602 and the second mirror unit 2603 arranged in the scanner main body 2601 are temporarily returned to the home position where the home position sensor 2604 is provided.
Then, the original illuminating lamp 2605 is turned on and the original image is irradiated with its light. The light reflected from the original image passes via the first mirror 2606 provided in the first mirror unit 2602 and the second mirror 2607 and third mirror 2608 provided in the second mirror unit 2603 through the lens 2609, thereby forming an image on the area sensor 2610. After this, the data of the image is sent to the area sensor 2610 as an optical signal, and the original image is read in step S2706, whereupon the process ends.
Therefore, even if the user does not move the original image to a different position, the original image is automatically conveyed by the ADF, after the first original reading, to the position defined for the platen reading mode, which saves the user time and trouble.
On the other hand, if the value of the number of scanning operations N is determined to be one in step S2702, the original is read in the platen reading mode without performing the processes corresponding to steps S2703 and S2704.
Thus, when a plurality of scanning operations are performed based on the set output resolution, the scan performed while reading the moving original fed from the ADF serves as the first scanning operation.
After this, platen reading scanning is performed, which makes it possible to output high-resolution image data that is difficult to reproduce with a single scanning operation. In particular, if two scanning operations are performed as in the second embodiment, the reading process can be performed at higher speed than in a system that performs platen reading scanning twice.
In addition, the present invention can be applied to a system comprising a plurality of units (for example, a computer, an interface device, a reader, a printer, and so forth) and/or to an apparatus comprising a single unit (an image processing apparatus, a printer, a facsimile machine, and so forth).
The present invention can also be achieved by a computer (or a central processing unit (CPU) and/or a micro processing unit (MPU)) reading and executing program code from a storage medium storing program code that implements the steps of the flowcharts illustrated in the above embodiments. In this case, the program code read from the storage medium itself realizes the functions of the above embodiments. The program code and the storage medium storing the program code constitute embodiments of the present invention.
The storage medium for providing the program code may be, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a compact disk read-only memory (CD-ROM), a recordable compact disk (CD-R), a magnetic tape, a nonvolatile memory card, a ROM, or the like.
Moreover, the functions of the above embodiments can be realized not only by a computer reading and executing the program code, but also by an operating system (OS) or the like running on the computer performing part or all of the actual processing based on instructions of the program code. The latter is also one of the embodiments of the present invention.
In another embodiment of the present invention, the program code read from the storage medium may be written into a memory provided on an expansion board inserted into the computer and/or in a function expansion unit connected to the computer. In that case, the functions of the above embodiments are realized by a CPU or the like of the expansion board or function expansion unit performing part or all of the actual processing based on instructions of the program code.
Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (26)

1. An image processing apparatus comprising:
an area sensor unit configured to read image data items corresponding to a plurality of frames from an original image, the image data items having at least one offset of less than one pixel, the offset being an offset of the position at which the original image is read;
a correction unit configured to correct the inclination of each image data item obtained by the area sensor unit, the inclination being the tilt angle of the area sensor of the area sensor unit;
a high-resolution conversion unit configured to perform interpolation processing using the corrected image data items to obtain image data with a resolution higher than the resolution used during data reading;
a maximum frame number storage unit configured to store data of the maximum number of frames of the image data items obtained by the area sensor unit;
a resolution setting unit configured to set a resolution for outputting the original image;
a required frame number acquisition unit configured to obtain, based on the result set by the resolution setting unit, the number of frames of image data items required to perform the high-resolution conversion;
a reading frequency calculation unit configured to calculate the number of times the original image is to be read, using the required frame number obtained by the required frame number acquisition unit and the maximum frame number stored in the maximum frame number storage unit; and
a reading frequency control unit configured to read the original image the number of times determined by the reading frequency calculation unit.
2. The image processing apparatus according to claim 1, wherein, when the reading frequency calculation unit determines that the original image needs to be read a plurality of times, the number of remaining readings determined to be necessary is displayed on a user interface.
3. The image processing apparatus according to claim 1, wherein, when reading at the calculated number of times is stopped, the high-resolution conversion is performed by using the image data read before the reading was stopped.
4. The image processing apparatus according to claim 1, wherein the area sensor unit comprises:
a first reading unit that reads the original image by fixing the original image on a platen and moving a light source;
a second reading unit that reads the original image by fixing the light source and moving the original image; and
a reading control unit configured to perform control such that, when the calculated number of readings is two or more, the original image is read by using the first reading unit and the second reading unit, and when the calculated number of readings is less than two, the original image is read by using the first reading unit or the second reading unit.
5. The image processing apparatus according to claim 2, wherein the area sensor unit comprises:
a first reading unit that reads the original image by fixing the original image on a platen and moving a light source;
a second reading unit that reads the original image by fixing the light source and moving the original image; and
a reading control unit configured to perform control such that, when the calculated number of readings is two or more, the original image is read by using the first reading unit and the second reading unit, and when the calculated number of readings is less than two, the original image is read by using the first reading unit or the second reading unit.
6. The image processing apparatus according to claim 3, wherein the area sensor unit comprises:
a first reading unit that reads the original image by fixing the original image on a platen and moving a light source;
a second reading unit that reads the original image by fixing the light source and moving the original image; and
a reading control unit configured to perform control such that, when the calculated number of readings is two or more, the original image is read by using the first reading unit and the second reading unit, and when the calculated number of readings is less than two, the original image is read by using the first reading unit or the second reading unit.
7. The image processing apparatus according to claim 1, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
8. The image processing apparatus according to claim 2, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
9. The image processing apparatus according to claim 3, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
10. The image processing apparatus according to claim 4, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
11. The image processing apparatus according to claim 5, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
12. The image processing apparatus according to claim 6, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
13. An image processing method for an image processing apparatus, the image processing apparatus comprising:
an area sensor unit configured to read image data items corresponding to a plurality of frames from an original image, the image data items having at least one offset of less than one pixel, the offset being an offset of the position at which the original image is read;
a correction unit configured to correct the inclination of each image data item obtained by the area sensor unit, the inclination being the tilt angle of the area sensor of the area sensor unit; and
a high-resolution conversion unit configured to perform interpolation processing using the corrected image data items to obtain image data with a resolution higher than the resolution used during data reading,
the image processing method comprising:
a maximum frame number storing step of storing data of the maximum number of frames of the image data items obtained by the area sensor unit;
a resolution setting step of setting a resolution for outputting the original image;
a required frame number acquiring step of obtaining, based on the result set in the resolution setting step, the number of frames of image data items required to perform the high-resolution conversion;
a reading frequency calculating step of calculating the number of times the original image is to be read, using the required frame number obtained in the required frame number acquiring step and the maximum frame number stored in the maximum frame number storing step; and
a reading frequency controlling step of reading the original image the number of times determined in the reading frequency calculating step.
14. The image processing method according to claim 13, wherein, when it is determined in the reading frequency calculating step that the original image needs to be read a plurality of times, the number of remaining readings determined to be necessary is displayed on a user interface.
15. The image processing method according to claim 13, wherein, when reading at the calculated number of times is stopped, the high-resolution conversion is performed by using the image data read before the reading was stopped.
16. The image processing method according to claim 13, wherein the area sensor unit comprises:
a first reading unit that reads the original image by fixing the original image on a platen and moving a light source;
a second reading unit that reads the original image by fixing the light source and moving the original image; and
a reading control unit configured to perform control such that, when the calculated number of readings is two or more, the original image is read by using the first reading unit and the second reading unit, and when the calculated number of readings is less than two, the original image is read by using the first reading unit or the second reading unit.
17. The image processing method according to claim 14, wherein the area sensor unit comprises:
a first reading unit that reads the original image by fixing the original image on a platen and moving a light source;
a second reading unit that reads the original image by fixing the light source and moving the original image; and
a reading control unit configured to perform control such that, when the calculated number of readings is two or more, the original image is read by using the first reading unit and the second reading unit, and when the calculated number of readings is less than two, the original image is read by using the first reading unit or the second reading unit.
18. The image processing method according to claim 15, wherein the area sensor unit comprises:
a first reading unit that reads the original image by fixing the original image on a platen and moving a light source;
a second reading unit that reads the original image by fixing the light source and moving the original image; and
a reading control unit configured to perform control such that, when the calculated number of readings is two or more, the original image is read by using the first reading unit and the second reading unit, and when the calculated number of readings is less than two, the original image is read by using the first reading unit or the second reading unit.
19. The image processing method according to claim 13, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
20. The image processing method according to claim 14, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
21. The image processing method according to claim 15, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
22. The image processing method according to claim 16, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
23. The image processing method according to claim 17, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
24. The image processing method according to claim 18, wherein the reading area of the area sensor unit is divided into an arbitrary number of regions, and the original image is read for each region.
25. An image processing apparatus comprising:
an area sensor unit comprising a plurality of sensors, the plurality of sensors including a first sensor and a second sensor adjacent to the first sensor, wherein the first sensor and the second sensor are arranged such that a first reading position at which the first sensor reads an original image and a second reading position at which the second sensor reads the original image are offset from each other by less than one pixel;
a high-resolution conversion unit configured to perform interpolation processing by using image data items corresponding to a plurality of frames read by the area sensor unit, to obtain image data with a resolution higher than the resolution used during reading;
a maximum frame number storage unit configured to store data of the maximum number of frames of the image data items obtained by the area sensor unit;
a resolution setting unit configured to set a resolution for outputting the original image;
a required frame number acquisition unit configured to obtain, based on the result set by the resolution setting unit, the number of frames of image data items required to perform the high-resolution conversion;
a reading frequency calculation unit configured to calculate the number of times the original image is to be read, using the required frame number obtained by the required frame number acquisition unit and the maximum frame number stored in the maximum frame number storage unit; and
a reading frequency control unit configured to read the original image the number of times determined by the reading frequency calculation unit.
26. An image processing method for an image processing apparatus, the image processing apparatus comprising:
an area sensor unit comprising a plurality of sensors, the plurality of sensors including a first sensor and a second sensor adjacent to the first sensor, wherein the first sensor and the second sensor are arranged such that a first reading position at which the first sensor reads an original image and a second reading position at which the second sensor reads the original image are offset from each other by less than one pixel; and
a high-resolution conversion unit configured to perform interpolation processing by using image data items corresponding to a plurality of frames read by the area sensor unit, to obtain image data with a resolution higher than the resolution used during reading,
the image processing method comprising:
a maximum frame number storing step of storing data of the maximum number of frames of the image data items obtained by the area sensor unit;
a resolution setting step of setting a resolution for outputting the original image;
a required frame number acquiring step of obtaining, based on the result set in the resolution setting step, the number of frames of image data items required to perform the high-resolution conversion;
a reading frequency calculating step of calculating the number of times the original image is to be read, using the required frame number obtained in the required frame number acquiring step and the maximum frame number stored in the maximum frame number storing step; and
a reading frequency controlling step of reading the original image the number of times determined in the reading frequency calculating step.
CN2008101864570A 2007-12-21 2008-12-19 Image processing apparatus, image processing method Expired - Fee Related CN101465939B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2007330978 2007-12-21
JP2007330978 2007-12-21
JP2007-330978 2007-12-21
JP2008-318563 2008-12-15
JP2008318563A JP2009171563A (en) 2007-12-21 2008-12-15 Image processor, image processing method,program for executing image processing method, and storage medium
JP2008318563 2008-12-15

Publications (2)

Publication Number Publication Date
CN101465939A CN101465939A (en) 2009-06-24
CN101465939B true CN101465939B (en) 2011-11-16

Family

ID=40806279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101864570A Expired - Fee Related CN101465939B (en) 2007-12-21 2008-12-19 Image processing apparatus, image processing method

Country Status (2)

Country Link
JP (1) JP2009171563A (en)
CN (1) CN101465939B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102006393A (en) * 2010-12-20 2011-04-06 东莞市金翔电器设备有限公司 Large format scanning method capable of realizing automatic image deformation correction
CN105389148B (en) * 2015-11-02 2023-07-04 京东方科技集团股份有限公司 Data transmission method, data receiving method, related equipment and system
CN107065815A (en) * 2017-06-16 2017-08-18 深圳市新太阳数码有限公司 A kind of the elderly's emotion intelligence control system
CN107481187B (en) * 2017-09-29 2022-04-19 康佳集团股份有限公司 Video image processing method, intelligent terminal and storage medium
JP7056176B2 (en) * 2018-01-26 2022-04-19 株式会社リコー Position detection device, image forming device, and position detection method
JP2024024906A (en) 2022-08-10 2024-02-26 コニカミノルタ株式会社 Image reading device, image reading method, and image reading program


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5627571A (en) * 1979-08-14 1981-03-17 Nec Corp High-resolution image pickup unit
US6459823B2 (en) * 1998-10-28 2002-10-01 Hewlett-Packard Company Apparatus and method of increasing scanner resolution
JP3191794B2 (en) * 1999-02-05 2001-07-23 富士ゼロックス株式会社 Image reading device
JP2001086287A (en) * 1999-09-16 2001-03-30 Fuji Xerox Co Ltd Image reader
JP2002094724A (en) * 2000-09-13 2002-03-29 Fuji Photo Film Co Ltd Image reader
JP2004234191A (en) * 2003-01-29 2004-08-19 Fujitsu Ltd Slip reading device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0938064A2 (en) * 1998-02-24 1999-08-25 Sony Corporation Error diffusion image processing method and apparatus
EP1323537A1 (en) * 2000-10-06 2003-07-02 Seiko Epson Corporation Image processing device, printing control device, image processing method, and recorded medium
CN1574879A (en) * 2003-05-27 2005-02-02 精工电子有限公司 Image sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP Laid-Open No. 2006-92450 A 2006.04.06

Also Published As

Publication number Publication date
JP2009171563A (en) 2009-07-30
CN101465939A (en) 2009-06-24


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111116

Termination date: 20191219

CF01 Termination of patent right due to non-payment of annual fee