WO2016035387A1 - Image processing device, image processing method, image reading device, and program


Info

Publication number: WO2016035387A1
Authority: WO (WIPO, PCT)
Application number: PCT/JP2015/063877
Prior art keywords: low frequency, image data, scanning direction
Other languages: English (en), Japanese (ja)
Inventors: まさ子 浅村, 善隆 豊田, 聡 山中
Original assignee: 三菱電機株式会社 (Mitsubishi Electric Corporation)
Application filed by: 三菱電機株式会社
Priority: JP2016546342A (JP6246379B2)
Publication of WO2016035387A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/024: Details of scanning heads; Means for illuminating the original
    • H04N1/028: Details of scanning heads; Means for illuminating the original for picture information pick-up
    • H04N1/04: Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N1/19: Scanning arrangements using multi-element arrays
    • H04N1/387: Composing, repositioning or otherwise geometrically modifying originals
    • H04N1/40: Picture signal circuits

Definitions

  • The present invention relates to an image processing apparatus, an image processing method, and an image reading apparatus that combine image data obtained by scanning an object to be read with a plurality of line sensors to generate composite image data corresponding to the object to be read.
  • The present invention also relates to a program for causing a computer to execute processing for combining the image data.
  • An image reading apparatus that scans an object to be read with line sensors (one-dimensional imaging elements), each having a plurality of imaging elements arranged in a line in the main scanning direction, and generates image data corresponding to the object is widely used in copying machines, scanners, and facsimile machines.
  • In such an apparatus, a first-row line sensor group and a second-row line sensor group, each composed of a plurality of line sensors arranged with spaces between them along the main scanning direction (the direction in which the line sensors are lined up), are arranged so that the ends of adjacent line sensors overlap each other (an overlap).
  • The image data read by each line sensor is image data of a rectangular area, long in the sub-scanning direction, obtained by dividing the document into a plurality of areas in the main scanning direction.
  • The image data read by such an image reading device thus consists of a plurality of image data of rectangular regions, long in the sub-scanning direction, arranged in the main scanning direction, one per line sensor. As processing for the plurality of image data read by the plurality of line sensors, there are known a technique that compares the overlap portion of first image data with each of a plurality of second image data and corrects the image shift in the sub-scanning direction by selecting the second image data whose difference is the smallest (for example, Patent Document 1), and a technique that reads a linear image pattern on an adjustment document, measures the positional shift of adjacent sensors along the sub-scanning direction from the difference in the number of pixels in the overlapping range, and determines the delay time required for synthesis (for example, Patent Document 2).
  • Patent Document 1: JP 2011-50039 A, paragraphs 0040-0041, FIG. 9
  • Patent Document 2: JP 2006-140599 A, paragraphs 0073-0075 and 0081-0084, FIG. 3
  • Conventionally, the image data read by the line sensors are compared with each other, and the positional deviation in the sub-scanning direction between the two image data generated by adjacent line sensors is obtained by detecting the position in the sub-scanning direction where the image data match.
  • However, when there is a level difference between the adjacent line sensors, caused by differences in performance and reading conditions for each line sensor, the comparison result includes that level difference. For this reason, even if the adjacent line sensors read the same image, the difference due to the level difference of the read image is detected in the comparison result and may be erroneously detected as a positional deviation in the sub-scanning direction, so there is a problem that the positional deviation in the sub-scanning direction of the image data generated by the adjacent line sensors cannot be properly detected.
  • The present invention has been made to solve the above-described problems of the prior art. An object of the present invention is to provide an image processing apparatus, an image processing method, and an image reading apparatus that, when detecting the positional deviation in the sub-scanning direction between the image data read by the line sensors in each row, suppress the influence of the level differences caused by differences in performance and reading conditions for each line sensor, detect the positional deviation in the sub-scanning direction without erroneous detection, and can thereby generate high-quality composite image data corresponding to the object to be read, as well as a program for causing a computer to execute processing for combining the image data.
  • An image processing apparatus according to one aspect of the present invention processes image data generated by an imaging unit that has a plurality of line sensors arranged in the main scanning direction in at least two rows at different positions in the sub-scanning direction, the end portions of adjacent line sensors in different rows being arranged so as to have overlap regions occupying the same range in the main scanning direction. The image processing apparatus includes: an image memory that stores image data based on the outputs from the plurality of line sensors; a low-frequency removal unit that, for the image data in the overlap regions of the image data stored in the image memory, removes from the image data in a pixel range including a pixel of interest and the peripheral pixels of the pixel of interest a low-frequency component, which is a frequency component lower than a predetermined frequency, and obtains low-frequency removal reference data at a predetermined position in the sub-scanning direction in the overlap region of the image data from which the low-frequency component has been removed, and low-frequency removal comparison data based on the image data, from which the low-frequency component has been removed, in the region overlapping the overlap region of the low-frequency removal reference data; a similarity calculation unit that performs the process of comparing the low-frequency removal reference data obtained by the low-frequency removal unit with the low-frequency removal comparison data for a plurality of low-frequency removal comparison data at a plurality of positions moved in the sub-scanning direction, and calculates a plurality of similarities between the low-frequency removal reference data and the plurality of low-frequency removal comparison data; a shift amount estimation unit that calculates shift amount data based on the difference between the position in the sub-scanning direction of the low-frequency removal reference data and the position in the sub-scanning direction of the low-frequency removal comparison data having the highest similarity among the plurality of low-frequency removal comparison data; and a combination processing unit that determines, based on the shift amount data calculated by the shift amount estimation unit, the position in the sub-scanning direction of the image data read from the image memory, reads out the image data at the determined position from the image memory, and combines the image data read by the adjacent line sensors in different rows to generate composite image data.
  • An image processing method according to another aspect of the present invention processes image data generated by an imaging unit that has a plurality of line sensors arranged in the main scanning direction in at least two rows at different positions in the sub-scanning direction, the ends of adjacent line sensors in different rows being arranged so as to have overlap regions occupying the same range in the main scanning direction. The image processing method includes: a low-frequency removal step of removing, from the image data in a pixel range including a pixel of interest and its peripheral pixels in the overlap regions of the image data, a low-frequency component, which is a frequency component lower than a predetermined frequency, and obtaining low-frequency removal reference data at a predetermined position in the sub-scanning direction in the overlap region of the image data from which the low-frequency component has been removed, and low-frequency removal comparison data based on the image data, from which the low-frequency component has been removed, in the region overlapping the overlap region of the low-frequency removal reference data; a similarity calculation step of comparing the low-frequency removal reference data with a plurality of low-frequency removal comparison data at a plurality of positions moved in the sub-scanning direction and calculating a plurality of similarities between the low-frequency removal reference data and the plurality of low-frequency removal comparison data; a shift amount estimation step of calculating shift amount data based on the difference between the position in the sub-scanning direction of the low-frequency removal reference data and the position in the sub-scanning direction of the low-frequency removal comparison data having the highest similarity among the plurality of low-frequency removal comparison data; and a combination processing step of determining, based on the shift amount data calculated in the shift amount estimation step, the position in the sub-scanning direction of the image data read from an image memory, reading out the image data at the determined position from the image memory, and combining the image data read by the adjacent line sensors in different rows to generate composite image data.
  • An image reading apparatus according to another aspect of the present invention includes: an imaging unit that has a plurality of line sensors arranged in the main scanning direction in at least two rows at different positions in the sub-scanning direction, the ends of adjacent line sensors in different rows being arranged so as to have overlap regions occupying the same range in the main scanning direction; an image memory that stores image data based on the outputs from the plurality of line sensors; a low-frequency removal unit that, for the image data in the overlap regions of the image data stored in the image memory, removes from the image data in a pixel range including a pixel of interest and the peripheral pixels of the pixel of interest a low-frequency component, which is a frequency component lower than a predetermined frequency, and obtains low-frequency removal reference data at a predetermined position in the sub-scanning direction in the overlap region of the image data from which the low-frequency component has been removed, and low-frequency removal comparison data based on the image data, from which the low-frequency component has been removed, in the region overlapping the overlap region of the low-frequency removal reference data; a similarity calculation unit that performs the process of comparing the low-frequency removal reference data obtained by the low-frequency removal unit with the low-frequency removal comparison data for a plurality of low-frequency removal comparison data at a plurality of positions moved in the sub-scanning direction, and calculates a plurality of similarities between the low-frequency removal reference data and the plurality of low-frequency removal comparison data; a shift amount estimation unit that calculates shift amount data based on the difference between the position in the sub-scanning direction of the low-frequency removal reference data and the position in the sub-scanning direction of the low-frequency removal comparison data having the highest similarity among the plurality of low-frequency removal comparison data; and a combination processing unit that determines the position in the sub-scanning direction of the image data read from the image memory based on the shift amount data calculated by the shift amount estimation unit, reads out the image data at the determined position from the image memory, and combines the image data read by the adjacent line sensors in different rows to generate composite image data.
  • A program according to another aspect of the present invention causes a computer to process image data generated by an imaging unit that has a plurality of line sensors arranged in the main scanning direction in at least two rows at different positions in the sub-scanning direction, the ends of adjacent line sensors in different rows being arranged so as to have overlap regions occupying the same range in the main scanning direction. The program causes the computer to execute: a storage process of storing image data based on the outputs from the plurality of line sensors in an image memory; a low-frequency removal process of removing, for the image data in the overlap regions of the image data stored in the image memory, from the image data in a pixel range including a pixel of interest and the peripheral pixels of the pixel of interest a low-frequency component, which is a frequency component lower than a predetermined frequency, and obtaining low-frequency removal reference data at a predetermined position in the sub-scanning direction in the overlap region of the image data from which the low-frequency component has been removed, and low-frequency removal comparison data based on the image data, from which the low-frequency component has been removed, in the region overlapping the overlap region of the low-frequency removal reference data; a similarity calculation process of performing the process of comparing the low-frequency removal reference data obtained in the low-frequency removal process with the low-frequency removal comparison data for a plurality of low-frequency removal comparison data at a plurality of positions moved in the sub-scanning direction, and calculating a plurality of similarities between the low-frequency removal reference data and the plurality of low-frequency removal comparison data; a shift amount estimation process of calculating shift amount data based on the difference between the position in the sub-scanning direction of the low-frequency removal reference data and the position in the sub-scanning direction of the low-frequency removal comparison data having the highest similarity; and a combining process of determining, based on the shift amount data calculated in the shift amount estimation process, the position in the sub-scanning direction of the image data read from the image memory, reading out the image data at the determined position from the image memory, and combining the image data read by the adjacent line sensors in the different rows to generate composite image data.
  • According to the present invention, the low-frequency component is removed from the image data in the pixel range including the pixel of interest and its peripheral pixels in the overlap region, the low-frequency removal reference data at the predetermined position in the sub-scanning direction in the overlap region and the low-frequency removal comparison data in the region overlapping the overlap region of the low-frequency removal reference data are obtained, and the shift amount data is obtained from the low-frequency removal reference data and the low-frequency removal comparison data.
  • As a result, according to the present invention, it is possible to suppress the influence of the level difference between the line sensors caused by differences in performance and reading conditions for each line sensor, to accurately detect the positional deviation of the image without erroneous detection of the positional deviation in the sub-scanning direction, and to generate high-quality composite image data corresponding to the object to be read.
  • FIG. 1 is a functional block diagram schematically showing the configuration of an image reading apparatus according to Embodiment 1 of the present invention.
  • FIGS. 2(a) and 2(b) are diagrams for explaining the imaging unit: FIG. 2(a) schematically shows the first-row line sensor group and the second-row line sensor group, and FIG. 2(b) shows a document as the object to be read by the first-row and second-row line sensor groups.
  • FIG. 3 is an enlarged schematic plan view of one of the line sensors shown in FIG. 2(a).
  • FIGS. 4(a) to 4(c) are diagrams for explaining the image data stored in the image memory when the document is in close contact with the glass surface.
  • FIGS. 5(a) to 5(c) are diagrams for explaining the image data stored in the image memory when the document is separated from the glass surface.
  • FIG. 6 is a block diagram illustrating a configuration example of the low-frequency removal unit in the image processing apparatus according to Embodiment 1.
  • FIG. 7 is a diagram illustrating an example of the pixel range from which a low-frequency component is extracted in the low-frequency removal unit according to Embodiment 1.
  • FIG. 8 is a diagram for explaining the operation of the low-frequency removal unit and the similarity calculation unit according to Embodiment 1 when the document is in close contact with the glass surface.
  • FIG. 9 is a diagram for explaining the operation of the low-frequency removal unit and the similarity calculation unit according to Embodiment 1 when the document is separated from the glass surface.
  • FIG. 10 is a diagram illustrating the positional relationship of the image data in the overlap region output from the low-frequency removal unit according to Embodiment 1.
  • FIG. 11 is a block diagram illustrating another configuration example of the low-frequency removal unit according to Embodiment 1.
  • FIGS. 12(a) and 12(b) relate to the similarity calculation unit of Embodiment 1: FIG. 12(a) is a diagram for explaining its operation, and FIG. 12(b) is a diagram for explaining the relationship between the calculated similarity data (correlation data) and the positional shift in the sub-scanning direction.
  • FIG. 13 is an explanatory diagram illustrating an example of the combining operation in the combination processing unit according to Embodiment 1.
  • FIG. 14 is an explanatory diagram showing another example of the combining operation in the combination processing unit according to Embodiment 1.
  • FIG. 15 is a diagram conceptually showing the image data output from the combination processing unit.
  • FIGS. 16(a) and 16(b) are diagrams showing an example in which the position of the document changes during conveyance relative to the imaging unit.
  • FIG. 17 is a block diagram illustrating a configuration example of the low-frequency removal unit in an image processing apparatus according to Embodiment 2.
  • FIG. 18 is a block diagram illustrating another configuration example of the low-frequency removal unit according to Embodiment 2.
  • FIG. 19 is a functional block diagram schematically showing the configuration of an image reading apparatus and an image processing apparatus according to Embodiment 3 of the present invention.
  • FIG. 20 is a flowchart schematically illustrating an example of the processing procedure performed by the arithmetic device according to Embodiment 3.
  • FIGS. 21(a) and 21(b) are side views schematically showing the configuration of the imaging unit of an image reading apparatus according to Embodiment 4 of the present invention.
  • FIG. 22 is a diagram illustrating an example of image data read by the imaging unit according to Embodiment 4.
  • FIG. 23 is a diagram for explaining the image data after correction in Embodiment 4.
  • FIG. 1 is a functional block diagram schematically showing a configuration of an image reading apparatus 1 according to Embodiment 1 of the present invention.
  • The image reading apparatus 1 according to Embodiment 1 includes an imaging unit 2, an A/D conversion unit 3, and an image processing unit 4.
  • The image processing unit 4 is the image processing apparatus according to Embodiment 1 (an apparatus that can perform the image processing method according to Embodiment 1) and includes an image memory 41, a low-frequency removal unit 42, a similarity calculation unit 43, a shift amount estimation unit 44, and a combination processing unit 45.
  • The imaging unit 2 includes a first-row line sensor group (indicated as 21O in FIG. 2(a), described later) including a plurality of (e.g., n) first-row line sensors arranged at intervals in a line in the main scanning direction, and a second-row line sensor group (indicated as 21E in FIG. 2(a)) including a plurality of (e.g., n) second-row line sensors arranged at intervals in a line in the main scanning direction. Here, n is a positive integer.
  • The positions of the plurality of first-row line sensors in the main scanning direction face the regions where the plurality of second-row line sensors are not provided, that is, the regions between the second-row line sensors adjacent in the main scanning direction. Likewise, the positions of the plurality of second-row line sensors in the main scanning direction face the regions where the plurality of first-row line sensors are not provided, that is, the regions between the first-row line sensors adjacent in the main scanning direction. In other words, the plurality of first-row line sensors and the plurality of second-row line sensors are arranged in a staggered pattern on the sensor substrate.
  • The first-row line sensors and the second-row line sensors that are adjacent to each other are arranged so that their closest end portions (the end portion sr and the end portion sl in FIG. 2(a)) have overlapping regions (also referred to as "overlap regions") occupying the same range in the main scanning direction.
  • The imaging unit 2 optically reads an image of a document as the object to be read and generates an electrical signal (image data) SI corresponding to the image of the document.
  • The electrical signal (image data) SI generated by the imaging unit 2 includes first image data output from the plurality of first-row line sensors constituting the first-row line sensor group and second image data output from the plurality of second-row line sensors constituting the second-row line sensor group. Note that an optical system such as a lens that forms an erect image on each line sensor group may be provided between the first-row and second-row line sensor groups and the original.
  • The present invention can also be applied to an apparatus having three or more line sensor groups. Further, the present invention can be applied whenever an image of an object to be read is read by an imaging unit having two or more line sensors with an overlap region; it can therefore also be applied to the case where the first-row line sensor group consists of one line sensor and the second-row line sensor group consists of one line sensor.
  • FIGS. 2(a) and 2(b) are diagrams for explaining the imaging unit 2. FIG. 2(a) is a plan view schematically showing the imaging unit 2, and FIG. 2(b) is a plan view showing the original 60 as the object to be read. FIG. 2(a) shows, for example, a state in which the original table glass (hereinafter referred to as the "glass surface") 26 of a copying machine is viewed from above.
  • FIG. 3 is a diagram for explaining the configuration of a line sensor, taking as an example the line sensor 21O1, which is one of the components of the imaging unit 2.
  • The imaging unit 2 includes a sensor substrate 20. A plurality of line sensors are arranged on the sensor substrate 20 in two rows.
  • The line sensors 21O1, ..., 21Ok, ..., 21On located at odd-numbered positions counted from one end (e.g., the left side) are arranged linearly in the main scanning direction (X direction) with intervals between them. The line sensors 21E1, ..., 21Ek, ..., 21En located at even-numbered positions counted from the left end are arranged linearly in the main scanning direction, at positions in the main scanning direction (X direction) different from those of the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On. The even-numbered line sensors 21E1, ..., 21Ek, ..., 21En and the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On are thus arranged in a staggered pattern. Here, n is an integer of 2 or more, and k is an integer from 1 to n.
  • The odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On constitute the first-row line sensor group comprising the plurality of first line sensors, and the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En constitute the second-row line sensor group comprising the plurality of second line sensors. The plurality of first line sensors belonging to the first-row line sensor group are arranged so as to face the intervals in the main scanning direction between the line sensors of the second-row line sensor group, and the plurality of second line sensors belonging to the second-row line sensor group are arranged so as to face the intervals in the main scanning direction between the line sensors of the first-row line sensor group.
  • The adjacent end portions (the end portions sr and sl) of a first line sensor and a second line sensor that are adjacent to each other occupy the same position in the main scanning direction; that is, they have an overlap region in the main scanning direction.
  • The imaging unit 2 is moved in the sub-scanning direction (Y direction) by a transport unit (24 in FIG. 4(a), described later) and reads the document 60 (FIG. 4(a)) as the object to be read. Alternatively, the transport unit 24 may be a device that holds the imaging unit 2 fixed and transports the document 60 in the direction opposite to the sub-scanning direction (-Y direction) to read the object. The sub-scanning direction is the moving direction of the imaging unit 2 (the direction of arrow Dy in FIG. 2(a)), and the main scanning direction is the direction in which the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On, or the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En, are arranged.
  • As shown in FIG. 3, the line sensor 21O1 includes a plurality of red photoelectric conversion elements (R photoelectric conversion elements) 26R that convert the red component of the received light into an electrical signal, a plurality of green photoelectric conversion elements (G photoelectric conversion elements) 26G that convert the green component into an electrical signal, and a plurality of blue photoelectric conversion elements (B photoelectric conversion elements) 26B that convert the blue component into an electrical signal. The plurality of R photoelectric conversion elements 26R, the plurality of G photoelectric conversion elements 26G, and the plurality of B photoelectric conversion elements 26B are each arranged linearly in the main scanning direction (X direction).
  • In the following, the line sensor having the configuration shown in FIG. 3 will be described. However, the present invention may also be configured with black-and-white photoelectric conversion elements that do not distinguish colors arranged in a line.
  • The arrangement of the plurality of R photoelectric conversion elements 26R, the plurality of G photoelectric conversion elements 26G, and the plurality of B photoelectric conversion elements 26B is not limited to the example of FIG. 3.
  • The line sensor 21O1 outputs the received information as an electrical signal SI(O1). Similarly, the line sensors 21E1, 21O2, ..., 21On, 21En output the received information as electrical signals SI(E1), SI(O2), ..., SI(On), SI(En). The electrical signals output from all the line sensors are collectively denoted as the electrical signal SI.
  • The electrical signal SI output from the imaging unit 2 is input to the A/D conversion unit 3.
  • The odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On and the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En read the original 60 with partially overlapping overlap regions A1,1, A1,2, ..., Ak,k, Ak,k+1, Ak+1,k+1, ..., An,n. Details of the overlap regions will be described later.
  • The A/D conversion unit 3 converts the electrical signal SI output from the imaging unit 2 into digital data (image data) DI.
  • The image data DI is input to the image processing unit 4 and stored in the image memory 41 of the image processing unit 4.
  • FIGS. 4(a) to 4(c) are diagrams for explaining the image data DI stored in the image memory 41 in the image processing unit 4.
  • FIG. 4(a) is a schematic side view of the image reading apparatus 1, showing a state in which an apparatus (for example, a copying machine) including the image reading apparatus 1 is viewed from the side. It shows the positional relationship between the document 60 and the line sensors in the case where the document 60 lies on the glass surface 26, where the optical axes 27O of the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On and the optical axes 27E of the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En intersect (that is, viewed in the sub-scanning direction (Y direction), the optical axes 27O and 27E cross each other on the glass surface 26).
  • FIG. 4(b) is a diagram illustrating an example of the document 60.
  • FIG. 4(c) conceptually shows the image data DI corresponding to the document 60 of FIG. 4(b) read when the document 60 and the line sensors are in the positional relationship of FIG. 4(a).
  • The odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On shown in FIG. 2(a) are hereinafter also collectively expressed as the line sensor 21O, and the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En as the line sensor 21E.
  • The reflected light from the original 60, irradiated with light from the illumination light source 25 such as a light-emitting diode (LED), travels toward the line sensor 21O along the optical axis 27O and toward the line sensor 21E along the optical axis 27E. The imaging unit 2, conveyed in the sub-scanning direction (Y direction), sequentially photoelectrically converts the reflected light from the document 60 placed on the glass surface 26 and outputs the converted electrical signal SI. The A/D conversion unit 3 converts the electrical signal SI into the image data DI and outputs it, and the image data DI is stored in the image memory 41 as shown in FIG. 4(c).
  • The image data DI includes the image data DI(O1), ..., DI(Ok), ..., DI(On) generated by the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On and the image data DI(E1), ..., DI(Ek), ..., DI(En) generated by the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En. In FIG. 4(c), the image data DI(Ok) and DI(Ok+1) generated by the odd-numbered line sensors 21Ok and 21Ok+1 and the image data DI(Ek) and DI(Ek+1) generated by the even-numbered line sensors 21Ek and 21Ek+1 are shown.
  • Because of differences in reading conditions for each line sensor, such as performance variations and the illumination distribution, there may be a level difference in the image data read by each line sensor (a brightness difference, i.e., a difference in pixel data values) between the image data DI(O1), ..., DI(Ok), ..., DI(On) generated by the line sensors 21O1, ..., 21Ok, ..., 21On and the image data DI(E1), ..., DI(Ek), ..., DI(En) generated by the line sensors 21E1, ..., 21Ek, ..., 21En. For this reason, even when the same image is read, there is a difference in pixel values (levels) between the image data DI(Ok) and DI(Ek).
  • Next, the overlap regions A1,1, A1,2, ..., Ak,k, Ak,k+1, Ak+1,k+1, ..., An,n will be described.
  • The odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On and the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En read partially overlapping regions (overlap regions) of the document 60.
  • The right end sr of the line sensor 21O1 and the left end sl of the line sensor 21E1 both read the same area of the document 60, namely the area A1,1. The right end sr of the line sensor 21E1 and the left end sl of the line sensor 21O2 both read the area A1,2. Similarly, the right end sr of the line sensor 21Ok and the left end sl of the line sensor 21Ek both read the area Ak,k of the document 60, the right end sr of the line sensor 21Ek and the left end sl of the line sensor 21Ok+1 both read the area Ak,k+1, and the right end sr of the line sensor 21Ok+1 and the left end sl of the line sensor 21Ek+1 both read the area Ak+1,k+1 of the document 60.
  • The image data DI(Ok) corresponding to the line sensor 21Ok includes digital data dr corresponding to the overlap area Ak,k of the document 60, and the image data DI(Ek) corresponding to the line sensor 21Ek includes digital data dl corresponding to the same area Ak,k of the document 60. In the case of FIG. 4, the adjacent digital data dr and dl are data without a positional shift in the sub-scanning direction (Y direction) of the document 60. That is, in the adjacent digital data dr and dl, the position in the sub-scanning direction (Y direction) read by the line sensor 21O and the line sensor 21E, i.e., the line (hereinafter also referred to as the "read line"), is substantially the same, and the amount of positional deviation (the number of lines) in the sub-scanning direction between the image data read by the line sensor 21O and the line sensor 21E is close to zero.
  • However, between the digital data dr corresponding to the overlap area Ak,k included in the image data DI(Ok) of the line sensor 21Ok and the digital data dl corresponding to the area Ak,k included in the image data DI(Ek) of the line sensor 21Ek, a level difference occurs in the values of the image data even though the same area Ak,k of the document 60 is read.
  • FIGS. 5(a) to 5(c) are diagrams for explaining the image data DI stored in the image memory 41 when the document 60 is floating above (separated from) the glass surface 26.
  • FIG. 5(a) is a diagram showing the positional relationship between the document 60 and the line sensors when the document 60 is located at a position different from the position where the optical axes of the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On and the optical axes of the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En intersect.
  • FIG. 5(b) is a diagram showing an example of the original 60.
  • FIG. 5(c) is a diagram conceptually showing the image data DI corresponding to the original 60 shown in FIG. 5(b) when the original 60 and the line sensors are in the positional relationship shown in FIG. 5(a).
  • In the main scanning direction (X direction), each line sensor acquires data at the same position whether the document 60 is floating above the glass surface 26 or in close contact with it. That is, the right end sr of the line sensor 21Ok and the left end sl of the line sensor 21Ek both read the area Ak,k of the document 60, the right end sr of the line sensor 21Ek and the left end sl of the line sensor 21Ok+1 both read the area Ak,k+1, and the right end sr of the line sensor 21Ok+1 and the left end sl of the line sensor 21Ek+1 both read the area Ak+1,k+1 of the document 60.
  • However, the position where the optical axes 27O of the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On intersect the document 60 differs, in the sub-scanning direction, from the position where the optical axes 27E of the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En intersect the document 60. Therefore, when the document 60 is floating above the glass surface 26, the reading position in the sub-scanning direction (Y direction) differs for each line sensor. This is because the even-numbered line sensors 21E1, ..., 21Ek, ..., 21En acquire an image of the same position with a temporal delay compared with the odd-numbered line sensors 21O1, ..., 21Ok, ..., 21On. Accordingly, as shown in FIG. 5(c), the image data DI(Ek) and DI(Ek+1) corresponding to the even-numbered line sensors 21Ek and 21Ek+1 are shifted in position (lines) in the sub-scanning direction relative to the image data DI(Ok) and DI(Ok+1) corresponding to the odd-numbered line sensors 21Ok and 21Ok+1 (in the drawing, the image data DI(Ek) and DI(Ek+1) are shifted downward), and are stored in this state in the image memory 41.
  • Furthermore, because there are differences in reading conditions for each line sensor, such as performance variations and the illumination distribution, there may be a level difference between the image data DI(Ok) read by the line sensor 21O and the image data DI(Ek) read by the line sensor 21E.
  • The image processing unit 4 of the image reading apparatus 1 includes the image memory 41 and the low-frequency removal unit 42.
  • The image memory 41 stores the first image data and the second image data generated from the outputs of the plurality of line sensors in the first-row line sensor group and the second-row line sensor group.
  • The low-frequency removal unit 42 processes the image data in the overlap regions of the image data DI stored in the image memory 41. That is, the low-frequency removal unit 42 removes a low-frequency component from the image data in the vicinity of a pixel of interest in the overlap region (that is, in a pixel range including the pixel of interest and the peripheral pixels of the pixel of interest), and outputs reference data at a predetermined position in the sub-scanning direction (also referred to as a predetermined sub-scanning position or a predetermined line) and comparison data of a predetermined number of lines in the region overlapping the reference data. The reference data and the comparison data output from the low-frequency removal unit 42 are also referred to as the low-frequency removal reference data and the low-frequency removal comparison data.
  • Here, "the vicinity of the pixel of interest" means a range (pixel range) including the pixel of interest and the peripheral pixels within a predetermined range around the pixel of interest. The "predetermined position in the sub-scanning direction" means a position in the sub-scanning direction that is determined in advance. The "low-frequency component" means a component having a frequency lower than a preset (predetermined) frequency.
  • The image processing unit 4 also includes the similarity calculation unit 43, which performs the process of comparing the low-frequency removal reference data generated by the low-frequency component removal processing in the low-frequency removal unit 42 with the low-frequency removal comparison data at a plurality of positions in the sub-scanning direction (that is, with low-frequency removal comparison data of a plurality of lines), and calculates the similarities between the low-frequency removal reference data and these low-frequency removal comparison data.
  • The image processing unit 4 further includes the shift amount estimation unit 44, which calculates the shift amount data from the calculated similarities, and the combination processing unit 45, which changes the position in the sub-scanning direction based on the shift amount data, reads the image data from the image memory 41, and combines the first image data and the second image data.
  • In short, in the image processing unit 4, the image data in the overlap regions of the image data DI stored in the image memory 41 is processed as follows: the low-frequency component is removed, the low-frequency removal reference data in the overlap region from which the low-frequency component has been removed is compared with a plurality of low-frequency removal comparison data, a plurality of similarities is calculated, and the shift amount data is calculated from the position in the sub-scanning direction of the low-frequency removal comparison data having the highest similarity. Then, the combination processing unit 45 in the image processing unit 4 reads out the image data from the image memory 41 based on the calculated shift amount data and combines the read image data to generate composite image data.
  • Hereinafter, the configuration and operation of the image processing unit 4 will be described more specifically.
  • The low-frequency removal unit 42 of the image processing unit 4 reads from the image memory 41, as input, the image data rM in the overlap regions of the image data DI (dr or dl in FIGS. 4 and 5) in the lines at and around a predetermined position yn in the sub-scanning direction (Y direction). The image data rM in the overlap region includes the image data rMO (corresponding to the low-frequency removal reference data) of the image data DI in the overlap region in the image memory 41 and the image data rME (corresponding to the low-frequency removal comparison data) located in the same overlap region read in duplicate with the image data rMO.
  • The low-frequency removal unit 42 processes the image data rM in the overlap region; that is, it removes the low-frequency component of the image data from the image data in the pixel range including the pixel of interest and its peripheral pixels in the overlap region, and generates the low-frequency-removed image data hM (the low-frequency removal reference data hMO and the low-frequency removal comparison data hME) from which the low-frequency component has been removed.
  • Of the low-frequency-removed image data hM, the low-frequency removal unit 42 takes the data at the position Ym in the sub-scanning direction (Y direction) and its surroundings (at one-line intervals in the Y direction and for each position in the X direction) as the low-frequency removal reference data MO, Ym being the reference line, that is, the position in the sub-scanning direction of the reference data in the overlap region. For the same overlap region read in duplicate, the low-frequency-removed image data at the lines within a predetermined search range "-y to +y" centered on the reference line Ym (the range from (Ym - y) to (Ym + y)) and their surroundings is taken as the low-frequency removal comparison data ME. The low-frequency removal reference data MO and the low-frequency removal comparison data ME of the predetermined number of lines are then output.
  • As described above, the image data DI read by the line sensors can have a level difference between the line sensors (a brightness difference, i.e., a difference in pixel data values). Unlike random noise generated in each pixel, or patterns and edges in the read image that change in units of one line or one pixel in the sub-scanning or main scanning direction, the change in the values of the image data acquired by each line sensor that causes this level difference appears as a low-frequency component that changes relatively slowly in the sub-scanning direction or the main scanning direction. This low-frequency component can be treated as, for example, a change in the values of the image data that varies over a cycle longer than several lines (three or more lines) in the sub-scanning direction.
  • Since the low-frequency removal unit 42 removes this slowly changing low-frequency component from the image data rM in the overlap region, the image processing unit 4 can obtain the low-frequency removal reference data MO and the low-frequency removal comparison data ME of the same overlap region, read in duplicate, as image data in which the change in the values of the image data DI due to differences in performance and reading conditions for each line sensor, that is, the influence of the level difference, is suppressed.
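  • As an illustration only, the following short Python example demonstrates the effect described above; the data values, the window handling, and the function name are invented for this sketch and are not taken from the patent. A constant level offset between two sensors' readings of the same overlap strip dominates a direct comparison, but disappears once the 9-line average (the low-frequency component) is subtracted, while a one-line edge is preserved.

      import numpy as np

      # Same 16-line strip of an overlap region as seen by two line sensors;
      # sensor 21E reads it with a constant level offset of +20 (illustrative).
      strip = np.array([50.0] * 7 + [200.0] + [50.0] * 8)  # one-line edge at line 7
      read_o = strip          # digital data d_r from line sensor 21O
      read_e = strip + 20.0   # digital data d_l from line sensor 21E (level difference)

      def highpass(col, window=9):
          # Subtract the 9-line average in the sub-scanning direction
          # (the low-frequency component adc) from each line.
          pad = window // 2
          padded = np.pad(col, pad, mode="edge")
          adc = np.convolve(padded, np.ones(window) / window, mode="valid")
          return col - adc

      print(np.abs(read_o - read_e).max())                      # 20.0: the offset dominates
      print(np.abs(highpass(read_o) - highpass(read_e)).max())  # 0.0: the offset is removed
      print(np.abs(highpass(read_o)).max())                     # ~133: the edge is preserved

  In this toy case a raw comparison would report a mismatch everywhere because of the level difference, whereas after low-frequency removal the two strips agree exactly and only the genuine image structure (the edge) remains for the similarity comparison.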
  • The low-frequency removal unit 42 is configured as shown in FIG. 6, for example. The low-frequency removal unit 42 processes the image data rM in the overlap region read from the image memory 41; that is, it generates the low-frequency-removed image data hM by removing the low-frequency component from the image data rM. The low-frequency removal reference data hMO and the low-frequency removal comparison data hME are temporarily stored in the overlap MO image memory 424 and the overlap ME image memory 425, respectively, and the low-frequency removal reference data MO and the low-frequency removal comparison data ME of the predetermined number of lines are then read out and output.
  • The low-frequency removal unit 42 includes an overlap low-frequency removal unit 421, the overlap MO image memory 424, and the overlap ME image memory 425. The overlap low-frequency removal unit 421 includes a low-frequency component extraction unit 422, which extracts the low-frequency component, and a subtraction unit 423.
  • The overlap low-frequency removal unit 421 in the low-frequency removal unit 42 reads the image data rM (the reference image data rMO and the comparison image data rME) in the overlap region from the image memory 41. The overlap low-frequency removal unit 421 processes the image data rM in the overlap region; that is, it generates the low-frequency-removed image data hM by removing the low-frequency component of the image from the image data rM.
  • The low-frequency-removed image data hM is image data that, with the low-frequency component removed, is constituted by values carrying the change information of the patterns, edges, and contours that change in units of one line or one pixel in the sub-scanning or main scanning direction; for example, it is signed image data, or image data converted into positive values by a predetermined offset value.
  • FIG. 7 shows an example of the pixel range from which the low-frequency removal unit 42 extracts the low-frequency component, together with the positional relationship of this low-frequency component extraction pixel range.
  • As shown in FIG. 7, the range from which the overlap low-frequency removal unit 421 extracts the low-frequency component of the image (the low-frequency component extraction range) is, for example, the pixel range b2 centered on the pixel of interest P0 in the overlap region at the predetermined position yn in the sub-scanning direction in the image memory 41 and composed of nine pixels: nine lines in the sub-scanning direction by one pixel in the main scanning direction.
  • The overlap low-frequency removal unit 421 obtains the low-frequency component over the range of nine lines in the sub-scanning direction (the low-frequency component extraction line range) and generates the low-frequency-removed image data hM, from which the low-frequency component in the sub-scanning direction has been removed.
  • The low-frequency component extraction range is not limited to the range shown in FIG. 7 (the nine lines from line yn+4 to line yn-4, that is, nine lines in the sub-scanning direction by one pixel in the main scanning direction, nine pixels in total). Any other range may be used as long as the slowly changing low-frequency component that leads to the level difference between the line sensors, for example a component that varies over a cycle of several lines in the sub-scanning direction, can be extracted from the values of the image data acquired by each line sensor. For example, the low-frequency component extraction range may be a pixel range centered on the pixel of interest P0 in the overlap region at the predetermined position yn in the sub-scanning direction of the image memory 41 and composed of u lines in the sub-scanning direction by v pixels in the main scanning direction, (u x v) pixels in total, where u is an integer of 2 or more and v is an integer of 1 or more.
  • In detecting the positional deviation between the line sensors in the sub-scanning direction, the main element is generally the determination of the coincidence of horizontal lines or diagonal lines having the same values in the main scanning direction (edges in the sub-scanning direction, i.e., high-frequency parts); the coincidence of the images in the sub-scanning direction cannot be determined from vertical lines alone, which are edges in the main scanning direction. Therefore, even if the low-frequency component is extracted only in the sub-scanning direction, the extraction accuracy is unlikely to decrease. Moreover, compared with the case where the low-frequency component is obtained in two dimensions, the arithmetic processing is easier, the hardware configuration can be simplified, and the circuit scale can be reduced.
  • The image data rM in the overlap region that is input to the overlap low-frequency removal unit 421 is fed to both the low-frequency component extraction unit 422 and the subtraction unit 423.
  • Within the low-frequency component extraction range b2 of nine lines in the sub-scanning direction centered on the pixel of interest P0 in the image data rM of the overlap region (the range surrounded by the broken line in FIG. 7), the low-frequency component extraction unit 422 in the overlap low-frequency removal unit 421 extracts the low-frequency component adc, which is the low-frequency component in the sub-scanning direction, from the image data in the vicinity of the pixel of interest P0 (or a position corresponding thereto), and outputs it.
  • The low-frequency component adc is extracted in the low-frequency component extraction unit 422 by, for example, calculating the average of the pixel values of the pixels at the positions corresponding to the low-frequency component extraction range b2. By averaging over the low-frequency component extraction range b2, the pixel level that exists on average across the nine lines in the sub-scanning direction, that is, the low-frequency component within the nine lines, can be extracted. Here, in the low-frequency component extraction range b2 shown in FIG. 7, the average of the pixel values of nine pixels (nine lines) is obtained, so the sum of the pixel values is multiplied by 1/9. To simplify the calculation, the low-frequency component extraction unit 422 may be configured to divide by 8 (2 to the 3rd power) or 4 (2 to the 2nd power) instead of 9.
  • The process performed by the low-frequency component extraction unit 422 may be any other process as long as an index reflecting the average value is obtained as the low-frequency component. The low-frequency component extraction unit 422 can also be configured to perform a conversion such as gain adjustment by multiplying the average value by a predetermined coefficient. Although the case where the low-frequency component adc is obtained as the simple average of the pixel values has been described, the low-frequency component extraction unit 422 may instead perform weighted averaging according to the position of each pixel relative to the pixel of interest P0 (the line interval in FIG. 7), for example weighting and adding the pixels so that the weight is large for pixels close to the pixel of interest P0 and gradually decreases with distance from it.
  • Alternatively, the low-frequency component extraction unit 422 can be configured as a low-pass filter (LPF) or the like, and the value obtained by applying LPF processing to the pixels across the lines can be obtained as the low-frequency component adc. In this case, it is only necessary to set filter coefficients that extract the slowly changing low-frequency component, so that the transmitted low-frequency component corresponds to a fraction of the line frequency (for example, 1/9 in the case of FIG. 7).
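  • As an illustration only, the alternatives described above differ solely in the kernel used to extract the low-frequency component adc; the following sketch shows a plain 9-line average, a power-of-two division, and a weighted average. The specific weight values and the function name are invented for illustration and are not taken from the patent.

      import numpy as np

      # Three possible 9-line extraction kernels for the low-frequency component adc.
      mean_kernel = np.ones(9) / 9   # exact average over the 9 lines
      pow2_kernel = np.ones(9) / 8   # divide by 8 (2 to the 3rd power) to simplify hardware
      w = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0, 1.0])
      weighted_kernel = w / w.sum()  # larger weight near the pixel of interest P0

      def extract_adc(column, kernel):
          # Low-frequency component adc at each line (role of unit 422); `column`
          # holds one main-scanning position of the overlap region over the lines.
          pad = len(kernel) // 2
          padded = np.pad(column.astype(float), pad, mode="edge")
          return np.convolve(padded, kernel, mode="valid")

  An LPF realization would simply use longer or differently shaped coefficients, chosen so that only components varying over several line cycles pass.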
  • To the subtraction unit 423, the pixel of interest P0 in the center line of the image data rM in the overlap region (the read line yn in the sub-scanning direction in FIG. 7) and the low-frequency component adc from the low-frequency component extraction unit 422 are input.
  • The subtraction unit 423 of the overlap low-frequency removal unit 421 processes the image data rM; that is, it subtracts the low-frequency component adc supplied from the low-frequency component extraction unit 422 from the image data of the pixel of interest P0 in the image data rM. The low-frequency component of the image in the vicinity of the pixel of interest P0 is thereby removed from the image data rM, and the low-frequency-removed image data hM is generated. Since the low-frequency-removed image data hM is generated by subtracting the low-frequency component adc from the image data rM, it becomes image data whose values carry the change information of edges and contours.
  • The low-frequency-removed image data hM can be handled as signed values, or may be converted into positive values by a predetermined offset value (or converted into absolute values). Note, however, that the absolute value of the subtraction result is a value indicating only the presence or absence of edge and contour information and does not indicate the direction of change; when the direction of change matters, the sign (the direction of change) needs to be held together with the value.
  • Since the image data rM in the overlap region that is input to the overlap low-frequency removal unit 421 includes the image data rMO (corresponding to the reference data) of the image data DI in the overlap region and the image data rME (corresponding to the comparison data) located in the same overlap region read in duplicate with the image data rMO, the low-frequency-removed image data hM output from the subtraction unit 423 likewise includes the low-frequency removal reference data hMO, obtained by removing the low-frequency component from the image data rMO serving as the reference data, and the low-frequency removal comparison data hME, obtained by removing the low-frequency component from the image data rME serving as the comparison data.
  • In this way, the overlap low-frequency removal unit 421 shown in FIG. 6 processes the image data rM in the overlap region; that is, it generates and outputs the low-frequency-removed image data hM from which the low-frequency component of the image has been removed. The low-frequency-removed image data hM from the overlap low-frequency removal unit 421 is input to the overlap MO image memory 424 and the overlap ME image memory 425 and temporarily stored there.
  • Since the low-frequency-removed image data hM from the overlap low-frequency removal unit 421 includes the low-frequency removal reference data hMO and the low-frequency removal comparison data hME, obtained by removing the low-frequency component from the image data rMO serving as the reference data and from the image data rME serving as the comparison data, respectively, the low-frequency removal reference data hMO is stored in the overlap MO image memory 424 and the low-frequency removal comparison data hME in the overlap ME image memory 425, in accordance with the position of the overlap region (the processing timing) in the low-frequency-removed image data hM.
  • the low-frequency removal reference data MO having a predetermined number of lines and the low-frequency removal comparison are matched with the processing timing in the next similarity calculation unit 43.
  • Data ME is read and output.
  • from the low-frequency removal reference data hMO, the data at a predetermined position Y_m in the sub-scanning direction (Y direction) and the surrounding data (at one-line spacing in the Y direction and along the X direction) are read out as the low-frequency removal reference data MO.
  • the low-frequency removal reference data MO read from the overlap MO image memory 424 and the low-frequency removal comparison data ME read from the overlap ME image memory 425 are data of a predetermined number of lines from which the low-frequency components have been removed, and they are sent from the low-frequency removal unit 42 to the similarity calculation unit 43.
  • the predetermined search range "-y to +y" is the range of values that the positional deviation amount dY in the sub-scanning direction (the shift amount) can take with respect to the center line position of the low-frequency removal reference data MO (low-frequency removal reference data hMO).
  • the search range "-y to +y" covers the positional deviation in the sub-scanning direction that may occur, for example, because of the distance between the original 60 and the glass surface or because of the optical axes when the original 60 is read by the imaging unit 2 of the image reading apparatus 1.
  • in other words, the search range "-y to +y" is the range in which the position (line) in the sub-scanning direction by which the image data is to be shifted in the combination process, that is, the positional deviation in the sub-scanning direction, is detected; the positional deviation amount dY takes a value within the range from -y to +y.
  • the low-frequency removal unit 42 in FIG. 6 generates the low-frequency-removed image data hM (the low-frequency removal reference data hMO and the low-frequency removal comparison data hME) by removing the low-frequency components from the image data rM in the overlap region.
  • within the low-frequency-removed image data hM, the image data of the line Y_m in the overlap region (shown in FIGS. 8 to 10 described later) and its surroundings is output as the low-frequency removal reference data MO for the search range "-y to +y".
  • further, for the lines of the corresponding overlap region within the search range "-y to +y" centered on the line Y_m (that is, from the line (Y_m - y) to the line (Y_m + y)), the low-frequency removal unit 42 obtains the image data of each line and its surroundings as the low-frequency removal comparison data hME and outputs it as the low-frequency removal comparison data ME, one set for each line within the search range "-y to +y".
  • the search range for detecting the positional deviation amount dY in the sub-scanning direction at the output of the low-frequency removal unit 42 thus spans the lines from (Y_m - y) to (Y_m + y), and the low-frequency removal reference data MO together with as many sets of low-frequency removal comparison data ME as there are lines in the search range "-y to +y" is sent to the similarity calculation unit 43.
  • in this way, the low-frequency removal unit 42 removes the low-frequency components from the image data rM in the overlap region in the overlap low-frequency removal unit 421 and can output the low-frequency-removed image data hM (the low-frequency removal reference data hMO and the low-frequency removal comparison data hME) as the low-frequency removal reference data MO and the low-frequency removal comparison data ME.
  • the low-frequency removal unit 42 therefore generates the low-frequency removal reference data MO and the low-frequency removal comparison data ME of the same overlap region, which is read in duplicate, as image data in which the influence of the differences in performance and reading conditions of the individual line sensors is suppressed.
  • moreover, since the output low-frequency-removed image data is the low-frequency removal reference data MO and the low-frequency removal comparison data ME from which the low-frequency component in the sub-scanning direction has been removed, the hardware configuration can be simplified, the arithmetic processing can be facilitated, and the arithmetic scale can be reduced without lowering the accuracy of the extraction of the low-frequency components.
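  • a minimal sketch of this read-out stage follows, reusing the hM arrays produced by the remove_low_frequency sketch above; the names Ym, y, and bh follow FIGS. 8 to 10, and the dictionary return type is an illustrative choice:

```python
import numpy as np

def extract_mo_me(hMO: np.ndarray, hME: np.ndarray, Ym: int, y: int, bh: int):
    """Cut out the reference block MO and the comparison blocks ME.

    hMO / hME: low-frequency-removed reference / comparison data of one
    overlap region, shape (lines, overlap_width). Returns the bh-line block
    MO centred on the reference line Ym, and a dict mapping each positional
    deviation dY in -y..+y to the equally sized comparison block ME.
    Assumes Ym lies far enough from the array borders.
    """
    half = bh // 2
    MO = hMO[Ym - half: Ym + half + 1, :]
    ME = {dY: hME[Ym + dY - half: Ym + dY + half + 1, :]
          for dY in range(-y, y + 1)}
    return MO, ME
```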
  • FIG. 8 is a diagram for explaining the operation of the low-frequency removal unit 42 and the similarity calculation unit 43 when the document is in close contact with the glass surface, and FIG. 9 is a diagram for explaining the operation of the low-frequency removal unit 42 and the similarity calculation unit 43 when the document is separated from the glass surface.
  • FIGS. 8 and 9 show the positions on the overlap region of the low-frequency-removed image data that the low-frequency removal unit 42 obtains by reading the image data rM of the overlap region from the image memory and removing the low-frequency components from the read image data rM.
  • FIG. 8 corresponds to FIG. 4C (the case where the original 60 is in close contact with the glass surface) and shows the positional relationship on the image data DI between the low-frequency removal reference data MO and the low-frequency removal comparison data ME of the low-frequency-removed image data.
  • the low-frequency removal reference data MO is the low-frequency-removed image data obtained by removing the low-frequency components from the image data at the position Y_m (line Y_m) in the sub-scanning direction (Y direction), and the low-frequency removal comparison data ME is the low-frequency-removed image data obtained by removing the low-frequency components from the image data whose positional deviation amount dY takes a value within the range from -y to +y.
  • FIG. 9 corresponds to FIG. 5C (the case where the document 60 is separated from the glass surface) and shows the positional relationship between the low-frequency removal reference data MO, which is the low-frequency-removed image data obtained by removing the low-frequency components from the image data at the position Y_m (line Y_m) in the sub-scanning direction (Y direction), and the low-frequency removal comparison data ME, which is obtained from the image data whose positional deviation amount dY takes a value in the range from -y to +y.
  • the image data rM of the overlap region read from the image memory 41 is input to the overlap low-frequency removal unit 421 of the low-frequency removal unit 42.
  • the low-frequency removal reference data MO, which is the low-frequency-removed image data at the position Y_m in the sub-scanning direction (Y direction), is the low-frequency-removed image data corresponding to the image data of the line Y_m and its surroundings in the overlap region d_r of the image data DI(O_k) corresponding to the line sensor 21O_k; it is denoted MO(O_k, d_r, Y_m) in FIGS. 8 and 9.
  • the low-frequency removal comparison data ME, which is obtained by removing the low-frequency components from the image data whose positional deviation amount dY takes a value in the range from -y to +y, is the low-frequency-removed image data corresponding to the lines in the search range "-y to +y" centered on the line Y_m and their surroundings in the overlap region d_l of the image data DI(E_k) corresponding to the line sensor 21E_k.
  • because the low-frequency removal comparison data ME corresponds to the image data of all the lines in the search range "-y to +y", it extends more widely in the sub-scanning direction (Y direction) than the low-frequency removal reference data MO.
  • FIG. 10 is a diagram for explaining in more detail the positional relationship in the sub-scanning direction between the low-frequency removal reference data MO having the predetermined number of lines output from the low-frequency removal unit 42 and the low-frequency removal comparison data ME.
  • FIG. 10 shows the positional relationship between the low-frequency removal reference data MO and the low-frequency removal comparison data ME at the output of the low-frequency removal unit 42 in the sub-scanning direction (the line arrangement direction).
  • the low-frequency removal unit 42 takes the line Y_m as the reference line, removes the low-frequency components from the image data rM in the overlap region, and generates the low-frequency removal reference data MO and the low-frequency removal comparison data ME.
  • the low-frequency removal reference data MO and the low-frequency removal comparison data ME are each composed of the data of a plurality of pixels around the detection reference line Y_m.
  • in the low-frequency removal unit 42, the low-frequency removal reference data MO(O_k, d_r) and the low-frequency removal comparison data ME(E_k, d_l) are stored around the detection reference line Y_m in the overlap MO image memory 424 and the overlap ME image memory 425, respectively.
  • the low-frequency removal unit 42 then reads the image data from the overlap MO image memory 424 and the overlap ME image memory 425 and outputs the low-frequency removal reference data MO and the low-frequency removal comparison data ME.
  • the low-frequency removal reference data MO(O_k, d_r) is the image data MO(O_k, d_r, Y_m) of the low-frequency removal reference data hMO centered on the reference line Y_m and having a width of bh lines in the sub-scanning direction.
  • the low-frequency removal comparison data ME(E_k, d_l) consists of the image data ME(E_k, d_l, dY) of each line in the search range "-y to +y" centered on the reference line Y_m in the overlap region d_l of the digital data DI(E_k), each line having the same width bh as MO(O_k, d_r, Y_m).
  • that is, the image data ME(E_k, d_l, 0) at the reference line, the data ME(E_k, d_l, 1) shifted by one line, ..., the data ME(E_k, d_l, y) shifted by y lines, and the data ME(E_k, d_l, -y) shifted by y lines in the reverse direction are output from the low-frequency removal unit 42. Here, y is a positive integer indicating a number of lines.
  • the width bh may also consist of only the single center line.
  • the image processing unit 4 may include the low-frequency removal unit 42b shown in FIG. 11 instead of the low-frequency removal unit 42 shown in FIG. 6.
  • in this case, the image data rM of the overlap region in the image memory 41 is read out separately as the image data rMO serving as the reference data and the image data rME serving as the comparison data, and processing equivalent to that of the overlap low-frequency removal unit 421 in FIG. 6 is performed on each set of image data to obtain the low-frequency removal reference data hMO and the low-frequency removal comparison data hME.
  • the low-frequency removal unit 42b shown in FIG. 11 differs from the low-frequency removal unit 42 of FIG. 6, which includes the single overlap low-frequency removal unit 421, in that it includes the two overlap low-frequency removal units 4211 and 4212.
  • the reference image data rMO and the comparison image data rME of the image data rM in the overlap region are input to the overlap low-frequency removal units 4211 and 4212 shown in FIG. 11, respectively.
  • for the reference image data rMO in the image data rM of the overlap region, the overlap low-frequency removal unit 4211 removes the low-frequency component of the image from the image data and outputs the low-frequency removal reference data hMO.
  • for the comparison image data rME in the image data rM of the overlap region, the overlap low-frequency removal unit 4212 removes the low-frequency component of the image from the image data and outputs the low-frequency removal comparison data hME.
  • the low-frequency removal reference data hMO is stored in the overlap MO image memory 424, and the low-frequency removal comparison data hME is stored in the overlap ME image memory 425; the low-frequency removal reference data MO and the low-frequency removal comparison data ME having the predetermined number of lines, from which the low-frequency components have been removed, are then read out and output as the outputs of the low-frequency removal unit 42b. Except for the points described above, the low-frequency removal unit 42b shown in FIG. 11 is the same as the low-frequency removal unit 42 shown in FIG. 6.
  • the similarity calculation unit 43 in the image processing unit 4 receives the low-frequency removal reference data MO and the low-frequency removal comparison data ME, which are the low-frequency-removed image data from which the low-frequency components have been removed by the low-frequency removal unit 42.
  • the low-frequency removal comparison data ME consists of one set of low-frequency removal comparison data for each line within the search range "-y to +y" of the positional deviation amount dY.
  • the similarity calculation unit 43 compares the low-frequency removal reference data MO with the low-frequency removal comparison data ME for the plurality of positions at which the positional deviation amount dY falls within the range from -y to +y, and calculates and generates the correlation data D43, which represents the similarity.
  • FIGS. 12A and 12B are diagrams for explaining the operation of the similarity calculation unit 43; the low-frequency removal reference data MO(O_k, d_r) around the detection reference line Y_m and the low-frequency removal comparison data ME(E_k, d_l) are input to the similarity calculation unit 43.
  • for the input low-frequency removal reference data MO(O_k, d_r), the similarity calculation unit 43 first obtains the low-frequency removal comparison data ME(E_k, d_l, dY), in which the positional deviation amount dY in the sub-scanning direction with respect to the low-frequency removal reference data MO(O_k, d_r) takes a value within the search range "-y to +y".
  • the similarity calculation unit 43 calculates the similarity between the low-frequency removal reference data MO(O_k, d_r) and each of the image data ME(E_k, d_l, -y) to ME(E_k, d_l, y) of the low-frequency removal comparison data ME(E_k, d_l) and outputs the correlation data D43(O_k, E_k).
  • the low-frequency removal reference data MO(O_k, d_r) and each of the low-frequency removal comparison data ME(E_k, d_l, -y) to ME(E_k, d_l, y) have the same size in the sub-scanning direction.
  • as the similarity between the low-frequency removal reference data MO(O_k, d_r) and the low-frequency removal comparison data ME(E_k, d_l, -y) to ME(E_k, d_l, y), the sum of the absolute values of the pixel-by-pixel differences or the sum of the squares of the pixel-by-pixel differences is calculated and output as the correlation data D43(O_k, E_k, dY).
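  • a minimal sketch of this comparison, assuming the MO block and the dictionary of ME blocks produced by the extract_mo_me sketch above; whether SAD or SSD is used in an actual device is a design choice:

```python
import numpy as np

def correlation_data(MO: np.ndarray, ME: dict, use_ssd: bool = False) -> dict:
    """Similarity calculation: compare MO(Ok, dr) with each shifted block
    ME(Ek, dl, dY) and return the correlation data D43(Ok, Ek, dY) for every
    dY in the search range. A smaller value means a higher similarity.
    """
    d43 = {}
    for dY, block in ME.items():
        diff = MO.astype(float) - block.astype(float)
        # SAD (sum of absolute differences) or SSD (sum of squared differences)
        d43[dY] = float(np.sum(diff * diff)) if use_ssd else float(np.sum(np.abs(diff)))
    return d43
```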
  • the similarity calculation unit 43 then sequentially calculates, in the same way, the similarity between the next low-frequency removal reference data MO(O_k+1, d_l) and the low-frequency removal comparison data ME(E_k, d_r), between the low-frequency removal reference data MO(O_k+1, d_r) and the low-frequency removal comparison data ME(E_k+1, d_l), and so on, and generates the correlation data D43(O_k+1, E_k, dY), D43(O_k+1, E_k+1, dY), ....
  • FIG. 12A shows how the image data is shifted line by line, as the positional deviation amount dY in the sub-scanning direction, within the search range "-y to +y" of the low-frequency removal comparison data ME(E_k, d_l); in the figure, ME denotes the low-frequency removal comparison data and D43 the similarity.
  • FIG. 12B shows an example of the values of the correlation data D43(O_k, E_k, dY) over the search range "-y to +y" of the positional deviation amount dY, obtained when the similarity calculation unit 43 calculates the similarity between the low-frequency removal reference data MO(O_k, d_r) centered on the detection reference line Y_m and the image data ME(E_k, d_l, -y) to ME(E_k, d_l, y) of the low-frequency removal comparison data ME(E_k, d_l); the broken line is the correlation data D43(O_k, E_k, dY) corresponding to FIG. 9.
  • the similarity calculation unit 43 calculates the correlation data D43 as the similarity for the plurality of positions at which the positional deviation amount dY falls within the range from -y to +y, based on the low-frequency removal reference data MO and the low-frequency removal comparison data ME from the low-frequency removal unit 42; the influence of the level difference caused by the differences in performance and reading conditions of the individual line sensors is therefore excluded from the correlation data D43. Accordingly, with the image processing apparatus according to the first embodiment, correlation data D43 can be obtained from the similarity of the low-frequency-removed image data, that is, from the change information in the contours and edges of the image data pattern.
  • the correlation data D43 for the positional deviation amounts dY of the plurality of lines within the range "-y to +y" is input from the similarity calculation unit 43 to the shift amount estimation unit 44.
  • the shift amount estimation unit 44 outputs, as the shift amount data dyb, the positional deviation amount dY corresponding to the data with the highest similarity among the correlation data D43 of the plurality of lines to the combination processing unit 45.
  • because the correlation data D43 from the similarity calculation unit 43 is calculated from the similarity of the low-frequency-removed image data, the influence of level differences caused by the differences in performance and reading conditions of the individual line sensors is excluded; when the shift amount estimation unit 44 obtains the positional deviation amount dY corresponding to the data with the highest similarity among the correlation data D43 and thereby detects the positional deviation in the sub-scanning direction, the shift amount data dyb corresponding to the deviation in the sub-scanning direction can therefore be obtained accurately and without erroneous detection, while the influence of the level difference of each line sensor is suppressed.
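  • as a minimal sketch, assuming the SAD/SSD-style correlation dictionary returned by the correlation_data sketch above (where a smaller value means a higher similarity), the shift amount estimation reduces to an argmin over the search range:

```python
def estimate_shift(d43: dict) -> int:
    """Pick the positional deviation dY whose correlation data indicates
    the highest similarity, i.e. the smallest SAD/SSD value; the result
    plays the role of the shift amount data dyb (in lines)."""
    return min(d43, key=d43.get)
```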
  • the shift amount data dyb from the shift amount estimation unit 44 is input to the combination processing unit 45 in the image processing unit 4.
  • the combination processing unit 45 calculates the read position RP of the image data based on the shift amount data dyb, reads the image data M45 corresponding to the read position, shifts the image data based on the shift amount data dyb, and combines the data of the overlapping regions of the image data.
  • FIG. 13 and FIGS. 14A and 14B are diagrams for explaining the operation of the combination processing unit 45.
  • as shown in FIG. 13, the combination processing unit 45 can generate the combined image data by leaving the image data DI(O_k) corresponding to the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n unshifted, shifting the image data DI(E_k) corresponding to the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n in the sub-scanning direction by the shift amount data dyb, and then combining the odd-numbered image data with the even-numbered image data.
  • alternatively, as shown in FIGS. 14A and 14B, the combination processing unit 45 can generate the combined image data by shifting both sets of image data and then combining the odd-numbered image data with the even-numbered image data.
  • FIG. 14B shows the case where the two image data DI(O_k) and DI(E_k) corresponding to adjacent line sensors are each shifted by dyb/2 in mutually opposite directions along the sub-scanning direction (Y direction) about the line Y_m read out as the reference line by the low-frequency removal unit 42, so that together they realize the shift of the shift amount data dyb.
  • the combination processing unit 45 calculates the read position RP of the image data based on the shift amount data dyb, reads the image data M45 corresponding to the read position from the image memory 41, and generates the image data D45 by combining the overlapping data of the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n and the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n.
  • because the shift amount data dyb is obtained from the similarity (correlation data D43) based on the low-frequency-removed image data, the influence of the level difference caused by performance variations and differing reading conditions of the individual line sensors is suppressed, and dyb is obtained, without erroneous detection, as accurate data indicating the positional deviation in the sub-scanning direction; when the combination processing unit 45 calculates the read position RP of the image data based on the shift amount data dyb, it can therefore determine the read position corresponding to each line accurately.
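  • a minimal sketch of the FIG. 13 variant for one pair of adjacent sensor images follows; averaging the duplicated overlap columns is an assumed blending rule rather than something stated in the description, and the wrap-around of np.roll at the image borders is ignored for brevity:

```python
import numpy as np

def combine_pair(di_o: np.ndarray, di_e: np.ndarray, dyb: int,
                 overlap: int) -> np.ndarray:
    """Keep the odd-numbered image DI(Ok) fixed, shift the even-numbered
    image DI(Ek) by the shift amount data dyb in the sub-scanning direction
    (axis 0), then join the columns. Assumes overlap > 0 duplicated columns.
    """
    di_e_shifted = np.roll(di_e, dyb, axis=0)          # shift in sub-scanning direction
    left = di_o[:, :-overlap].astype(float)            # odd-only columns
    right = di_e_shifted[:, overlap:].astype(float)    # even-only columns
    # Duplicated columns: blend the two readings of the same original region.
    blend = (di_o[:, -overlap:].astype(float)
             + di_e_shifted[:, :overlap].astype(float)) / 2.0
    return np.hstack([left, blend, right])
```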
  • as described above, for the image data rM in the overlap region, the low-frequency removal unit 42 generates the low-frequency-removed image data hM by removing the low-frequency components from the image data and, taking a certain position Y_m in the sub-scanning direction (Y direction) as the reference line, obtains the low-frequency-removed image data of the overlap regions corresponding to the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n and the low-frequency-removed image data of the overlap regions corresponding to the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n.
  • the similarity calculation unit 43 compares the low-frequency removal reference data MO with the low-frequency removal comparison data ME and calculates the similarity data (correlation data) D43; the shift amount estimation unit 44 calculates, as the shift amount data dyb, the shift corresponding to the position in the sub-scanning direction, relative to the detection reference line Y_m, of the low-frequency removal comparison data with the highest similarity (the highest correlation); and the combination processing unit 45 reads the image data based on the shift amount data dyb and outputs the combined image data D45.
  • the image processing unit 4 sequentially repeats this series of processes.
  • FIG. 15 is a diagram illustrating the composite image obtained by combining the image of FIG. 4C, the image of FIG. 5C, the image of FIG. 8, or the image of FIG. 9; by correcting the positional deviation in the sub-scanning direction (Y direction), composite image data of the same image as the original 60 shown in FIGS. 4B and 5B can be generated.
  • for example, in the case shown in FIG. 13, the image data of the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n are shifted in the sub-scanning direction (Y direction), so that image data of the same image as the document 60 shown in FIGS. 4B and 5B can be output.
  • FIG. 15 is also an explanatory diagram conceptually showing the combined image data D45 output from the combination processing unit 45; since the read position is obtained from the shift amount data dyb, which is determined accurately and without erroneous detection, and the image data D45 is generated using the image data of the position shifted by the value of the shift amount data dyb and of the lines around that position, the positional deviation in the sub-scanning direction between the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n and the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n is corrected and the images of the overlap regions that were read in duplicate are combined.
  • FIGS. 16A and 16B are diagrams illustrating an example in which the position of the document 60 with respect to the glass surface 26 changes during conveyance: at the position Y_m in the sub-scanning direction (Y direction) the document 60 is lifted off the glass surface 26, while at the position Y_u the document 60 is in close contact with the glass surface 26.
  • the level difference (level fluctuation) caused by the performance variations and differing reading conditions of the individual line sensors can vary with the position in the sub-scanning direction even within the image data of a single line sensor. Since the image data captured by the imaging unit 2 is processed sequentially at each position in the sub-scanning direction (Y direction), the image processing unit 4 can calculate the positional deviation amount accurately and combine the images correctly even if the position of the document 60 changes during conveyance or the level value fluctuates.
  • moreover, as explained with reference to FIGS. 8 to 10, the deviation amounts are calculated individually between all adjacent line sensors of the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n and the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n; the images can therefore be combined correctly even if the position of the document 60 relative to the glass surface 26 changes along the main scanning direction (X direction).
  • as described above, the low-frequency removal unit 42 removes the low-frequency components from the image data rM in the overlap region to generate the low-frequency-removed image data hM and, taking the position Y_m in the sub-scanning direction (Y direction) as the reference line, outputs the low-frequency removal reference data MO and the low-frequency removal comparison data ME of the predetermined number of lines from the low-frequency-removed image data; the similarity calculation unit 43 compares the low-frequency removal reference data MO with the low-frequency removal comparison data ME and calculates a plurality of similarities (correlation data) D43; the shift amount estimation unit 44 calculates the shift amount data dyb from the position in the sub-scanning direction of the low-frequency removal comparison data with the highest similarity among the plurality of similarities; and the combination processing unit 45 shifts the divided images (at least one of the odd-numbered images and the even-numbered images) to the appropriate positions in the sub-scanning direction based on the shift amount data dyb and generates image data in which the adjacent divided images are combined.
  • with the image reading device 1, the image processing unit (image processing device) 4, and the image processing method according to the first embodiment, the low-frequency removal reference data MO and as many sets of the low-frequency removal comparison data ME as there are lines in the search range "-y to +y" can be obtained from the image data in the overlap region as image data in which the influence of the level difference caused by the differences in performance and reading conditions of the individual line sensors is suppressed.
  • therefore, with the image reading device 1, the image processing unit (image processing device) 4, and the image processing method according to the first embodiment, the shift amount data dyb indicating the positional deviation in the sub-scanning direction can be calculated accurately and without erroneous detection, the misalignment in the sub-scanning direction between the odd-numbered image data and the even-numbered image data can be corrected using the shift amount data dyb, and combined image data can be generated; high-quality composite image data corresponding to the read object can thus be generated.
  • furthermore, with the image reading device 1, the image processing unit (image processing device) 4, and the image processing method according to the first embodiment, the extraction range of the low-frequency components in the low-frequency removal unit 42 is set to a plurality of lines arranged in the sub-scanning direction, and the low-frequency removal reference data MO and the low-frequency removal comparison data ME are generated with the low-frequency component in the sub-scanning direction removed; the hardware configuration can therefore be simplified, the arithmetic processing facilitated, and the arithmetic scale reduced without lowering the accuracy of the extraction of the low-frequency components.
  • in the above description, the low-frequency removal reference data MO, which is low-frequency-removed image data, corresponds to the image data of the overlap region of the digital data DI(O_k) corresponding to the line sensor 21O_k, and the low-frequency removal comparison data ME is the low-frequency-removed image data corresponding to the image data of the overlap region of the digital data DI(E_k) corresponding to the line sensor 21E_k; however, the present invention is not limited to this.
  • the image processing unit 4 may instead make the low-frequency removal reference data MO correspond to the image data of the overlap region of the digital data DI(E_k) corresponding to the line sensor 21E_k, make the low-frequency removal comparison data ME correspond to the image data of the overlap region of the digital data DI(O_k) corresponding to the line sensor 21O_k, and calculate the correlation data D43 from the low-frequency removal reference data MO and the low-frequency removal comparison data ME thus obtained.
  • in other words, the image processing unit 4 may use the overlap region d_r of the image data DI(O_k) or DI(E_k) of a line sensor 21O or 21E as the low-frequency removal reference data, use the overlap region d_l of the adjacent line sensor as the low-frequency removal comparison data, and remove the low-frequency components from the image data of the overlap regions to generate the low-frequency-removed image data hM. As long as the low-frequency removal reference data MO and the low-frequency removal comparison data ME having the predetermined number of lines can be obtained from the image data of the overlap region that is read in duplicate by adjacent line sensors, the same effects as in the first embodiment described above are obtained.
  • Embodiment 2. In the image reading device 1, the image processing unit (image processing device) 4, and the image processing method according to the first embodiment, the low-frequency removal units 42 and 42b in the image processing unit 4, as shown in FIGS. 6 and 11, generate the low-frequency-removed image data hM by having the low-frequency component extraction unit 422 extract the low-frequency component adc of the image data in the vicinity of the target pixel P0 and having the subtraction unit 423 subtract the low-frequency component adc from the image data.
  • the image reading apparatus, the image processing apparatus, and the image processing method according to the second embodiment instead employ the low-frequency removal unit 42c shown in FIG. 17 in place of the low-frequency removal units 42 and 42b of the first embodiment; in all other respects, the second embodiment is the same as the first embodiment. Therefore, FIG. 1 is also referred to in the description of the second embodiment.
  • FIG. 17 is a block diagram illustrating the configuration of the low-frequency removal unit 42c used in place of the low-frequency removal unit 42 in the image processing unit 4 of the image reading apparatus 1 according to the second embodiment; components that are the same as or correspond to those described with reference to FIGS. 1 and 6 are denoted by the same reference numerals.
  • the low-frequency removal unit 42c in the image processing unit 4 of the image reading apparatus 1 according to the second embodiment includes an overlap low-frequency removal unit 421c, and the overlap low-frequency removal unit 421c is provided with a high-frequency component extraction unit 426 that removes the low-frequency components and extracts the high-frequency components.
  • the other configurations and operations, including those of the overlap MO image memory 424 and the overlap ME image memory 425, are the same as those described in the first embodiment, and a detailed description thereof is omitted.
  • the image data rM of the overlap region, comprising the reference image data rMO and the comparison image data rME, is input to the overlap low-frequency removal unit 421c, which removes the low-frequency component of the image from the image data and generates the low-frequency-removed image data hM; this role is the same as that of the overlap low-frequency removal unit 421 of the low-frequency removal unit 42 in the first embodiment.
  • the image data rM of the overlap region input to the overlap low-frequency removal unit 421c is passed to the high-frequency component extraction unit 426.
  • for the target pixel P0 in the image data rM of the overlap region, the high-frequency component extraction unit 426 extracts the high-frequency component in the sub-scanning direction from the image data in the vicinity of the target pixel P0 (or of the corresponding position) within the range of 9 lines in the sub-scanning direction centered on the target pixel P0 (the low-frequency component extraction range b2 in FIG. 7).
  • the extraction of the high-frequency component of the image data in the high-frequency component extraction unit 426 may be performed on the pixels located in the same range as the low-frequency component extraction range b2 shown in FIG. 7 of the first embodiment, using a band-pass filter (BPF), a high-pass filter (HPF), or the like; BPF processing in the sub-scanning direction is applied to the pixels across the lines, and the high-frequency component in the sub-scanning direction is extracted.
  • as the low-frequency components to be removed by the BPF processing, it suffices to set filter coefficients that block the gently varying low-frequency component whose period spans several lines (for example, 1/9 of the line frequency in FIG. 7) and pass the frequency components other than that low-frequency component.
  • the high-frequency component extracted by the high-frequency component extraction unit 426 may also be multiplied by a predetermined coefficient for conversion such as gain adjustment.
  • the extraction range of the high-frequency component is not limited to the 9-line low-frequency component extraction range b2 in the sub-scanning direction shown in FIG. 7; it suffices to extract the changes in the values of the image data acquired by each line sensor that lead to a level difference between the line sensors, that is, the high-frequency component excluding the low-frequency components that change relatively slowly, such as changes in the sub-scanning direction with periods of several lines. The high-frequency component can also be obtained in the two dimensions of the sub-scanning direction and the main scanning direction.
  • in this way, the accuracy of the low-frequency component removal is not reduced, the hardware configuration can be simplified, the arithmetic processing can be facilitated, and the calculation scale can be kept small.
  • the high-frequency component extracted by the high-frequency component extraction unit 426 is output from the overlap low-frequency removal unit 421c as the low-frequency-removed image data hM, that is, as the image data rM with the low-frequency component of the neighboring image removed. Since the low-frequency-removed image data hM is image data obtained by extracting the high-frequency component, it is signed image data whose values carry the change information of the patterns and edges; hM may be handled as a signed value or may be converted into a positive value by a predetermined offset value.
  • when the result is converted into an absolute value, it indicates only the presence or absence of contour information such as edges; in this case, therefore, the sign value (indicating the direction of change) is held together with the low-frequency-removed image data hM.
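  • a minimal sketch of such a band-pass extraction along the sub-scanning direction follows, realized here as the difference of two moving averages; the kernel lengths are assumptions, and an actual device would choose its own filter coefficients:

```python
import numpy as np

def extract_high_frequency(rM: np.ndarray, extraction_lines: int = 9) -> np.ndarray:
    """High-frequency component extraction in the manner of unit 426:
    a simple band-pass along the sub-scanning direction (axis 0), built as
    the difference between a short and a long moving average. The result
    plays the role of the low-frequency-removed image data hM.
    """
    short_k = np.ones(3) / 3.0                              # keeps mid/high frequencies
    long_k = np.ones(extraction_lines) / extraction_lines   # keeps low frequencies only
    def bpf(col: np.ndarray) -> np.ndarray:
        return (np.convolve(col, short_k, mode="same")
                - np.convolve(col, long_k, mode="same"))
    return np.apply_along_axis(bpf, 0, rM.astype(float))
```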
  • the image data rM of the overlap region input to the overlap low-frequency removal unit 421c includes the reference image data rMO and the comparison image data rME of the image data DI in the overlap region; the low-frequency-removed image data hM obtained by the high-frequency component extraction unit 426 therefore likewise comprises the low-frequency removal reference data hMO, obtained by removing the low-frequency components from the image data rMO serving as the reference data, and the low-frequency removal comparison data hME, obtained from the image data rME serving as the comparison data.
  • the low-frequency removal reference data hMO is stored in the overlap MO image memory 424, and the low-frequency removal comparison data hME is stored in the overlap ME image memory 425.
  • the low-frequency removal reference data MO and the low-frequency removal comparison data ME having the predetermined number of lines, from which the low-frequency components have been removed, are then read out and output as the outputs of the low-frequency removal unit 42c. Except for the points described above, the low-frequency removal unit 42c is the same as the unit shown in FIG. 6 of the first embodiment.
  • the image processing unit 4 may use the low-frequency removal unit 42d of FIG. 18 instead of the low-frequency removal unit 42c shown in FIG. 17.
  • in this case, the image data rM of the overlap region in the image memory 41 is read out separately as the image data rMO serving as the reference data and the image data rME serving as the comparison data, and processing equivalent to that of the overlap low-frequency removal unit 421c in FIG. 17 is performed on each set of image data to obtain, by extracting the high-frequency components, the low-frequency removal reference data hMO and the low-frequency removal comparison data hME.
  • in FIG. 18, components that are the same as or correspond to the components shown in FIG. 17 are given the same reference numerals as in FIG. 17.
  • the low-frequency removal unit 42d shown in FIG. 18 differs from the low-frequency removal unit 42c of FIG. 17, which includes the single overlap low-frequency removal unit 421c, in that it includes the two overlap low-frequency removal units 421c1 and 421c2.
  • the reference image data rMO and the comparison image data rME of the image data rM in the overlap region are input to the overlap low-frequency removal units 421c1 and 421c2 shown in FIG. 18, respectively.
  • for the reference image data rMO in the image data rM of the overlap region, the overlap low-frequency removal unit 421c1 removes the low-frequency component of the image from the image data and outputs the low-frequency removal reference data hMO.
  • for the comparison image data rME in the image data rM of the overlap region, the overlap low-frequency removal unit 421c2 removes the low-frequency component of the image from the image data and outputs the low-frequency removal comparison data hME.
  • the low-frequency removal reference data hMO is stored in the overlap MO image memory 424, and the low-frequency removal comparison data hME is stored in the overlap ME image memory 425; the low-frequency removal reference data MO and the low-frequency removal comparison data ME having the predetermined number of lines, from which the low-frequency components have been removed, are then read out and output as the outputs of the low-frequency removal unit 42d. Except for the points described above, the low-frequency removal unit 42d shown in FIG. 18 is the same as the low-frequency removal unit 42c shown in FIG. 17.
  • as described above, in the image reading device 1, the image processing unit (image processing device) 4, and the image processing method according to the second embodiment, the overlap low-frequency removal unit 421c extracts the high-frequency components from the image data rM in the overlap region to generate the low-frequency-removed image data hM (the low-frequency removal reference data hMO and the low-frequency removal comparison data hME), and the low-frequency removal reference data MO and the low-frequency removal comparison data ME are output for the search range of the positional deviation amount dY.
  • since the output data are the low-frequency removal reference data MO and the low-frequency removal comparison data ME from which the low-frequency component in the sub-scanning direction has been removed, the hardware configuration can be simplified and the arithmetic processing facilitated without lowering the accuracy of the extraction of the low-frequency components, and the arithmetic scale can be reduced.
  • furthermore, the low-frequency removal reference data MO and as many sets of the low-frequency removal comparison data ME as there are lines in the search range "-y to +y" can be obtained from the image data in the overlap region as image data in which the influence of the level difference caused by the differences in performance and reading conditions of the individual line sensors is suppressed. Therefore, with the image reading device 1, the image processing unit (image processing device) 4, and the image processing method using the low-frequency removal unit 42c according to the second embodiment, the shift amount data dyb indicating the positional deviation in the sub-scanning direction can be calculated accurately and without erroneous detection, the shift amount data dyb thus obtained can be used as the value indicating the positional deviation amount, and combined image data can be generated; high-quality composite image data corresponding to the read object can thus be generated.
  • Embodiment 3. A part of the functions of the image reading apparatus 1 according to the first and second embodiments may be realized by a hardware configuration, or may be realized by a computer program executed by a microprocessor including a CPU (central processing unit). When a part of the functions of the image reading apparatus 1 is realized by a computer program, the microprocessor can realize that part of the functions by loading and executing the computer program from a computer-readable storage medium or via communication such as the Internet.
  • FIG. 19 is a functional block diagram showing the configuration used when a part of the functions of the image reading apparatus 1b according to the third embodiment is realized by a computer program.
  • the image reading device 1b includes an imaging unit 2, an A/D conversion unit 3, and an arithmetic device 5.
  • the arithmetic device 5 includes a processor 51 including a CPU, a RAM (Random Access Memory) 52, a nonvolatile memory 53, a large-capacity storage medium 54, and a bus 55.
  • as the nonvolatile memory 53, for example, a flash memory can be used; as the large-capacity storage medium 54, for example, a hard disk (magnetic disk), an optical disc, or a semiconductor storage device can be used.
  • the A/D conversion unit 3 has the same function as the A/D conversion unit 3 in FIG. 1.
  • the electrical signal SI output from the imaging unit 2 is converted into digital data by the A/D conversion unit 3 and stored in the RAM 52 via the processor 51.
  • the processor 51 realizes the functions of the image processing unit 4 shown in FIG. 1 by loading and executing a computer program from the nonvolatile memory 53 or the large-capacity storage medium 54.
  • FIG. 20 is a flowchart schematically showing an example of the processing performed by the arithmetic device 5 according to the third embodiment.
  • the processor 51 first reads the data of the overlap regions of the image data and executes an overlap-region low-frequency removal process that removes the low-frequency component of the image in the pixel range from the image data and extracts the low-frequency removal reference data MO and the low-frequency removal comparison data ME of the predetermined number of lines (step S1).
  • next, the processor 51 executes a similarity calculation process that compares the low-frequency removal reference data MO, which is low-frequency-removed image data, with the low-frequency removal comparison data ME (step S2).
  • the processor 51 then executes a shift amount estimation process to obtain the shift amount data (step S3) and executes a combination process (step S4). Note that the processing in steps S1 to S4 by the arithmetic device 5 is the same as the processing performed by the image processing unit 4 in the first embodiment.
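  • a minimal end-to-end sketch of steps S1 to S4 for one overlap region follows, reusing the helper functions sketched in the earlier embodiments (remove_low_frequency, extract_mo_me, correlation_data, estimate_shift); all names are illustrative, and the sign convention of the applied shift depends on how dY is defined:

```python
import numpy as np

def process_overlap(rM_O: np.ndarray, rM_E: np.ndarray,
                    Ym: int, y: int, bh: int):
    """Steps S1-S4 of FIG. 20 for one pair of duplicated overlap regions.
    S4 only applies the estimated shift here; the column joining of
    FIG. 13 would follow."""
    hMO = remove_low_frequency(rM_O)              # S1, reference side
    hME = remove_low_frequency(rM_E)              # S1, comparison side
    MO, ME = extract_mo_me(hMO, hME, Ym, y, bh)   # blocks over the search range
    d43 = correlation_data(MO, ME)                # S2, similarity calculation
    dyb = estimate_shift(d43)                     # S3, shift amount estimation
    return np.roll(rM_E, dyb, axis=0), dyb        # S4, shifted comparison image
```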
  • as described above, in the image reading apparatus 1b according to the third embodiment, the low-frequency removal reference data MO and the low-frequency removal comparison data ME having the predetermined number of lines are obtained by removing the low-frequency components from the image data, the similarities are calculated, the shift amount data is calculated from the position in the sub-scanning direction of the low-frequency removal comparison data with the highest similarity among the calculated similarities, and the positions of the divided images in the sub-scanning direction are shifted based on the shift amount data dyb to generate the combined image data.
  • through the processing of the image data in the overlap regions, the low-frequency removal reference data MO and as many sets of the low-frequency removal comparison data ME as there are lines in the search range "-y to +y" can be obtained as image data in which the influence of the level difference caused by the differences in performance and reading conditions of the individual line sensors is suppressed.
  • Embodiment 4. In the first embodiment, the case has been described in which the optical axes 27O of the line sensors 21O_1, ..., 21O_k, ..., 21O_n located at the odd-numbered positions from one end (for example, the left end) in the longitudinal direction of the imaging unit 2 and the optical axes 27E of the line sensors 21E_1, ..., 21E_k, ..., 21E_n located at the even-numbered positions intersect.
  • the image reading apparatus according to the fourth embodiment is substantially the same as the image reading apparatus 1 according to the first embodiment, except that the optical axis 28O of the line sensors at the odd-numbered positions, counted from one end in the longitudinal direction of the imaging unit 2 (for example, the near side in the direction perpendicular to the paper surface on which FIGS. 21A and 21B are drawn), and the optical axis 28E of the line sensors at the even-numbered positions are parallel to each other. Therefore, FIG. 1 is also referred to in the description of the fourth embodiment.
  • FIG. 21A shows the case where the document 60 is in close contact with the glass surface 26, which is the document table mounting surface, and FIG. 21B shows the case where the document 60 is slightly lifted away from the glass surface 26.
  • the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n acquire the image of the same position with a temporal delay relative to the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n; since the optical axis 28O and the optical axis 28E are parallel, the positional deviation in the sub-scanning direction between the images read in duplicate can be regarded as a substantially constant amount YL. However, a slight positional deviation may be added to the substantially constant amount YL.
  • this minute positional deviation bt includes a deviation of the optical axis 28O or the optical axis 28E caused by an attachment error of the line sensors or the like and a deviation caused by temporal fluctuation of the conveyance speed (that is, speed fluctuation).
  • furthermore, as in the first embodiment, there may be a level difference in the image data read by each line sensor (a difference in brightness and a difference in pixel data) between the image data DI(O_k) generated by the line sensors 21O and the image data DI(E_k) generated by the line sensors 21E.
  • FIG. 22 shows the image data DI(O_k) and DI(O_k+1) corresponding to the odd-numbered line sensors 21O_k and 21O_k+1 and the image data DI(E_k) and DI(E_k+1) corresponding to the even-numbered line sensors 21E_k and 21E_k+1 when the document is read; the positions (lines) in the sub-scanning direction are shifted by the amount obtained by adding the minute positional deviation bt to the substantially constant amount YL (that is, by YL + bt).
  • when writing to the image memory 41, the image processing unit 4 shifts one or both of the images read by the odd-numbered line sensors 21O_1, ..., 21O_k, ..., 21O_n and the images read by the even-numbered line sensors 21E_1, ..., 21E_k, ..., 21E_n by the substantially constant amount YL in the sub-scanning direction. Accordingly, as shown in FIG. 23, the image data stored in the image memory 41 remains shifted in position (line) in the sub-scanning direction by bt.
  • the image processing unit 4 then obtains the minute positional deviation bt in the sub-scanning direction and performs the following processing on the image data in the overlap regions in order to combine the images: it removes the low-frequency components from the image data in the overlap regions to obtain the low-frequency removal reference data MO and the low-frequency removal comparison data ME having the predetermined number of lines, calculates the similarities by comparing the low-frequency removal reference data with the low-frequency removal comparison data, calculates the shift amount data from the position in the sub-scanning direction of the low-frequency removal comparison data with the highest similarity among the calculated similarities, reads the image data from the image memory 41 based on the calculated shift amount data, and combines the read image data.
  • the configuration for performing this processing and the processing operation itself are the same as those in the first embodiment.
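  • under the assumptions of the sketches above, the two-stage alignment of this embodiment could look as follows; the constant YL would come from the sensor geometry, process_overlap is the pipeline sketched in embodiment 3, and all names remain illustrative:

```python
import numpy as np

def align_parallel_axes(di_o: np.ndarray, di_e: np.ndarray, YL: int,
                        Ym: int, y: int, bh: int):
    """With parallel optical axes 28O/28E, first remove the known constant
    offset YL (as done when writing to the image memory 41), then estimate
    the remaining minute deviation bt with the same overlap-region pipeline
    as in embodiment 1."""
    di_e_pre = np.roll(di_e, -YL, axis=0)                     # remove constant offset YL
    aligned, bt = process_overlap(di_o, di_e_pre, Ym, y, bh)  # residual deviation bt
    return aligned, YL + bt                                   # total sub-scanning shift
```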
  • as described above, in the image reading apparatus according to the fourth embodiment, even when there is a temporal fluctuation (that is, a speed fluctuation) in the conveyance speed of the conveyor that moves one or both of the document 60 and the imaging unit 2 in the sub-scanning direction during imaging by the imaging unit 2, the processing of the image data in the overlap regions can be performed as in the first embodiment.
  • that is, in the image reading apparatus according to the fourth embodiment, the low-frequency removal reference data MO and the low-frequency removal comparison data ME, which are image data in which the influence of the level difference caused by the differences in performance and reading conditions of the individual line sensors is suppressed, can be obtained; the shift amount data dyb indicating the positional deviation in the sub-scanning direction can be calculated accurately and without erroneous detection; the shift amount data dyb thus obtained can be used as the value indicating the positional deviation amount; and the image data can be read from the image memory 41 based on the calculated shift amount data and combined to generate combined image data. Therefore, according to the fourth embodiment, image data can be combined accurately and high-quality composite image data corresponding to the read object can be generated.
  • the present invention can be applied to image reading apparatuses such as copying machines, scanners, and facsimiles.
  • the program to which the present invention is applied can be applied to an information processing apparatus, such as a personal computer, that can communicate with an image reading apparatus such as a copying machine, a scanner, or a facsimile.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Input (AREA)
  • Facsimile Scanning Arrangements (AREA)
  • Facsimile Heads (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

An image processing unit (4) of an image reading device (1) includes: a low-frequency removal unit (42) that removes low-frequency components from image data in a pixel range comprising a target pixel and its neighboring pixels, to obtain low-frequency-removed reference data and low-frequency-removed comparison data; a similarity calculation unit (43) that calculates a plurality of similarities between the low-frequency-removed reference data and a plurality of pieces of low-frequency-removed comparison data; a shift amount estimation unit (44) that calculates shift amount data on the basis of the offset from the position in the sub-scanning direction of the low-frequency-removed comparison data having the highest similarity; and a combination processing unit (45) that determines the position of image data in the sub-scanning direction on the basis of the calculated shift amount data and combines image data read by adjacent line sensors on mutually different lines, thereby generating composite image data.
PCT/JP2015/063877 2014-09-03 2015-05-14 Image processing device, image processing method, image reading device, and program WO2016035387A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016546342A JP6246379B2 (ja) Image processing device, image processing method, image reading device, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014178821 2014-09-03
JP2014-178821 2014-09-03

Publications (1)

Publication Number Publication Date
WO2016035387A1 true WO2016035387A1 (fr) 2016-03-10

Family

ID=55439461

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/063877 WO2016035387A1 (fr) Image processing device, image processing method, image reading device, and program

Country Status (2)

Country Link
JP (1) JP6246379B2 (fr)
WO (1) WO2016035387A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10806600B2 (en) 2015-02-03 2020-10-20 Rcm Enterprise Llc Bio-mechanical prosthetic finger with H-shaped rocker
US10842652B2 (en) 2015-05-15 2020-11-24 RCM Enterprise, LLC Bidirectional biomechanical prosthetic full finger configured for abduction and adduction with MCP pivot and multiple-finger ring
US11173052B2 (en) 2015-02-03 2021-11-16 Rcm Enterprise Llc Bio-mechanical prosthetic finger with y-shaped rocker

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000134537A * 1998-10-28 2000-05-12 Ricoh Co Ltd Image input device and method
JP2004139219A * 2002-10-16 2004-05-13 Seiko Instruments Inc Image processing method and image processing apparatus
JP2013110582A * 2011-11-21 2013-06-06 Mitsubishi Electric Corp Image reading device, image reading method, and MTF correction parameter determination method
JP2014022806A * 2012-07-13 2014-02-03 Sharp Corp Imaging device and imaging device control method


Also Published As

Publication number Publication date
JP6246379B2 (ja) 2017-12-13
JPWO2016035387A1 (ja) 2017-04-27

Similar Documents

Publication Publication Date Title
US10326908B2 (en) Image reading apparatus and image reading method
JP6115781B2 (ja) Image processing apparatus and image processing method
US9367902B2 (en) Image processing device, endoscope apparatus, isolated point noise correction method, and information storage device
US8963910B2 (en) Pixel information management apparatus and image capture apparatus using the same
US8320715B2 (en) Device and method for interpolating image, and image scanner
JP6246379B2 (ja) Image processing device, image processing method, image reading device, and program
US9398173B2 (en) Two-dimensional calibration of image sensor alignment in optical scanners
JP6422428B2 (ja) Image processing device, image processing method, image reading device, and program
JP5821563B2 (ja) Image reading device, image reading method, and MTF correction parameter determination method
JP5224976B2 (ja) Image correction device, image correction method, program, and recording medium
EP3439286B1 (fr) Calibration of pixels for producing super resolution images
JP6058115B2 (ja) Image processing device, image processing method, image reading device, and program
KR20160049371A (ko) Image generating apparatus and image generating method
JP2011199681A (ja) Image processing apparatus and image reading apparatus
JP2013135246A (ja) Document floating amount detection method for an image reading device, and image processing method and image reading device using the same
JP2013005176A (ja) Image reading device, image forming device, and method for controlling a line image sensor
JP2009224829A (ja) Image reading apparatus
JP2005175909A (ja) Signal processing method and image acquisition device
JP5590911B2 (ja) Image reading apparatus and method
CN110521197B (zh) Image reading device
JP2009141583A (ja) Imaging device
JP2017158182A (ja) Image reading device
JP7141973B2 (ja) Image reading device, image processing system, control method, and control program
JP4568610B2 (ja) Signal correction device, signal correction method, and signal correction program
JP5901559B2 (ja) Image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15837942

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016546342

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15837942

Country of ref document: EP

Kind code of ref document: A1