WO2016163133A1 - Image combining device, image reading device, and image combining method - Google Patents
- Publication number
- WO2016163133A1 (PCT/JP2016/052287)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- images
- combined
- group
- information
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00795—Reading arrangements
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/19—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
- H04N1/1903—Arrangements for enabling electronic abutment of lines or areas independently scanned by different elements of an array or by different arrays
- H04N1/191—Scanning arrangements using multi-element arrays, the array comprising a one-dimensional array, or a combination of one-dimensional arrays, or a substantially one-dimensional array, e.g. an array of staggered elements
- H04N1/192—Simultaneously or substantially simultaneously scanning picture elements on one main scanning line
- H04N1/193—Simultaneously or substantially simultaneously scanning picture elements on one main scanning line using electrically scanned linear arrays, e.g. linear CCD arrays
- H04N1/1932—Simultaneously or substantially simultaneously scanning picture elements on one main scanning line using an array of elements displaced from one another in the sub scan direction, e.g. a diagonally arranged array
- H04N1/1933—Staggered element arrays, e.g. arrays with elements arranged in a zigzag
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3876—Recombination of partial images to recreate the original image
Description
- The present invention relates to an image combining device, an image reading device, and an image combining method.
- Image reading apparatuses are in use that read a paper document with a line sensor group composed of a plurality of short line sensors arranged in a zigzag pattern along their longitudinal direction.
- The longitudinal direction (arrangement direction) of the line sensors is called the main scanning direction, and the direction orthogonal to it is called the sub-scanning direction.
- Within the line sensor group, the detection ranges of line sensors adjacent in the main scanning direction partially overlap when viewed from the sub-scanning direction.
- The line sensor group and the paper document are scanned relative to each other in the sub-scanning direction, and each line sensor reads a part of the paper document. The images read by the individual line sensors are combined at their overlapping portions and finally merged into one piece of image data.
- The present invention has been made in view of the above circumstances, and its object is to provide an image combining device, an image reading device, and an image combining method capable of combining image sequences at high speed.
- To achieve the above object, an image combining apparatus according to the present invention handles a plurality of images each having overlapping portions, that is, portions that overlap with other images, ordered into an image sequence so that images whose overlapping portions correspond to the same portion of the subject are adjacent to each other.
- The image sequence is grouped into groups of a plurality of consecutive images. The apparatus includes a plurality of image processing units, provided one per group, that perform in parallel a process of detecting, by image matching, combining position information for joining a first image in the group to a second image adjacent to it in the order of the image sequence without positional deviation, and a combined information transfer path.
- The combined information transfer path transfers, between the image processing units, the combining position information between a first image and a second image that belongs to a different group from the first image.
- Based on the combining position information each image processing unit detects itself and the combining position information transferred via the combined information transfer path, each unit combines the plurality of images belonging to its group without positional deviation, in a state with no positional deviation from the images belonging to the other groups.
- According to the present invention, a plurality of images having overlapping portions, that is, portions overlapping with other images, are ordered by arranging them so that images whose overlapping portions correspond to the same portion of the subject are adjacent to each other.
- The image sequence is divided into groups of a plurality of consecutive images, and the combining position information between each first image in a group and the second image adjacent to it in the image sequence is detected in parallel in every group, so the time needed to detect the combining position information of these images can be shortened.
- The combining position information between a first image and a second image that is adjacent to it but belongs to a different group is transmitted and received between the image processing units via the combined information transfer path.
- As a result, each image processing unit can combine the images belonging to its group, and place them relative to the images belonging to the other groups, without any positional deviation.
- Consequently, the image sequences can be combined at high speed.
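The parallel scheme described above can be sketched in Python (all names and the stubbed offset detection are illustrative assumptions, not taken from the patent): each group's image processing unit detects the joint offsets for its own image sequence concurrently, including the joint to the head of the next group's first image, and those cross-group offsets are what the transfer path hands between units.

```python
# Illustrative sketch: per-group units detect joint offsets in parallel.
from concurrent.futures import ThreadPoolExecutor

def detect_joints(group_images, boundary_image):
    """Detect joint offsets inside one group, plus the joint to the head
    of the next group's first image (boundary_image), if any.
    'Detection' is stubbed as a fixed offset for illustration."""
    seq = group_images + ([boundary_image] if boundary_image else [])
    # one (dx, dy) offset per adjacent pair in this unit's sequence
    return [(0, 1) for _ in range(len(seq) - 1)]

groups = [["Pa1", "Pa2", "Pa3", "Pa4"],
          ["Pb1", "Pb2", "Pb3", "Pb4"],
          ["Pc1", "Pc2", "Pc3", "Pc4"]]
# boundary image = head of the next group's first image (none for last group)
boundaries = ["Pb1-head", "Pc1-head", None]

with ThreadPoolExecutor() as pool:          # the units run in parallel
    joints = list(pool.map(detect_joints, groups, boundaries))

# The last offset of groups 0 and 1 is the cross-group joint info that
# would be transferred between units over the combined info transfer path.
print([len(j) for j in joints])  # [4, 4, 3]
```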
- FIG. 1 is a block diagram illustrating the configuration of an image reading apparatus according to an embodiment of the present invention. FIG. 2 shows the arrangement of the line sensors in the image reading apparatus.
- FIG. 3 shows an example of a paper document read by the image reading apparatus. FIG. 4 shows how the line sensor group and the image of the paper document are scanned relative to each other. FIG. 5 shows an example of the images captured by the line sensors. FIG. 6 is a block diagram showing the configuration of an A/D conversion unit. FIG. 7 shows the configuration of the memory control unit in the image combining device according to an embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of an image reading apparatus according to an embodiment of the present invention.
- the image reading apparatus 200 includes a line sensor group 1 formed on a substrate 50.
- the line sensor group 1 includes a plurality of line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4.
- Each of the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 is a sensor in which image pickup elements composed of CCDs (Charge Coupled Devices) or CMOS (Complementary Metal Oxide Semiconductor) devices are arranged in one dimension.
- The line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 output analog signals corresponding to the intensity of the light incident on the image pickup elements arranged along their longitudinal direction.
- The line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 repeatedly output these analog signals at a constant period.
- FIG. 2 is a diagram showing the arrangement of the line sensors in the image reading apparatus according to the embodiment of the present invention. As shown in FIG. 2, the plurality of line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 have the same longitudinal direction and are arranged in a staggered pattern along that direction.
- In FIG. 2, an XYZ orthogonal coordinate system composed of an X axis, a Y axis, and a Z axis is defined.
- The X-axis direction, which is the longitudinal direction of the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4, is the main scanning direction.
- The Y-axis direction is the sub-scanning direction, orthogonal to the main scanning direction.
- The Z-axis direction is the depth direction, orthogonal to both the main scanning direction and the sub-scanning direction.
- FIG. 3 is a diagram illustrating an example of a paper document read by the image reading apparatus.
- An image of the characters "ABCDEFGH" is formed on the paper document 20 along the main scanning direction (X-axis direction).
- In the following, the plurality of line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 are described as capturing the image "ABCDEFGH" of the paper document 20 as the object to be imaged (subject).
- FIG. 4 is a diagram showing how the line sensor group and the image of the paper document are scanned relative to each other.
- Light emitted from an illumination light source and reflected by the paper document 20 reaches the substrate 50 via an imaging optical system, where, as shown in FIG. 4, an image 21 of the paper document 20 is formed.
- The line driving unit 40 of the image reading apparatus 200 scans the line sensor group 1 relative to the image 21 of the paper document 20 at a constant speed in the sub-scanning direction (Y-axis direction), so that the line sensor group 1 crosses the image 21. By this scanning, the "ABCDEFGH" of the paper document 20 is imaged by the line sensor group 1.
- FIG. 5 is a diagram illustrating an example of the images captured by the line sensors.
- Each of the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 reads a part of the image 21 of the paper document 20 as the subject.
- The plurality of images captured by the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 are ordered into an image sequence so that images whose imaging positions on the subject are close in the main scanning direction are adjacent to each other.
- The line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 are staggered, that is, arranged alternately in two rows along the X-axis direction, so the images captured by the two rows of line sensors are shifted from each other in the sub-scanning direction.
- Because the detection ranges of adjacent line sensors in different rows partially overlap when viewed from the sub-scanning direction, parts of the images captured by adjacent line sensors overlap. Therefore, when the portions where the subject overlaps are superimposed on one another, the images captured by the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 form one subject image as a whole, and arranging the images so that overlapping portions corresponding to the same portion of the subject are adjacent forms an image sequence.
- An overlapping portion is a portion of the subject that is captured in a plurality of images and thus appears redundantly in them; seen from one image, the part that overlaps with another image is its overlapping portion.
- The line sensor group 1 is divided into three groups: the first group is the line sensors 1a-1 to 1a-4, the second group is the line sensors 1b-1 to 1b-4, and the third group is the line sensors 1c-1 to 1c-4.
- Correspondingly, the image sequence captured by the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 is grouped into the images captured by the line sensors 1a-1 to 1a-4, the images captured by the line sensors 1b-1 to 1b-4, and the images captured by the line sensors 1c-1 to 1c-4, each group being a run of images whose imaging positions are consecutive.
- The image reading apparatus 200 includes the plurality of line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4, and an image combining device 100 that combines the images captured by these line sensors.
- The image combining device 100 combines the image sequence captured by the line sensor group 1 into one image by superimposing, for adjacent images captured by the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4, the end portions where a predetermined number or more of pixels match.
- The image combining device 100 includes a plurality of A/D conversion units 2, a plurality of image processing units 7 that perform image processing on the digital data signals output from the line sensor group 1 and A/D-converted by the A/D conversion units 2, and image memories 8 that store image data. The image combining device 100 further includes an image data transfer path 9 that transfers image data between the image processing units 7, a combined information transfer path 10 that transfers, between the image processing units 7, the combined information used for combining images, and an output unit 15 that outputs the combined image data.
- An A/D conversion unit 2a, an image processing unit 7a, and an image memory 8a are provided corresponding to the group of line sensors 1a-1 to 1a-4; an A/D conversion unit 2b, an image processing unit 7b, and an image memory 8b corresponding to the group of line sensors 1b-1 to 1b-4; and an A/D conversion unit 2c, an image processing unit 7c, and an image memory 8c corresponding to the group of line sensors 1c-1 to 1c-4.
- Image data transfer paths 9a and 9b are provided as the image data transfer path 9.
- The image data transfer path 9a transfers image data between the image processing units 7a and 7b, and the image data transfer path 9b transfers image data between the image processing units 7b and 7c.
- Combined information transfer paths 10a and 10b are provided as the combined information transfer path 10.
- The combined information transfer path 10a transmits and receives combined information between the image processing units 7a and 7b, and the combined information transfer path 10b transmits and receives combined information between the image processing units 7b and 7c.
- The A/D conversion units 2a, 2b, and 2c, the image processing units 7a, 7b, and 7c, and the image memories 8a, 8b, and 8c can operate independently and in parallel.
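The per-group pairing described above can be summarized as a small data layout (a hypothetical sketch; the identifiers simply mirror the reference numerals in the text): each line-sensor group has its own A/D unit, image processing unit, and image memory, and only neighbouring processors are linked by the image-data and combined-information transfer paths.

```python
# Hypothetical summary of the per-group pipelines and transfer paths.
pipelines = {
    "1a-1..1a-4": ("2a", "7a", "8a"),   # (A/D unit, processor, memory)
    "1b-1..1b-4": ("2b", "7b", "8b"),
    "1c-1..1c-4": ("2c", "7c", "8c"),
}
image_data_paths = {"9a": ("7a", "7b"), "9b": ("7b", "7c")}
combined_info_paths = {"10a": ("7a", "7b"), "10b": ("7b", "7c")}

# Every transfer path connects two distinct, neighbouring processors.
processors = [p for _, p, _ in pipelines.values()]
links = list(image_data_paths.values()) + list(combined_info_paths.values())
for src, dst in links:
    assert abs(processors.index(src) - processors.index(dst)) == 1

print(len(pipelines), len(image_data_paths), len(combined_info_paths))  # 3 2 2
```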
- FIG. 6 is a block diagram showing the configuration of an A/D conversion unit.
- The A/D conversion unit 2b includes four A/D converters 25 and a buffer 26.
- The A/D conversion unit 2b receives the analog signals output from the line sensors 1b-1, 1b-2, 1b-3, and 1b-4.
- The four A/D converters 25 respectively A/D-convert the analog signals input from the line sensors 1b-1, 1b-2, 1b-3, and 1b-4 and output digital data signals.
- The four digital data signals output from the four A/D converters 25 are input to the buffer 26.
- The buffer 26 stores the four digital data signals arranged in a line.
- The buffer 26 outputs the digital data signals in the order of the line sensors 1b-1, 1b-2, 1b-3, 1b-4, that is, in the order in which the sensors are arranged in the main scanning direction.
- The A/D conversion unit 2b outputs the digital data signal output from the buffer 26 to the image processing unit 7b as one-line image data 1b-m.
- The configuration of the A/D conversion units 2a and 2c is the same as that of the A/D conversion unit 2b.
- The A/D conversion unit 2a receives the analog signals output from the line sensors 1a-1, 1a-2, 1a-3, and 1a-4, performs A/D conversion, and outputs a digital data signal 1a-m.
- The A/D conversion unit 2c receives the analog signals output from the line sensors 1c-1, 1c-2, 1c-3, and 1c-4, performs A/D conversion, and outputs a digital data signal 1c-m.
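As a rough illustration of the A/D units' behaviour, the following sketch (the quantization details are assumptions, not stated in the patent) converts four per-sensor analog sample lists to digital values and concatenates them in main-scanning order into one line of image data, as the buffer 26 does for unit 2b:

```python
# Sketch of an A/D unit: four converters, then one ordered buffer output.
def ad_convert(analog, levels=256, vmax=1.0):
    """Quantize analog samples in [0, vmax] to 8-bit digital values."""
    return [min(levels - 1, int(v / vmax * levels)) for v in analog]

def one_line(analog_per_sensor):
    """Concatenate per-sensor digital data in main-scanning order."""
    digital = [ad_convert(a) for a in analog_per_sensor]  # the 4 converters 25
    line = []
    for d in digital:          # buffer 26: sensors 1b-1..1b-4 in order
        line.extend(d)
    return line

# e.g. two pixels per sensor (made-up analog values)
line = one_line([[0.0, 0.5], [1.0, 0.25], [0.5, 0.5], [0.75, 1.0]])
print(line)  # [0, 128, 255, 64, 128, 128, 192, 255]
```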
- Each image processing unit 7 (7a, 7b, 7c) may be realized by a hardware circuit such as a logic integrated circuit (IC), for example an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
- Alternatively, each image processing unit 7 (7a, 7b, 7c) may be realized by hardware such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor) executing a software program stored in a memory, that is, by hardware and a software program working in cooperation.
- Each image processing unit 7 includes a shading correction unit 3 that performs shading correction on the image data input from the A/D conversion unit 2; a memory control unit 4 that controls reading and writing of image data to the image memory 8; a combined information deriving unit 5 that reads the image data stored in the image memory 8 via the memory control unit 4 and derives combined information based on the read image data; and an image combining unit 6 that combines images using the combined information derived by the combined information deriving unit 5.
- Specifically, the image processing unit 7a includes a shading correction unit 3a, a memory control unit 4a, a combined information deriving unit 5a, and an image combining unit 6a; the image processing unit 7b includes a shading correction unit 3b, a memory control unit 4b, a combined information deriving unit 5b, and an image combining unit 6b; and the image processing unit 7c includes a shading correction unit 3c, a memory control unit 4c, a combined information deriving unit 5c, and an image combining unit 6c.
- The shading correction unit 3 (3a, 3b, 3c) performs shading correction on the input digital data signal. Shading correction reduces luminance errors caused by the individual characteristics of the image pickup elements of the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4: the digital data signal is corrected so that a subject of uniform brightness yields an image of uniform average brightness throughout.
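Shading correction of this kind is commonly implemented as a flat-field correction; the following is a minimal sketch under that assumption (the formula, reference data, and function name are not specified in the patent): a per-element gain derived from a white reference and a dark level maps every element to a uniform target brightness.

```python
# Flat-field shading correction sketch (assumed formula, illustrative data).
def shading_correct(raw, white, dark, target=255.0):
    """Normalize each element so its white reference maps to `target`."""
    out = []
    for r, w, d in zip(raw, white, dark):
        if w <= d:                       # dead element: no usable gain
            out.append(0.0)
        else:
            out.append(max(0.0, min(target, (r - d) / (w - d) * target)))
    return out

# Elements with unequal sensitivity viewing a uniformly bright subject:
raw   = [90.0, 120.0, 60.0]
white = [180.0, 240.0, 120.0]   # each element's reading of a white reference
dark  = [0.0, 0.0, 0.0]
print(shading_correct(raw, white, dark))  # [127.5, 127.5, 127.5] — uniform
```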
- FIG. 7 is a diagram showing the configuration of the memory control unit in the image combining device according to the embodiment of the present invention. As shown in FIG. 7, each time the digital data signals 1a-m, 1b-m, and 1c-m are input, the memory control units 4 (4a, 4b, 4c) store them in the image memories 8 (8a, 8b, 8c).
- The memory control units 4a and 4b include temporary image storage memories 11 (11a and 11b).
- The memory control unit 4b transfers the first t pixels (t is a natural number) of the input digital data signal 1b-m, as data 1b-1t, to the temporary image storage memory 11a of the memory control unit 4a via the image data transfer path 9a, where it is stored.
- Likewise, the memory control unit 4c transfers the first t pixels of the input digital data signal 1c-m, as data 1c-1t, to the temporary image storage memory 11b of the memory control unit 4b via the image data transfer path 9b, where it is stored.
- Here t is chosen large enough to cover the width, in the main scanning direction, of the overlapping portion of the detection ranges of adjacent line sensors.
- The memory control unit 4a stores the digital data 1b-1t held in the temporary image storage memory 11a in a form joined to the digital data 1a-m stored in the image memory 8a.
- Similarly, the memory control unit 4b stores the digital data 1c-1t held in the temporary image storage memory 11b in a form joined to the digital data 1b-m stored in the image memory 8b.
- That is, each time the digital data 1a-m is input, the memory control unit 4a sequentially stores in the image memory 8a the digital data obtained by joining the front end of the digital data 1b-1t to the rear end of the digital data 1a-m. Each time the digital data 1b-m is input, the memory control unit 4b sequentially stores in the image memory 8b the digital data obtained by joining the front end of the digital data 1c-1t to the rear end of the digital data 1b-m. Each time the digital data 1c-m is input, the memory control unit 4c sequentially stores the digital data 1c-m in the image memory 8c.
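The handoff just described can be sketched as follows (an illustration; `t`, the line contents, and the function name are made up): the first t pixels of the neighbouring group's line are appended to the group's own line before storage, so that the overlap needed for matching is available locally.

```python
# Sketch of the memory-control handoff between neighbouring groups.
T = 2  # t pixels: wide enough to cover the overlap in the main scanning direction

def store_line(own_line, neighbour_line, t=T):
    """Return the joined line kept in this group's image memory:
    own data with the head of the neighbouring group's line appended."""
    head = neighbour_line[:t]     # e.g. data 1b-1t, via transfer path 9a
    return own_line + head

line_1a_m = [10, 11, 12, 13]      # made-up pixel values for line 1a-m
line_1b_m = [13, 14, 15, 16]      # made-up pixel values for line 1b-m
stored = store_line(line_1a_m, line_1b_m)
print(stored)  # [10, 11, 12, 13, 13, 14]
```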
- FIG. 8 is a diagram showing an example of the image data stored in the image memories in the image combining device according to the embodiment of the present invention.
- The image memory 8a stores an image based on the digital data 1a-m captured by the line sensors 1a-1 to 1a-4 and an image based on the digital data 1b-1t captured by the head portion of the line sensor 1b-1.
- The image memory 8b stores an image based on the digital data 1b-m captured by the line sensors 1b-1 to 1b-4 and an image based on the digital data 1c-1t captured by the head portion of the line sensor 1c-1.
- The image memory 8c stores images based on the digital data 1c-m captured by the line sensors 1c-1 to 1c-4.
- After storing the image data read by the line sensor group 1 in the image memories 8 (8a, 8b, 8c), the memory control units 4 (4a, 4b, 4c) read the stored image data as needed and output it to the combined information deriving units 5 (5a, 5b, 5c) or the image combining units 6 (6a, 6b, 6c).
- The memory control units 4 (4a, 4b, 4c) can also read and output only part of the image data stored in the image memories 8 (8a, 8b, 8c).
- The combined information deriving units 5a, 5b, and 5c derive the information needed to combine, without positional deviation and without luminance difference, the plurality of images stored in the image memory 8a, the plurality of images stored in the image memory 8b, and the plurality of images stored in the image memory 8c.
- As described above, the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 are arranged in a staggered manner in the main scanning direction, and their detection ranges partially overlap when viewed from the sub-scanning direction. For this reason, the images read by the line sensors 1a-1 to 1a-4, 1b-1 to 1b-4, and 1c-1 to 1c-4 are alternately shifted in the sub-scanning direction relative to their adjacent images.
- Images are therefore combined by shifting one of two adjacent images in the sub-scanning direction and superimposing the overlapping regions at their ends in the main scanning direction.
- A region at the ends of two images in the main scanning direction where a predetermined number or more of pixels coincide becomes the overlapping portion, that is, the portion of an image that overlaps with another image.
- FIG. 9 is a diagram showing how the images are combined.
- The images captured by the line sensors 1a-1 to 1a-4 are referred to as images Pa1, Pa2, Pa3, and Pa4, respectively; the images captured by the line sensors 1b-1 to 1b-4 as images Pb1, Pb2, Pb3, and Pb4; and the images captured by the line sensors 1c-1 to 1c-4 as images Pc1, Pc2, Pc3, and Pc4.
- The part of the image Pb1 stored in the image memory 8a is referred to as the image Pa5. The image Pa5 is the image formed by the digital data 1b-1t.
- The image Pa5 is image data used for image matching between the image Pa4 and the image Pb1, which belongs to a different group from the image Pa4.
- The image Pa5 is transferred to and stored in the temporary image storage memory 11a of the memory control unit 4a via the image data transfer path 9a, and is then stored in the image memory 8a by the memory control unit 4a.
- Similarly, the part of the image Pc1 stored in the image memory 8b is referred to as the image Pb5. The image Pb5 is the image formed by the digital data 1c-1t.
- The image Pb5 is image data used for image matching between the image Pb4 and the image Pc1, which belongs to a different group from the image Pb4.
- The image Pb5 is transferred to and stored in the temporary image storage memory 11b of the memory control unit 4b via the image data transfer path 9b, and is then stored in the image memory 8b by the memory control unit 4b.
- Alternatively, the memory control unit 4a may acquire the image Pa5 by referring to the image memory 8b, and the memory control unit 4b may acquire the image Pb5 by referring to the image memory 8c. In that case, the image data transfer paths 9a and 9b are unnecessary.
- The combined information deriving unit 5a derives, by image matching, combined information for combining adjacent images without positional deviation and without luminance deviation among the images Pa1 to Pa4 belonging to its own group (the first and second images in the group) and the image Pa5 (a second image, belonging to another group, that is adjacent to the first image Pa4).
- The combined information deriving unit 5b likewise derives, by image matching, combined information for combining adjacent images without positional deviation and without luminance deviation among the images Pb1 to Pb4 belonging to its own group and the image Pb5 (a second image, belonging to another group, that is adjacent to the first image Pb4).
- The combined information deriving unit 5c derives, by image matching, combined information for combining adjacent images without positional deviation and without luminance deviation among the images Pc1 to Pc4 belonging to its own group.
- Here, a second image that is image-matched with a first image is an image adjacent to the first image in the order of the image sequence. A second image may belong to the same group as the first image or to a different group.
- FIG. 10 is a diagram illustrating the absolute coordinate system defined by the combined image into which the images are finally combined.
- FIG. 10 shows the design combining position Pa12 that serves as the reference at which the image Pa1 and the image Pa2 are superimposed.
- The combined information deriving unit 5a derives, as combined information, the positional deviation amount (ΔXp1, ΔYp1) of the actual combining position of the images Pa1 and Pa2 from the position Pa12.
- FIG. 10 likewise shows the design combining position Pa23 that serves as the reference for superimposing the image Pa2 and the image Pa3, and so on. For the remaining adjacent pairs, the combined information deriving unit 5a derives the positional deviation amounts (ΔXp2, ΔYp2), (ΔXp3, ΔYp3), and (ΔXp4, ΔYp4) of the combining positions as combined information (combining position information).
- The design combining positions serving as references for superimposing the image pairs (Pb1, Pb2), (Pb2, Pb3), (Pb3, Pb4), and (Pb4, Pb5) are Pb12, Pb23, Pb34, and Pb45.
- The combined information deriving unit 5b derives, as combined information (combining position information), the positional deviation amounts (ΔXp5, ΔYp5), (ΔXp6, ΔYp6), (ΔXp7, ΔYp7), and (ΔXp8, ΔYp8) of the superimposing positions with respect to the design combining positions Pb12, Pb23, Pb34, and Pb45.
- Further, as shown in FIG. 10, the design combining positions serving as references for superimposing the image pairs (Pc1, Pc2), (Pc2, Pc3), and (Pc3, Pc4) are Pc12, Pc23, and Pc34, and the combined information deriving unit 5c derives the positional deviation amounts (ΔXp9, ΔYp9), (ΔXp10, ΔYp10), and (ΔXp11, ΔYp11) of the superimposing positions with respect to them as combined information (combining position information).
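The relationship between the design combining positions and the detected deviations reduces to simple arithmetic (the coordinate values below are hypothetical): the combining position actually used is the design position plus the detected deviation amount.

```python
# Illustrative arithmetic: actual combining position = design + deviation.
design = {"Pa12": (100, 0), "Pa23": (200, 0), "Pa34": (300, 0)}      # made-up
deviation = {"Pa12": (1, -2), "Pa23": (0, 3), "Pa34": (-1, 1)}       # detected

actual = {k: (design[k][0] + deviation[k][0],
              design[k][1] + deviation[k][1]) for k in design}
print(actual["Pa12"])  # (101, -2)
```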
- FIG. 11 is a block diagram showing the configuration of the combined information deriving unit in the image combining apparatus according to the embodiment of the present invention.
- The combined information deriving unit 5 (5b) includes a matching area range initial value storage unit 30 (30b), a combined information detection unit 31 (31b), a combined information storage unit 32 (32b), a temporary combined information storage unit 33 (33b), and a combined information determination unit 34 (34b).
- The matching area range initial value storage unit 30 (30b) stores the design combining positions Pb12, Pb23, Pb34, and Pb45 described above and the initial values of the ranges, centered on those position coordinates, in which image matching is performed.
- The combined information detection unit 31 (31b) reads, from the image memory 8b via the memory control unit 4b, the image data around the overlapping regions of adjacent images.
- The combined information detection unit 31 (31b) reads the design combining positions Pb12, Pb23, Pb34, and Pb45 and the matching area ranges from the matching area range initial value storage unit 30 (30b), and performs image matching using the images around the overlapping regions read from the image memory 8b.
- The combined information detection unit 31 (31b) detects, as the combining position information, the optimum solution, that is, the position at which the degree of correlation between the two images is highest. Specifically, with the design combining positions Pb12 to Pb45 as references, the positional deviation amounts (ΔXp5, ΔYp5) to (ΔXp8, ΔYp8) giving the highest degree of correlation in image matching are detected as the combining position information.
- FIG. 12 is a diagram showing image matching between two images.
- In FIG. 12, the image Pb2 placed at the design combining position Pb12 with respect to the image Pb1 is indicated by a dotted line, and the image Pb2 at the position of the optimum solution, where the degree of correlation with the image Pb1 found by image matching is highest, is indicated by a solid line.
- The positional deviation between the image Pb2 (dotted line) and the image Pb2 (solid line) is obtained as the positional deviation amount (ΔXp5, ΔYp5).
- When image matching yields a plurality of optimum solutions, the combined information detection unit 31 (31b) also detects the accuracy of the combining position in addition to the combining positions obtained as optimum solutions. For example, if there are k (k is a natural number) equally optimal solutions, the accuracy may be calculated as 1/k. The calculated accuracy is associated with the combining position information and stored as combined information in the combined information storage unit 32 (32b).
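A minimal matching sketch under assumed details (the patent does not fix the similarity measure; here a 1-D search with the sum of absolute differences is used, where a lower score means a higher degree of correlation): slide one image over the other within a search range around the design position, keep the best-scoring offsets, and report an accuracy of 1/k when k offsets tie.

```python
# Illustrative image matching with tie-aware accuracy (assumed algorithm).
def match(a, b, search=range(-1, 2)):
    """a, b: 1-D pixel lists; returns (best_offsets, accuracy)."""
    def score(dx):  # mean absolute difference at offset dx (lower is better)
        pairs = [(a[i + dx], b[i]) for i in range(len(b))
                 if 0 <= i + dx < len(a)]
        return sum(abs(p - q) for p, q in pairs) / len(pairs)
    scores = {dx: score(dx) for dx in search}
    best = min(scores.values())
    best_offsets = [dx for dx, s in scores.items() if s == best]
    return best_offsets, 1.0 / len(best_offsets)   # accuracy = 1/k

a = [0, 10, 20, 30, 40]
b = [10, 20, 30]               # b matches a shifted by +1
offsets, accuracy = match(a, b)
print(offsets, accuracy)       # [1] 1.0 — a unique optimum, accuracy 1/1
```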
- the combined information detection unit 31 (31b) overlays the images Pb1 to Pb5 on each other at the positional deviation amounts (ΔXp5, ΔYp5) to (ΔXp8, ΔYp8) given by the combination position information, obtains the average luminance of the pixels within a specified range of the overlap, and calculates the difference between the average luminances of the two images as luminance difference information.
- the combined information detection unit 31 (31b) stores the calculated luminance difference information as combined information in the combined information storage unit 32 (32b).
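The luminance-difference step described above reduces to averaging the pixel values of each image inside the overlap and taking the difference. A minimal sketch (1-D pixel lists; the function name is invented):

```python
# Luminance difference information: the difference between the mean
# luminances of the two images inside their overlap region.

def luminance_difference(overlap1, overlap2):
    """overlap1, overlap2: pixel values of the two images in the overlap."""
    m1 = sum(overlap1) / len(overlap1)
    m2 = sum(overlap2) / len(overlap2)
    return m1 - m2

# the first image is on average 10 counts brighter than the second
diff = luminance_difference([100, 110, 120], [90, 100, 110])
```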
- the combined information detection unit 31 (31b) thus detects the combination position information, the accuracy of the combination position information, and the luminance difference information, and stores the detected information as combination information in the combined information storage unit 32 (32b).
- image matching may be performed only in the sub-scanning direction to detect only the amount of misalignment in the sub-scanning direction.
- FIG. 13 is a block diagram showing the overall configuration of the combined information deriving unit.
- the combined information deriving units 5a and 5c operate in substantially the same manner as the combined information deriving unit 5b.
- the combined information detecting unit 31 (31a) reads the images Pa1 to Pa5 from the image memory 8a via the memory control unit 4a, and further reads the designed combination positions Pa12, Pa23, Pa34, Pa45 and the matching area range from the matching area range initial value storage unit 30 (30a). Then, the combination information detection unit 31 (31a) performs image matching around the overlapping areas in the adjacent images read from the image memory 8a.
- the positional deviation amounts (ΔXp1, ΔYp1) to (ΔXp4, ΔYp4) are detected as the combination position information, and the accuracy of the combination position information and the luminance difference information are further detected.
- the combined information is stored in the combined information storage unit 32 (32a).
- the combined information detecting unit 31 (31c) reads the images from the image memory 8c via the memory control unit 4c, reads the designed combination positions Pc12, Pc23, Pc34 and the matching region range from the matching area range initial value storage unit 30 (30c), and performs image matching around the overlapping areas in the adjacent images read from the image memory 8c.
- the positional deviation amounts (ΔXp9, ΔYp9) to (ΔXp11, ΔYp11) are detected as the combination position information, and the accuracy of the combination positions and the luminance difference information are further detected.
- the combined information is stored in the combined information storage unit 32 (32c).
- of the combination information stored in the combined information storage unit 32 (32a), the combination information between the image Pa4 (first image) and the image Pa5 (second image belonging to a different group from the first image) is transmitted to and stored in the combined information temporary storage unit 33 (33b) of the image processing unit 7b via the combined information transfer path 10a. Similarly, of the combination information stored in the combined information storage unit 32 (32b), the combination information between the image Pb4 (first image) and the image Pb5 (second image belonging to a different group from the first image) is transmitted to and stored in the combined information temporary storage unit 33 (33c) of the image processing unit 7c via the combined information transfer path 10b.
- FIG. 14 is a diagram showing how images are combined across different groups.
- since the image Pa5 and the image Pb1 are the same captured image, the positional deviation amount (ΔXp4, ΔYp4) between the image Pa4 and the image Pa5 is also the positional deviation amount between the image Pa4 and the image Pb1. Therefore, as described above, the combined information deriving unit 5 (5a) transmits the positional deviation amount (ΔXp4, ΔYp4) between the image Pa4 and the image Pa5, via the combined information transfer path 10a, to the combined information temporary storage unit 33 (33b) of the combined information deriving unit 5 (5b), where it is stored. Likewise, the positional deviation amount (ΔXp8, ΔYp8) between the image Pb4 and the image Pb5 is the positional deviation amount between the image Pb4 and the image Pc1.
- the combined information deriving unit 5 (5b) transmits the positional deviation amount (ΔXp8, ΔYp8) between the image Pb4 and the image Pb5 to the combined information temporary storage unit 33 (33c) of the combined information deriving unit 5 (5c), where it is stored.
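The cross-group hand-off described above can be reduced to a simple data transfer: because Pa5 and Pb1 are the same image, the offset group a detected between Pa4 and Pa5 is reused by group b as the Pa4-Pb1 offset. A toy sketch (dictionaries stand in for the storage units and transfer path; the offset value is invented):

```python
# Combination information detected within group a (image processing unit 7a):
group_a_storage = {("Pa4", "Pa5"): (4, 1)}  # (dX, dY), illustrative values

# Transfer over the combined-information path to group b's temporary
# storage; the same offset now links Pa4 to Pb1 (= Pa5) across groups.
group_b_temp_storage = {}
group_b_temp_storage[("Pa4", "Pb1")] = group_a_storage[("Pa4", "Pa5")]
```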
- the combination information determination unit 34 (34b) determines the combination position information and luminance difference information for the images Pb1 to Pb4. For example, if the detected combination position (positional deviation amount) of a pair of adjacent images differs from the combination positions (positional deviation amounts) of the other adjacent pairs in a statistically significant way, the combination information determination unit 34 (34b) discards that combination information and determines the combination position from the average value of the combination positions (positional deviation amounts) of the other pairs.
- similarly, when image matching yields a plurality of optimum solutions, or no optimum solution, for a pair of adjacent images, the combination information determination unit 34 (34b) does not use that combination information; instead, it determines the combination position and the luminance difference for that pair based on the average value of the combination information detected between the other adjacent images.
- FIG. 15 is a diagram illustrating how the combination position information of a pair of adjacent images is obtained from the average of the combination position information of other adjacent pairs. For example, as shown in FIG. 15, when image matching between the image Pb2 and the image Pb3 yields a plurality of combination positions, the average of the positional deviation amount obtained between the image Pb1 and the image Pb2 and the positional deviation amount obtained between the image Pb3 and the image Pb4 (solid arrows) may be adopted as the positional deviation amount between the image Pb2 and the image Pb3 (dotted arrow). In this way, the influence of erroneously detected combination information is minimized in this embodiment.
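The averaging fallback illustrated in FIG. 15 can be sketched as follows (an illustrative simplification: displacements are (dX, dY) tuples, and `None` marks a pair whose matching was ambiguous; the function name is invented):

```python
# Fallback for ambiguous pairs: replace a displacement that could not be
# narrowed down to one solution by the average of the displacements
# detected between the other adjacent pairs.

def resolve_displacements(displacements):
    """displacements: list of (dX, dY) tuples, with None for pairs whose
    image matching yielded multiple optima or no optimum."""
    valid = [d for d in displacements if d is not None]
    avg = (sum(d[0] for d in valid) / len(valid),
           sum(d[1] for d in valid) / len(valid))
    return [d if d is not None else avg for d in displacements]

# Pb1-Pb2 and Pb3-Pb4 matched cleanly; Pb2-Pb3 was ambiguous, so it
# receives the average of the two valid displacements.
resolved = resolve_displacements([(2.0, 1.0), None, (4.0, 3.0)])
```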
- the combined information temporary storage unit 33 (33b) stores the combination information between the image Pa4 and the image Pa5 sent from the combined information storage unit 32 (32a) of the image processing unit 7a. The combination information determination unit 34 (34b) therefore determines the combination information for this cross-group pair from the information received from the image processing unit 7a, in the same manner as for the adjacent images within the group. All the determined combination information is output to the image combining unit 6b.
- the image combining unit 6 (6a) reads the images Pa1 to Pa4 of the corresponding group from the image memory 8 (8a) via the memory control unit 4 (4a), corrects the luminance of each of the images Pa1 to Pa4 based on the luminance difference information determined by the combination information determination unit 34 (34a), and combines the images based on the determined combination position information. The luminance correction is performed with, for example, the image Pa1 as the reference. The images are combined on the absolute coordinate system (Xp, Yp). The image combining unit 6a outputs the combined image data to the output unit 15.
- the image combining unit 6 (6b) reads the images Pb1 to Pb4 of the corresponding group from the image memory 8 (8b) via the memory control unit 4 (4b), corrects their luminance based on the luminance difference information determined by the combination information determination unit 34 (34b), and combines them based on the determined combination positions.
- the luminance correction is performed with the image Pb1 (Pa5), whose luminance has already been corrected by the luminance difference from the image Pa4, as the reference.
- the image combining unit 6 b outputs the combined image data to the output unit 15.
- the image combining unit 6 (6c) reads the images Pc1 to Pc4 of the corresponding group from the image memory 8 (8c) via the memory control unit 4 (4c), corrects their luminance based on the luminance difference information determined by the combination information determination unit 34 (34c), and combines them based on the determined combination positions.
- the luminance correction is performed with the image Pc1 (Pb5), whose luminance has already been corrected by the luminance difference from the image Pb4, as the reference.
- the image combining unit 6 c outputs the combined image data to the output unit 15.
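The combining step performed by the three units above can be sketched as a single toy function (an illustrative simplification: 1-D grayscale rows, a flat list as the absolute-coordinate image memory, and invented names and values):

```python
# Combining: shift each image to its determined combination position in the
# absolute coordinate system and apply its luminance correction, with the
# reference image's correction being zero.

def combine(images, positions, lum_offsets, width):
    """images: list of pixel lists; positions: absolute start coordinate of
    each image; lum_offsets: per-image luminance correction."""
    canvas = [0] * width
    for img, x0, off in zip(images, positions, lum_offsets):
        for i, v in enumerate(img):
            canvas[x0 + i] = v + off  # later images overwrite the overlap
    return canvas

canvas = combine([[10, 20, 30], [25, 35, 45]],
                 positions=[0, 2],      # second image starts at coordinate 2
                 lum_offsets=[0, -5],   # second image is 5 counts brighter
                 width=5)
```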
- the output unit 15 has, for example, an image memory 15M (see FIG. 1) defined by the absolute coordinate system (Xp, Yp) in FIG. 10, and displays and outputs an image set in the image memory 15M.
- the combined image data output from the image combining units 6a, 6b, and 6c is set to the respective position coordinates determined at the time of combining in the absolute coordinate system (Xp, Yp).
- FIG. 16 is a diagram illustrating a state in which the image data output from the image combining unit is set in the absolute coordinate system of the image memory 15M of the output unit.
- images Pa1 to Pa4, images Pb1 to Pb4, and images Pc1 to Pc4 are set in the image memory 15M in a combined form. Since the images Pa1 to Pa4, the images Pb1 to Pb4, and the images Pc1 to Pc4 are combined in consideration of the positional deviation of all the images in the image combining units 6a, 6b, and 6c, there is no positional deviation between the images.
- likewise, since the images Pa1 to Pa4, the images Pb1 to Pb4, and the images Pc1 to Pc4 are corrected for luminance differences in consideration of all luminance deviations in the image combining units 6a, 6b, and 6c, there is no unnatural luminance shift between the images. As a result, the combined image obtained by combining all the images is, for example, displayed and output from the output unit 15.
- FIG. 17 is a diagram illustrating an example of an image output from the output unit.
- the combined image 22 output from the output unit 15 is combined with no positional deviation and no luminance difference between the images, and thus matches well with the paper document shown in FIG. 3.
- FIG. 18 is a flowchart showing the operation of the image combining device.
- the memory control unit 4 reads and transfers an image (step S1).
- the digital data 1a-m, 1b-m, 1c-m captured by the line sensor group 1 and A/D converted by the A/D converter 2 (2a, 2b, 2c) is written to the image memory 8 (8a, 8b, 8c) by the memory control unit 4 (4a, 4b, 4c).
- the digital data 1b-1t and 1c-1t is transferred from the memory control units 4b and 4c to the memory control units 4a and 4b via the image data transfer paths 9 (9a and 9b) and is written to the image memories 8a and 8b.
- the combination information detection unit 31 (31a, 31b, 31c) of the combined information deriving unit 5 (5a, 5b, 5c) detects the combination position information (positional deviation amounts) of the adjacent images by image matching (step S2).
- FIG. 19 is a timing chart of the image matching processing performed by the combined information detection units. As shown in FIG. 19, image matching is performed in parallel by the three combination information detection units 31a, 31b, and 31c, and the combination position information of the images in each group is detected. Compared with processing by a single image processing unit, the time for detecting the combination position information can thus be shortened to roughly that of a single group.
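The parallelism shown in FIG. 19 can be sketched with a thread pool: each group's detection runs concurrently, so the elapsed time approaches that of one group rather than the whole sensor row. This is a toy illustration (the per-group "matching" is a stand-in pairwise difference, not real image matching):

```python
from concurrent.futures import ThreadPoolExecutor

def detect_group_offsets(group_images):
    """Stand-in for in-group image matching: the offset between each pair
    of adjacent images, here just the difference of their first pixels."""
    return [b[0] - a[0] for a, b in zip(group_images, group_images[1:])]

groups = [
    [[0], [2], [5]],  # group a (images as tiny pixel lists)
    [[1], [4], [8]],  # group b
    [[3], [3], [6]],  # group c
]

# one worker per group, mirroring the three combination information
# detection units 31a, 31b, and 31c running in parallel
with ThreadPoolExecutor(max_workers=3) as ex:
    offsets = list(ex.map(detect_group_offsets, groups))
```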
- the combined information deriving unit 5 detects the luminance difference information of adjacent images within a predetermined range centered on the detected combination position (step S3).
- the combined information transfer path 10a transfers the detected combination information from the combined information deriving unit 5a to the combined information temporary storage unit 33b of the combined information deriving unit 5b, and the combined information transfer path 10b transfers the detected combination information from the combined information deriving unit 5b to the combined information temporary storage unit 33c of the combined information deriving unit 5c (step S4).
- the combination information determination unit 34 determines the combination information based on the combination information stored in the combined information storage unit 32 (32a, 32b, 32c) and, if present, in the combined information temporary storage unit 33 (33b, 33c) (step S5).
- when a plurality of optimum-solution combination positions are detected between images in step S2, or when no optimum solution is obtained, the combination position between those images is determined based on the average value of the combination positions detected between the other images.
- the image combining unit 6 (6a, 6b, 6c) combines the images Pa1 to Pa4, the images Pb1 to Pb4, and the images Pc1 to Pc4 based on the combination information determined by the combination information determination unit 34 (34a, 34b, 34c) (step S6).
- the images are arranged and combined according to an absolute coordinate system (Xp, Yp) in which all the images are combined.
- the output unit 15 writes the images Pa1 to Pa4, the images Pb1 to Pb4, and the images Pc1 to Pc4 output from the image combining unit 6 (6a, 6b, 6c) into the image memory 15M, and displays and outputs the image in which all the images are combined (step S7).
- as described above, in this embodiment, the image sequence captured by the line sensor group 1, in which images having overlapping portions corresponding to the same portion of the subject are arranged side by side, is divided into groups of consecutive images Pa1 to Pa4, Pb1 to Pb4, and Pc1 to Pc4, and the combination position information between the first images Pa1 to Pa4 in each group and the adjacent second images Pa2 to Pa5 is detected in parallel for each group. The time for detecting the combination position information of these images can therefore be shortened.
- furthermore, the image data used for image matching between the images Pa4 and Pb4 and the images Pb1 and Pc1 belonging to different groups is transferred to the image processing units 7a and 7b via the image data transfer paths 9a and 9b. The combination position information between the image Pa4 (or Pb4) and the adjacent image Pb1 (Pa5) (or Pc1 (Pb5)) belonging to a different group can therefore be detected by one of the image processing units 7a and 7b. The detected combination position information is then transferred to the image processing units 7b and 7c via the combined information transfer paths 10a and 10b.
- each of the image processing units 7a, 7b, and 7c can therefore correct the positions of its images (for example, the position of the image Pb1) by taking into account the positional deviation from the images belonging to the other groups (for example, the positional deviation amount between the image Pa4 and the image Pa5). As a result, an image sequence in which a plurality of images are arranged can be combined at high speed.
- here, taking the positional shift with respect to images belonging to other groups into consideration means that the images are combined without positional shift relative to images in other groups; likewise, taking the luminance shift into consideration means that the images are combined without luminance shift relative to images in other groups.
- in this embodiment, with three groups processed in parallel, the processing time can be reduced to about 1/3.
- in the above description, the number of line sensors 1a-1 and so on is twelve, and the number of line sensors included in each group is four, but the present invention is not limited to this.
- the number of line sensors 1a-1 etc. may be four or more.
- the number of line sensors 1a-1 etc. included in the group may be two or more. Even if the number of line sensors 1a-1 and the like increases, the time required for image combination can be shortened by increasing the number of groups.
- the line sensor group 1 may not be arranged in a zigzag pattern. For example, it may be arranged in three or more rows. Further, the lengths of the line sensors 1a-1 and the like need not be the same.
- the image reading apparatus 200 need only include an imaging device that can capture an image sequence in which a plurality of images are ordered by imaging position on the subject and in which a predetermined number or more of pixels coincide at the ends of the images whose imaging positions are closest. In that sense, a two-dimensional image sensor may be provided instead of the line sensors 1a-1 and so on.
- the hardware configuration or software configuration of the image processing unit 7 (7a, 7b, 7c) is an example, and can be arbitrarily changed and modified.
- the image temporary storage memories 11a and 11b may be outside the memory control units 4a and 4b.
- the image memories 8a, 8b, and 8c may be inside the image processing units 7a, 7b, and 7c.
- in the above embodiment, image data is transferred from the image processing units 7b and 7c to the image processing units 7a and 7b via the image data transfer paths 9a and 9b, the image processing units 7a and 7b perform image matching of adjacent images across groups to detect the combination information, and the detected combination information is transferred from the image processing units 7a and 7b to the image processing units 7b and 7c via the combined information transfer paths 10a and 10b. Conversely, the image data may be transferred from the image processing units 7a and 7b to the image processing units 7b and 7c, the image processing units 7b and 7c may perform the cross-group image matching to detect the combination information, and the combination information may be transferred from the image processing units 7b and 7c to the image processing units 7a and 7b.
- the image processing unit 7 (7a, 7b, 7c) can be realized by using a normal computer system without using a dedicated system.
- a computer program for executing the above operations may be stored and distributed on a computer-readable recording medium (flexible disk, CD-ROM, DVD-ROM, etc.), and the image processing unit 7 (7a, 7b, 7c) that executes the above-described processing may be configured by installing the computer program on a computer.
- the computer program may also be stored in a storage device of a server device on a communication network such as the Internet, and the image processing unit 7 (7a, 7b, 7c) may be configured by downloading it to an ordinary computer system.
- when the functions of the image processing unit 7 (7a, 7b, 7c) are realized by sharing between an OS (Operating System) and an application program, or by cooperation between the OS and the application program, only the application program portion may be stored on the recording medium or in the storage device.
- the computer program may also be posted on a bulletin board system (BBS) on the communication network and distributed via the network.
- the computer program may be started and executed in the same manner as other application programs under the control of the OS, so that the above-described processing may be executed.
- the present invention can be applied to an image combining device that continuously combines adjacent images partially overlapping.
Abstract
Description
The memory control unit 4a may acquire the image Pa5 by referring to the image memory 8b, and the memory control unit 4b may acquire the image Pb5 by referring to the image memory 8c. In that case, the image data transfer paths 9a and 9b become unnecessary.
Claims (5)
- 1. An image combining device comprising: a plurality of image processing units, one provided for each group formed by dividing, into groups of consecutive images, an image sequence in which a plurality of images each having an overlapping portion that overlaps another image are ordered by arranging the images so that images whose overlapping portions correspond to the same portion of a subject are adjacent, the image processing units each performing, in parallel, a process of detecting, by image matching, combination position information for combining, without positional deviation, a first image in a group and a second image whose order in the image sequence is adjacent to the first image; and a combined information transfer path that transfers, between the image processing units, the combination position information between the first image and a second image belonging to a group different from that of the first image, wherein each of the plurality of image processing units, based on the combination position information it has detected and the combination position information transferred via the combined information transfer path, combines the plurality of images belonging to its corresponding group without positional deviation among them and without positional deviation from images belonging to other groups.
- 2. The image combining device according to claim 1, wherein the plurality of image processing units each perform, in parallel, a process of detecting the luminance difference information obtained when the first image and the second image are overlaid based on the combination position information detected between the first image and the second image, the combined information transfer path transfers, between the image processing units, the luminance difference information between the first image and the second image belonging to a different group, and each of the plurality of image processing units, based on the luminance difference information it has detected and the luminance difference information received via the combined information transfer path, combines the plurality of images belonging to its corresponding group without luminance deviation among them and without luminance deviation from images belonging to other groups.
- 3. The image combining device according to claim 1, wherein, when the combination position information cannot be narrowed down to one by image matching between the first image and the second image, each of the plurality of image processing units determines the combination position information between the first image and the second image based on an average value of the combination position information obtained by image matching between other images.
- 4. An image reading device comprising: a line sensor group consisting of a plurality of line sensors that have the same longitudinal direction, are arranged alternately in two rows along the longitudinal direction, and whose detection ranges, for line sensors adjacent to each other in different rows, partially overlap when viewed from the direction orthogonal to the longitudinal direction; a scan drive unit that relatively scans the line sensor group and an imaging target in the direction orthogonal to the longitudinal direction; and the image combining device according to any one of claims 1 to 3, which combines the images constituting the image sequence captured by the line sensor group during the relative scanning by the scan drive unit.
- 5. An image combining method comprising: a first step in which a plurality of image processing units, one provided for each group formed by dividing, into groups of consecutive images, an image sequence in which a plurality of images each having an overlapping portion that overlaps another image are ordered by arranging the images so that images whose overlapping portions correspond to the same portion of a subject are adjacent, each perform, in parallel, a process of detecting, by image matching, combination position information for combining, without positional deviation, a first image in a group and a second image whose order in the image sequence is adjacent; a second step of transferring, between the image processing units, the combination position information between the first image and a second image belonging to a different group; and a third step in which each of the plurality of image processing units, based on the combination position information it has detected and the transferred combination position information, combines the plurality of images belonging to its corresponding group without positional deviation among them and without positional deviation from images belonging to other groups.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016546122A JP6075588B1 (ja) | 2015-04-09 | 2016-01-27 | 画像結合装置、画像読取装置及び画像結合方法 |
US15/540,125 US9936098B2 (en) | 2015-04-09 | 2016-01-27 | Image combination device, image reading device and image combination method |
CN201680018908.2A CN107409165B (zh) | 2015-04-09 | 2016-01-27 | 图像结合装置、图像读取装置及图像结合方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-080093 | 2015-04-09 | ||
JP2015080093 | 2015-04-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016163133A1 true WO2016163133A1 (ja) | 2016-10-13 |
Family
ID=57072500
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/052287 WO2016163133A1 (ja) | 2015-04-09 | 2016-01-27 | 画像結合装置、画像読取装置及び画像結合方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9936098B2 (ja) |
JP (1) | JP6075588B1 (ja) |
CN (1) | CN107409165B (ja) |
WO (1) | WO2016163133A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3352444A1 (en) * | 2017-01-23 | 2018-07-25 | Seiko Epson Corporation | Scanner, scan program, and method of producing scan data |
EP3352443A1 (en) * | 2017-01-23 | 2018-07-25 | Seiko Epson Corporation | Scanner, scan program, and method of producing scan data |
CN108347542A (zh) * | 2017-01-23 | 2018-07-31 | 精工爱普生株式会社 | 扫描仪、扫描程序以及扫描数据的生产方法 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10326908B2 (en) * | 2015-03-16 | 2019-06-18 | Mitsubishi Electric Corporation | Image reading apparatus and image reading method |
CN115023937A (zh) * | 2020-01-31 | 2022-09-06 | 三菱电机株式会社 | 图像读取装置和图像读取方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004120705A (ja) * | 2002-09-30 | 2004-04-15 | Fuji Photo Film Co Ltd | 画像読取装置 |
JP2010244184A (ja) * | 2009-04-02 | 2010-10-28 | Seiko Epson Corp | 映像処理装置、映像処理方法 |
JP5322885B2 (ja) * | 2009-10-22 | 2013-10-23 | 三菱電機株式会社 | 画像結合装置及び画像結合位置算出方法 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5465163A (en) * | 1991-03-18 | 1995-11-07 | Canon Kabushiki Kaisha | Image processing method and apparatus for processing oversized original images and for synthesizing multiple images |
US5481375A (en) * | 1992-10-08 | 1996-01-02 | Sharp Kabushiki Kaisha | Joint-portion processing device for image data in an image-forming apparatus |
US5532845A (en) * | 1994-11-21 | 1996-07-02 | Xerox Corporation | High speed high resolution platen scanning system using a plurality of scanning units |
JPH09149192A (ja) * | 1995-11-17 | 1997-06-06 | Minolta Co Ltd | 画像読取装置 |
US6373995B1 (en) * | 1998-11-05 | 2002-04-16 | Agilent Technologies, Inc. | Method and apparatus for processing image data acquired by an optical scanning device |
US6348981B1 (en) * | 1999-01-19 | 2002-02-19 | Xerox Corporation | Scanning system and method for stitching overlapped image data |
JP2004172856A (ja) * | 2002-11-19 | 2004-06-17 | Fuji Photo Film Co Ltd | 画像データ作成方法および装置 |
US8228566B2 (en) * | 2008-03-31 | 2012-07-24 | Mitsubishi Electric Corporation | Image reading apparatus |
JP5068236B2 (ja) * | 2008-10-28 | 2012-11-07 | 三菱電機株式会社 | 画像読取装置 |
JP5923867B2 (ja) * | 2010-05-25 | 2016-05-25 | 株式会社リコー | 画像読み取り装置及び画像形成装置 |
JP5747083B2 (ja) * | 2010-10-01 | 2015-07-08 | コンテックス・エー/エス | 光学式スキャナにおけるイメージ・センサ・アラインメントの二次元較正 |
JP5806103B2 (ja) * | 2011-12-20 | 2015-11-10 | 三菱電機株式会社 | 画像読取装置 |
JP6046966B2 (ja) * | 2012-04-19 | 2016-12-21 | キヤノン株式会社 | 画像処理装置及び画像処理方法、プログラム、並びに記憶媒体 |
CN103295215A (zh) * | 2013-06-28 | 2013-09-11 | 电子科技大学 | 基于cis大幅面扫描仪的图像自动拼接方法 |
US10326908B2 (en) * | 2015-03-16 | 2019-06-18 | Mitsubishi Electric Corporation | Image reading apparatus and image reading method |
JP6536183B2 (ja) * | 2015-06-01 | 2019-07-03 | 富士ゼロックス株式会社 | 画像読取装置及びプログラム |
JP6422428B2 (ja) * | 2015-12-11 | 2018-11-14 | 三菱電機株式会社 | 画像処理装置、画像処理方法、画像読取装置、及びプログラム |
JP6759662B2 (ja) * | 2016-03-30 | 2020-09-23 | コニカミノルタ株式会社 | 画像読み取り装置、同装置における読み取りガラス面の異物検出方法及び異物検出プログラム |
-
2016
- 2016-01-27 JP JP2016546122A patent/JP6075588B1/ja active Active
- 2016-01-27 WO PCT/JP2016/052287 patent/WO2016163133A1/ja active Application Filing
- 2016-01-27 CN CN201680018908.2A patent/CN107409165B/zh active Active
- 2016-01-27 US US15/540,125 patent/US9936098B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004120705A (ja) * | 2002-09-30 | 2004-04-15 | Fuji Photo Film Co Ltd | 画像読取装置 |
JP2010244184A (ja) * | 2009-04-02 | 2010-10-28 | Seiko Epson Corp | 映像処理装置、映像処理方法 |
JP5322885B2 (ja) * | 2009-10-22 | 2013-10-23 | 三菱電機株式会社 | 画像結合装置及び画像結合位置算出方法 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3352444A1 (en) * | 2017-01-23 | 2018-07-25 | Seiko Epson Corporation | Scanner, scan program, and method of producing scan data |
EP3352443A1 (en) * | 2017-01-23 | 2018-07-25 | Seiko Epson Corporation | Scanner, scan program, and method of producing scan data |
CN108347540A (zh) * | 2017-01-23 | 2018-07-31 | 精工爱普生株式会社 | 扫描仪、扫描程序以及扫描数据的生产方法 |
CN108347541A (zh) * | 2017-01-23 | 2018-07-31 | 精工爱普生株式会社 | 扫描仪、扫描程序以及扫描数据的生产方法 |
CN108347542A (zh) * | 2017-01-23 | 2018-07-31 | 精工爱普生株式会社 | 扫描仪、扫描程序以及扫描数据的生产方法 |
US10397434B2 (en) | 2017-01-23 | 2019-08-27 | Seiko Epson Corporation | Scanner that combines images read by first and second sensor arrays, scan program, and method of producing scan data |
CN108347541B (zh) * | 2017-01-23 | 2019-10-15 | 精工爱普生株式会社 | 扫描仪、非易失性存储介质以及扫描数据的生成方法 |
CN108347540B (zh) * | 2017-01-23 | 2019-11-08 | 精工爱普生株式会社 | 扫描仪、扫描程序以及扫描数据的生产方法 |
CN108347542B (zh) * | 2017-01-23 | 2020-02-28 | 精工爱普生株式会社 | 扫描仪、计算机可读的非易失性存储介质以及扫描数据的生成方法 |
US10659653B2 (en) | 2017-01-23 | 2020-05-19 | Seiko Epson Corporation | Scanner that combines images read by first and second sensor arrays, scan program, and method of producing scan data |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016163133A1 (ja) | 2017-04-27 |
CN107409165B (zh) | 2019-07-23 |
CN107409165A (zh) | 2017-11-28 |
JP6075588B1 (ja) | 2017-02-08 |
US9936098B2 (en) | 2018-04-03 |
US20180007232A1 (en) | 2018-01-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016546122 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16776302 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15540125 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16776302 Country of ref document: EP Kind code of ref document: A1 |