WO2022202671A1 - Image processing device, image processing method, image processing program, and image reading device - Google Patents

Image processing device, image processing method, image processing program, and image reading device

Info

Publication number
WO2022202671A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image reading
scanning direction
image processing
image data
Prior art date
Application number
PCT/JP2022/012653
Other languages
French (fr)
Japanese (ja)
Inventor
康博 向川
卓哉 舩冨
賢一郎 田中
貴弘 櫛田
孔明 田原
幸大 香川
匡 小久保
勇人 千馬
Original Assignee
国立大学法人奈良先端科学技術大学院大学 (Nara Institute of Science and Technology)
株式会社ヴィーネックス (Vienex Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 国立大学法人奈良先端科学技術大学院大学 (Nara Institute of Science and Technology) and 株式会社ヴィーネックス (Vienex Inc.)
Priority to JP2023509127A priority Critical patent/JPWO2022202671A1/ja
Publication of WO2022202671A1 publication Critical patent/WO2022202671A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/10 Image enhancement or restoration by non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/04 Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N1/19 Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
    • H04N1/195 Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a two-dimensional array or a combination of two-dimensional arrays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387 Composing, repositioning or otherwise geometrically modifying originals

Definitions

  • The present invention relates to an image processing device, an image processing method, an image processing program, and an image reading device for processing read image data obtained by reading an object line by line.
  • Systems conventionally adopted for surface inspection equipment using visible-range light sources typically combine a line sensor, a contact image sensor, or a scanning optical system using a laser beam with a light receiving optical system that includes photoelectric conversion elements (photomultipliers, avalanche photodiodes, CCD sensors, CMOS sensors, etc.) and a light guide. These are reflective systems that receive reflected light and fluorescence from the inspection object. Examples of objects to be inspected include paper products, food products, and resin moldings such as films and sheets for industrial use.
  • In such systems, the inspection object is often transparent, thin, and highly transmissive; it is rare for such a system to detect foreign matter, scratches, defects, or the like contained in a thick inspection object.
  • X-ray inspection equipment, which is well known as non-destructive inspection equipment with excellent penetrating power, uses X-rays, i.e. radiation, so a radiation controlled area must be established and the radiation exposure of personnel must be managed.
  • This raises the hurdle for deciding where such equipment can be installed.
  • Because it is large and heavy, it is not easy to add to an existing factory production line.
  • Because it is expensive, only a limited number of inspection points can be provided.
  • Moreover, although X-ray inspection equipment benefits from the excellent penetrating power of X-rays themselves, foreign objects, defects, and scratches often cannot be distinguished.
  • Patent Document 1 discloses an example of a conventional image reading device including a line light source and a line sensor.
  • In conventional transmissive image reading apparatuses, as in X-ray inspection apparatuses, light emitted from the line light source is scattered by foreign matter, defects, scratches, and the like contained in the inspection object.
  • The image reading device reads the object line by line and generates image data representing the object.
  • Because of this scattered, wraparound light, the outline of the target area is easily blurred.
  • The target region is, specifically, the region of pixels forming the target object on the image.
  • The inventors of the present application focused on this blurring of the outline of the target area on the image and, as a result of intensive study, found that the blurring caused by wraparound light can be removed in the sub-scanning direction but not in the main scanning direction, and that this is why clear output image data cannot be obtained.
  • The present invention has been made in view of the above circumstances, and its object is to provide an image processing apparatus, an image processing method, an image processing program, and an image reading apparatus capable of performing image processing on a plurality of read image data obtained by reading an object line by line and thereby generating output image data in which the contour of the target area is clear.
  • The image processing apparatus of the present invention applies image processing to a plurality of read image data obtained by reading the same object line by line along main scanning directions extending in different directions.
  • The image processing apparatus comprises a spectrum generation processing section, a comparison processing section, a synthesized spectrum generation processing section, and an image generation processing section.
  • the spectrum generation processing unit generates a plurality of spectrum data by performing a Fourier transform process on each of the plurality of read image data.
  • the comparison processing unit compares each component corresponding to the same frequency of the plurality of spectral data.
  • The synthesized spectrum generation processing unit generates synthesized spectrum data by selecting, based on the comparison result of the comparison processing unit, one of the components of the plurality of spectrum data corresponding to the same frequency.
  • the image generation processing unit generates output image data by performing inverse Fourier transform processing on the synthesized spectral data.
  • the synthesized spectrum generation processing unit may select the largest component from among the components corresponding to the same frequency of the plurality of spectrum data.
  • The image processing device may further comprise a correction processing section that performs correction processing on the plurality of read image data before the Fourier transform processing, so that the regions corresponding to the object on the images of the plurality of read image data are positioned substantially at the same position.
  • In that case, the spectrum generation processing section performs Fourier transform processing on each of the plurality of read image data that have undergone the correction processing by the correction processing section.
  • Thereby, the regions corresponding to the object can be positioned substantially at the same position.
  • The correction processing unit may correct the inclination or magnification of the region corresponding to the object on the image in at least one of the plurality of read image data before the Fourier transform processing.
  • the area corresponding to the object can be positioned substantially at the same position in each read image data.
  • The image reading device of the present invention includes a plurality of line sensors and the image processing device described above.
  • Each of the plurality of line sensors can read an image line by line along its main scanning direction, and each main scanning direction extends in a direction different from the sub-scanning direction.
  • the image processing device performs image processing on a plurality of read image data obtained by reading the same object with the plurality of line sensors.
  • the plurality of line sensors may include two line sensors arranged so that the main scanning directions are orthogonal to each other.
  • the two line sensors may be arranged such that their respective main scanning directions intersect with the sub-scanning direction at an angle of 45°.
  • the plurality of line sensors may be arranged side by side in a direction crossing the sub-scanning direction.
  • Thereby, the plurality of line sensors arranged side by side in a direction intersecting the sub-scanning direction can divide the object between them and read it.
  • the image reading device of the present invention includes at least one line sensor and the image processing device.
  • the at least one line sensor has an elongated shape along a longitudinal direction intersecting the sub-scanning direction.
  • the image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor.
  • The at least one line sensor may include a plurality of first imaging elements that are inclined with respect to the longitudinal direction and arranged side by side in the longitudinal direction, and a plurality of second imaging elements that are inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements and arranged side by side in the longitudinal direction.
  • Alternatively, the at least one line sensor may include two line sensors, one provided with the plurality of first imaging elements inclined with respect to the longitudinal direction and arranged side by side in the longitudinal direction, and the other provided with the plurality of second imaging elements inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements and arranged side by side in the longitudinal direction.
  • Thereby, the arrangement area of each line sensor in the sub-scanning direction can be narrowed to save space.
  • When the plurality of first imaging elements inclined with respect to the longitudinal direction and the plurality of second imaging elements inclined at a different angle are arranged side by side in the longitudinal direction of a single sensor, the two line sensors are effectively integrated, and the arrangement area of the line sensor in the sub-scanning direction can likewise be narrowed to save space.
  • the image reading device of the present invention includes at least one line sensor and the image processing device.
  • the at least one line sensor can read an image linearly along the main scanning direction.
  • the image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor.
  • the line sensor has a light source and a slit member.
  • the light source extends in the main scanning direction and irradiates the object with light.
  • the slit member extends in the main scanning direction and is arranged between the object and the light source.
  • the image reading device of the present invention includes at least one line sensor and the image processing device.
  • the at least one line sensor can read an image linearly along the main scanning direction.
  • the image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor.
  • the line sensor has a line laser light source. The line laser light source irradiates the object with light.
  • the image reading device of the present invention includes at least one line sensor, the image processing device, and a plurality of transport devices.
  • the at least one line sensor can read an image linearly along the main scanning direction.
  • the image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor.
  • the plurality of transport devices transport the target object along a transport path extending in a direction intersecting the main scanning direction.
  • the line sensor has a light source. The light source irradiates the object with light.
  • The plurality of conveying devices are arranged with gaps between them along the conveying path, and the light source irradiates the object conveyed along the conveying path with light through a gap; alternatively, the light irradiated onto the object conveyed along the conveying path enters the gap.
  • the image reading device of the present invention includes at least one line sensor, the image processing device, and a conveying device.
  • the at least one line sensor can read an image linearly along the main scanning direction.
  • the image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor.
  • the transport device transports the object along a transport path extending in a direction intersecting the main scanning direction.
  • the line sensor has a light source. The light source irradiates the object with light.
  • the conveying device includes a conveying surface that forms the conveying path, and the conveying surface is light transmissive, so that the light from the light source is transmitted through the conveying surface to irradiate the object.
  • The image processing method of the present invention applies image processing to a plurality of read image data obtained by reading the same object line by line along main scanning directions extending in different directions.
  • The image processing method comprises a spectrum generation step, a comparison step, a synthesized spectrum generation step, and an image generation step.
  • the spectrum generation step generates a plurality of spectrum data by performing a Fourier transform process on each of the plurality of read image data.
  • the comparison step compares each component corresponding to the same frequency of the plurality of spectral data.
  • The synthesized spectrum generation step generates synthesized spectral data by selecting, based on the comparison result of the comparison step, one of the components of the plurality of spectral data corresponding to the same frequency.
  • the image generation step generates output image data by performing inverse Fourier transform processing on the synthesized spectral data.
  • The image processing program of the present invention applies image processing to a plurality of read image data obtained by reading the same object line by line along main scanning directions extending in different directions.
  • the image processing program causes a computer to execute a spectrum generation step, a comparison step, a synthesized spectrum generation step, and an image generation step.
  • the spectrum generation step generates a plurality of spectrum data by performing a Fourier transform process on each of the plurality of read image data.
  • the comparison step compares each component corresponding to the same frequency of the plurality of spectral data.
  • The synthesized spectrum generation step generates synthesized spectral data by selecting, based on the comparison result of the comparison step, one of the components of the plurality of spectral data corresponding to the same frequency.
  • the image generation step generates output image data by performing inverse Fourier transform processing on the synthesized spectral data.
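  • The four steps above map directly onto a few lines of array code. The following is a minimal illustrative sketch (an editorial addition, not part of the patent text) in Python with NumPy; it assumes the read images are grayscale arrays of identical shape that have already been aligned, compares amplitude spectra, and keeps the complex FFT component with the largest magnitude at each frequency:

```python
import numpy as np

def synthesize_sharp_image(read_images):
    """Spectrum generation, comparison, synthesis, and image generation.

    read_images: list of 2-D float arrays of identical shape, each read
    along a different main scanning direction and already aligned.
    """
    # Spectrum generation step: 2-D Fourier transform of each read image.
    spectra = np.stack([np.fft.fft2(img) for img in read_images])

    # Comparison step: compare the components corresponding to the same
    # frequency; the amplitude spectrum |G| is used for the comparison.
    winner = np.argmax(np.abs(spectra), axis=0)

    # Synthesized spectrum generation step: at each frequency, keep the
    # (complex) component whose amplitude is largest.
    synthesized = np.take_along_axis(spectra, winner[None, ...], axis=0)[0]

    # Image generation step: 2-D inverse Fourier transform.
    return np.real(np.fft.ifft2(synthesized))
```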
  • FIG. 1 is a block diagram showing an example of the electrical configuration of the image reading device according to the first embodiment.
  • FIG. 2 is a cross-sectional view showing an example of an image reading section of the first embodiment.
  • FIG. 3 is an exploded perspective view schematically showing an example of the appearance of a linear light source in the image reading section of the first embodiment.
  • FIG. 4 is a diagram showing an example of a read image according to the first embodiment.
  • FIG. 5 is a diagram showing another example of a read image according to the first embodiment.
  • FIG. 6 is a diagram showing an example of an output image according to the first embodiment.
  • FIG. 7 is a diagram showing an example of the memory map of the RAM of the first embodiment.
  • FIG. 8 is a block diagram showing an example of the functional configuration of the image reading device according to the first embodiment.
  • FIG. 9 is a flowchart showing an example of image processing by the CPU of the image reading device according to the first embodiment.
  • FIG. 10 is a schematic diagram showing an example of the periphery of a plurality of image reading units according to the second embodiment.
  • FIG. 11 is a diagram for explaining an example of correction processing of the second embodiment.
  • FIG. 12 is a diagram showing an example of the memory map of the RAM of the second embodiment.
  • FIG. 13 is a block diagram showing an example of the functional configuration of the image reading device according to the second embodiment.
  • FIG. 14 is a flowchart showing an example of image processing by the CPU of the image reading device according to the second embodiment.
  • FIGS. 15 to 19 are schematic diagrams for explaining arrangements of image reading units according to the second embodiment.
  • FIG. 20 is a schematic diagram for explaining the arrangement of image reading units according to the third embodiment.
  • FIG. 21 is a schematic diagram for explaining the configuration of an image reading unit according to the third embodiment.
  • FIG. 22 is a schematic diagram for explaining a modification of the configuration of the image reading section of the third embodiment.
  • FIG. 23 is a cross-sectional view showing a part of an example of an image reading unit according to the fourth embodiment.
  • FIG. 24 is a schematic diagram for explaining the peripheral configuration of an image reading unit according to the fifth embodiment.
  • FIG. 25 is a schematic diagram for explaining a modification of the configuration around the image reading unit of the fifth embodiment.
  • FIG. 1 is a block diagram showing an example of the electrical configuration of an image reading apparatus 10 according to the first embodiment.
  • As shown in FIG. 1, the image reading device 10 includes a control section 20 and an image reading section 28, which are electrically connected to each other via a bus 30.
  • The control unit 20 includes a CPU (Central Processing Unit) 22, a RAM (Random Access Memory) 24, and a storage unit 26.
  • the CPU 22 is in charge of overall control of the image reading device 10.
  • The RAM 24 is used as a work area and a buffer area for the CPU 22.
  • The storage unit 26 is the main storage device of the image reading device 10, and a non-volatile memory such as an HDD (Hard Disk Drive) or EEPROM (Electrically Erasable Programmable Read-Only Memory) is used for it. The storage unit 26 may also be configured to include the RAM 24.
  • The storage unit 26 stores data on control programs with which the CPU 22 controls the operation of each component of the image reading apparatus 10, data on various images, execution data necessary for executing the control programs, and the like.
  • the image reading unit 28 includes at least an image sensor, and reads an object linearly along the main scanning direction while scanning in the sub-scanning direction.
  • the image sensor is a sensor for reading an image by converting light into electrical output through photoelectric conversion.
  • the type and number of image sensors are not particularly limited as long as they can read the target in a line along the main scanning direction while scanning in the sub-scanning direction.
  • As the image reading unit 28, for example, a general-purpose line sensor, a general-purpose camera, a CIS (Contact Image Sensor), or the like can be used.
  • In a CIS, the emission angle of the light source is close to zero.
  • The electrical configuration of the image reading device 10 of the first embodiment described above is merely an example; for instance, the image reading device 10 may be provided with a plurality of image reading units 28.
  • FIG. 2 is a cross-sectional view showing an example of the image reading section 28 of the first embodiment.
  • FIG. 3 is an exploded perspective view schematically showing an example of the appearance of the linear light source 44 (44A, 44B) in the image reading section 28 of the first embodiment.
  • As shown in FIG. 2, the image reading unit 28 includes two housings 42 (42A, 42B) that face each other across a focal plane 40, and various components provided in the housings 42.
  • a linear light source 44 (44A, 44B) for illuminating an object on the focal plane 40 is provided in each housing 42.
  • The linear light source 44 is a component that emits light toward a reading target (inspection target) T on the focal plane 40.
  • the light emitted from the linear light source 44A toward the focal plane 40 is denoted by L1
  • the light emitted from the linear light source 44B toward the focal plane 40 is denoted by L2.
  • The linear light source 44 includes a transparent light guide 46 extending along the longitudinal direction D, a first light source section 48 provided on one end face in the longitudinal direction D, a second light source section 50 provided on the other end face in the longitudinal direction D, and a cover member 51 holding the outer side surfaces of the light guide 46.
  • The linear light source 44 is not limited to a guide type in which light is guided by the light guide 46.
  • In FIG. 3, the first light source section 48 and the second light source section 50 are shown separated from the light guide 46.
  • the linear light source 44 has a light diffusion pattern P.
  • The light diffusion pattern P is formed to extend along the longitudinal direction D on the bottom surface 52 of the light guide 46.
  • The light diffusion pattern P diffuses and refracts the light traveling through the light guide 46, and the light is emitted from the light emitting side surface 54 of the light guide 46.
  • The light emitting side surface 54 is formed in a smooth, outwardly convex curved shape so as to act as a lens with a light condensing effect.
  • the housing 42A is provided with a substrate 56 for fixing the linear light source 44A
  • the housing 42B is provided with a substrate 58B for fixing the linear light source 44B.
  • Terminals 60 are provided on the first light source section 48 and the second light source section 50.
  • the linear light source 44A is fixed to the substrate 56 by inserting the terminal 60 into the substrate 56 and soldering.
  • the linear light source 44B is fixed to the substrate 58B by inserting the terminal 60 into the substrate 58B and soldering.
  • a lens array 62 is provided inside the housing 42A.
  • The lens array 62 is an optical element that forms an image, on a light receiving unit 64 described later, of the light reflected by or transmitted through the object to be read T.
  • An ultraviolet light blocking filter (UV cut filter) 67 that blocks ultraviolet light from entering the light receiving section 64 is provided at an arbitrary position between the focal plane 40 and the light receiving section 64.
  • a color filter 68 is provided between the light receiving section 64 and the ultraviolet light blocking filter 67 to pass visible light in a specific wavelength range.
  • the light receiving section 64 includes an image sensor, and is mounted on the substrate 58A within the housing 42A.
  • Each housing 42 is provided with protective glass 66 (66A, 66B) to protect the linear light source 44 from dust scattering and damage during use.
  • The light L1 emitted from the linear light source 44A becomes linear illumination light that illuminates the object T to be read, and the reflected light from the object T is guided to the light receiving unit 64.
  • the light L2 emitted from the linear light source 44B becomes linear illumination light and passes through the object T to be read.
  • Food can be exemplified as the object to be read T, but the object is not limited to this.
  • Examples of the reading target T include food (solid or liquid, with no restriction even on items with a characteristic density (bread, oil, etc.), color (ketchup, curry, etc.), or shape (noodles, etc.)), resin moldings such as films or sheets, textile products such as fabrics, clothes, or non-woven fabrics, and paper products such as sheets, rolls, or structures (envelopes, boxes, etc.).
  • The reading target T itself may be the target object, or a portion that is included in, mixed into, covered by, or otherwise read together with the reading target T may be the target object.
  • Examples of such portions include metal pieces, resin pieces (plastic, rubber, etc.), mineral pieces (stones, sand, etc.), animal or plant matter (branches, bones, insects, etc.), and composites thereof, as well as structural defects inside the reading target T (scratches, irregularities, air bubbles, adhered foreign matter, etc.).
  • the image reading unit 28 can read the reading object T linearly along the main scanning direction (Y direction in FIG. 2) while scanning in the sub scanning direction (X direction in FIG. 2).
  • By omitting the linear light source 44A, the configuration may be such that only the transmitted light from the object to be read T is received by the light receiving section 64.
  • In the first embodiment, the image reading unit 28 is used to read the same object line by line along main scanning directions extending in different directions, thereby generating a plurality of read image data. For example, if the object is conveyed across the focal plane 40 of the fixed image reading unit 28 in a plurality of different directions intersecting the Y direction and is read along the main scanning direction during each conveyance, the same object can be read along main scanning directions that extend in different directions relative to the object. Alternatively, the image reading unit 28 may be moved with respect to an object stopped on the focal plane 40.
  • In this specification, image data refers to an image in data format; for example, read image data may therefore be referred to as a read image, and a read image as read image data. The same applies to the spectra described later.
  • the spectrum data is a frequency spectrum in data format.
  • a frequency spectrum is a spectrum that has a corresponding component for each frequency.
  • the frequency spectrum is specifically the spatial frequency spectrum.
  • The frequency spectrum corresponds to any one of an amplitude spectrum, a phase spectrum, and a power spectrum. In the first embodiment, the frequency spectrum corresponds to the amplitude spectrum, and the component corresponding to each frequency indicates the magnitude of the amplitude. An amplitude spectrum is preferably used as the frequency spectrum.
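  • For concreteness, the three spectra named above relate to a single two-dimensional FFT as follows, in NumPy terms (an illustrative sketch, assuming read_image is a 2-D grayscale array):

```python
import numpy as np

read_image = np.zeros((64, 64))  # placeholder 2-D grayscale read image

G = np.fft.fft2(read_image)      # complex spatial-frequency spectrum
amplitude = np.abs(G)            # amplitude spectrum |G|
phase = np.angle(G)              # phase spectrum arg(G)
power = np.abs(G) ** 2           # power spectrum |G|^2
```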
  • When a plurality of spectral data have been generated, they are compared; specifically, the components of the plurality of spectral data corresponding to the same frequency are compared with each other.
  • one of the components corresponding to the same frequency of the plurality of spectral data is selected as a component for synthesis based on the comparison result.
  • the largest component among the components corresponding to the same frequency of a plurality of spectral data is selected as the component for synthesis.
  • Synthesized spectrum data is a synthesized spectrum in data format.
  • The synthesized spectrum is a spectrum whose component at each frequency is the component selected for synthesis.
  • When the synthesized spectral data has been generated, it is subjected to inverse Fourier transform processing, which produces the output image data. In the inverse Fourier transform process, a two-dimensional inverse Fourier transform is applied to the synthesized spectral data.
  • FIG. 4 is a diagram showing an example of a read image according to the first embodiment.
  • FIG. 5 is a diagram showing another example of a read image according to the first embodiment.
  • the read images shown in FIGS. 4 and 5 are images generated by reading the same reading object T in a line. Furthermore, when the read images shown in FIGS. 4 and 5 are generated, the main scanning direction and the sub-scanning direction of the image reading section 28 are perpendicular to each other. Furthermore, the sub-scanning direction of the image reading section 28 when generating the read image shown in FIG. 4 is orthogonal to the sub-scanning direction of the image reading section 28 when generating the read image shown in FIG. Specifically, in FIG. 4, the vertical direction is the main scanning direction, and the horizontal direction is the sub-scanning direction. In FIG. 5, the horizontal direction is the main scanning direction, and the vertical direction is the sub-scanning direction.
  • The environments in which the object is read differ between the two images; specifically, the main scanning direction differs.
  • Consequently, the blurred portions of the outline of a given target area differ between the two read images; that is, the sharp portions of the contours of the target regions differ.
  • the target area is specifically a pixel area that constitutes the target object on the image. That is, the target area can also be said to be a target object on the image.
  • FIG. 6 is a diagram showing an example of an output image according to the first embodiment. The output image in FIG. 6 corresponds to the read images shown in FIGS. 4 and 5. In the output image, the target area appears as a combination of the sharp parts of the target area in each read image; that is, the contour of the target area is displayed clearly in the output image.
  • FIG. 6 exemplifies an output image corresponding to two read images, but the larger the number of read images, the clearer the entire outline of the target area is displayed in the output image.
  • For example, when g_θ denotes read image data in which the outline of the target area is unclear, g_θ is represented by equation (1) below.
  • Here, f is read image data in which the outline of the target area is clear.
  • h_θ is a PSF (Point Spread Function); that is, h_θ is the factor responsible for the blurring of the contour of the target region. The PSF depends on the angle θ of the main scanning direction.
  • Applying Fourier transform processing to both sides of equation (1), as shown in equation (2), yields equation (3), in which G_θ, F, and H_θ are the Fourier transforms of g_θ, f, and h_θ; that is, G_θ is the spectral data.
  • In equation (4), the left side is the synthesized spectral data and the right side is the component selected for synthesis at each frequency.
  • In equation (5), the left side is the output image data.
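  • The equations themselves are not reproduced in this text; a plausible reconstruction from the surrounding definitions, with (u, v) denoting spatial frequency and * denoting two-dimensional convolution, is:

```latex
\begin{align*}
g_\theta &= h_\theta * f \tag{1} \\
\mathcal{F}[g_\theta] &= \mathcal{F}[h_\theta * f] \tag{2} \\
G_\theta(u,v) &= H_\theta(u,v)\, F(u,v) \tag{3} \\
\hat{F}(u,v) &= G_{\theta^\ast}(u,v), \quad
  \theta^\ast = \operatorname*{arg\,max}_\theta \lvert G_\theta(u,v) \rvert \tag{4} \\
\hat{f} &= \mathcal{F}^{-1}[\hat{F}] \tag{5}
\end{align*}
```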
  • FIG. 7 is a diagram showing an example of the memory map 200 of the RAM 24 of the first embodiment.
  • the RAM 24 includes a program area 201 and a data area 202.
  • the program area 201 a control program read in advance from the storage section 26 is stored.
  • the control programs include an image reading program 201a, a spectrum generation program 201b, a comparison program 201c, a composite spectrum generation program 201d, an image generation program 201e, and the like.
  • the image reading program 201a is a program for controlling the image reading unit 28 to generate read image data 202a.
  • the spectrum generation program 201b is a program for generating spectrum data 202b by performing Fourier transform processing on the read image data 202a generated by the image reading program 201a.
  • the comparison program 201c is a program for comparing each component corresponding to the same frequency of the plurality of spectrum data 202b generated by the spectrum generation program 201b.
  • The synthesized spectrum generation program 201d is a program for selecting, based on the comparison result by the comparison program 201c, the largest component from among the components of the plurality of spectrum data 202b corresponding to the same frequency, and for generating the synthesized spectrum data 202c.
  • the image generation program 201e is a program for generating output image data 202d by performing inverse Fourier transform processing on the synthesized spectral data 202c.
  • The program area 201 also stores control programs other than those described above.
  • Data read in advance from the storage unit 26 is stored in the data area 202.
  • the data area 202 stores read image data 202a, spectral data 202b, synthesized spectral data 202c, output image data 202d, and the like.
  • the read image data 202a is data corresponding to the read image. Also, the data area 202 may store a plurality of read image data 202a.
  • the spectrum data 202b is data corresponding to the frequency spectrum. Also, the data area 202 may store a plurality of spectral data 202b.
  • the synthetic spectrum data 202c is data corresponding to the synthetic spectrum.
  • the output image data 202d is data corresponding to the output image.
  • the data area 202 stores, for example, execution data, and is provided with a timer (counter) and registers necessary for executing the control program.
  • FIG. 8 is a block diagram showing an example of the functional configuration of the image reading device 10 of the first embodiment.
  • In the control unit 20 including the CPU 22, the CPU 22 executes the image reading program 201a, whereby the control unit 20 functions as an image reading processing unit 90 that controls the image reading unit 28 and generates the read image data 202a.
  • the CPU 22 executes the spectrum generation program 201b so that the control section 20 functions as a spectrum generation processing section 92 that performs Fourier transform processing on the read image data 202a and generates spectrum data 202b.
  • the CPU 22 executes the comparison program 201c so that the control section 20 functions as a comparison processing section 94 that compares the components corresponding to the same frequency of the plurality of spectral data 202b.
  • The CPU 22 executes the synthesized spectrum generation program 201d, whereby the control unit 20 functions as a synthesized spectrum generation processing unit 96 that, based on the comparison result of the comparison processing unit 94, selects the largest component among the components of the plurality of spectrum data 202b corresponding to the same frequency and generates the synthesized spectrum data 202c.
  • the CPU 22 executes the image generation program 201e so that the control section 20 functions as an image generation processing section 98 that performs inverse Fourier transform processing on the synthesized spectrum data 202c and generates output image data 202d.
  • When the image reading processing unit 90 generates a plurality of read image data 202a, the spectrum generation processing unit 92 generates a plurality of spectrum data 202b. When the plurality of spectral data 202b have been generated, the comparison processing unit 94 compares their components corresponding to the same frequency. Based on the comparison result, the synthesized spectrum generation processing unit 96 selects the largest component from among the components corresponding to the same frequency and generates the synthesized spectral data 202c. When the synthesized spectral data 202c has been generated, the image generation processing unit 98 generates the output image data 202d.
  • FIG. 9 is a flowchart showing an example of image processing by the CPU 22 of the image reading device 10 of the first embodiment. For example, when a plurality of read image data 202a are generated, the CPU 22 starts image processing.
  • In step S1, Fourier transform processing is performed on each read image data 202a to generate a plurality of spectral data 202b.
  • In step S2, the components of the plurality of spectral data 202b corresponding to the same frequency are compared.
  • In step S3, based on the comparison results, the largest component is selected from among the components corresponding to the same frequency to generate the synthesized spectrum data 202c.
  • In step S4, inverse Fourier transform processing is performed on the synthesized spectrum data 202c to generate the output image data 202d. When the output image data 202d has been generated in step S4, the image processing ends.
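  • As a concrete check of steps S1 to S4, the short demo below (an editorial sketch; the test image, the PSF construction, and the function synthesize_sharp_image from the earlier sketch are all assumptions) blurs one sharp test image with two differently oriented motion-blur PSFs, then recovers a result closer to the original than either blurred input:

```python
import numpy as np

f = np.zeros((128, 128))
f[48:80, 48:80] = 1.0                        # a sharp square "target area"

def directional_blur(img, horizontal, length=9):
    # PSF h_theta: motion blur along one axis, applied via FFT (circular).
    psf = np.zeros_like(img)
    if horizontal:
        psf[0, :length] = 1.0 / length
    else:
        psf[:length, 0] = 1.0 / length
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

g0 = directional_blur(f, horizontal=True)    # read image, scan direction 1
g1 = directional_blur(f, horizontal=False)   # read image, scan direction 2

out = synthesize_sharp_image([g0, g1])       # steps S1 to S4, as sketched earlier
print(np.abs(out - f).mean() < np.abs(g0 - f).mean())  # expected: True
```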
  • Note that the selection is not limited to the configuration in which the largest component among the components corresponding to the same frequency of the plurality of spectral data is chosen as the component for synthesis; the component for synthesis may be selected as appropriate.
  • For example, for the components corresponding to some frequencies the largest component may be selected, while for the components corresponding to the other frequencies the component for synthesis may be selected in another appropriate manner.
  • Alternatively, any component between the smallest component and the largest component may be used as the component for synthesis.
  • Second Embodiment: In the second embodiment, a plurality of image reading units 28 are used to read the object. In addition, before the Fourier transform processing is performed on the plurality of read image data, processing is performed to position the target area substantially identically in each read image data. The series of processes after the Fourier transform processing is the same as in the first embodiment, and duplicate description is omitted.
  • the main scanning directions of the plurality of image reading units 28 are arranged to intersect each other.
  • Each image reading section 28 is arranged such that the main scanning direction intersects the sub-scanning direction at a predetermined angle.
  • In the second embodiment, the sub-scanning direction of each image reading section 28 (the conveying direction of the object, or the moving direction of each image reading section 28) is the same, and it is a direction that intersects all of the image reading sections 28.
  • FIG. 10 is a schematic diagram showing an example of the periphery of the plurality of image reading units 28 of the second embodiment, as viewed from a direction perpendicular to the focal plane 40.
  • In the example shown in FIG. 10, the main scanning directions of the image reading section 28A and the image reading section 28B are orthogonal to each other, and each main scanning direction is arranged to intersect the sub-scanning direction at an angle of 45°. That is, in this example, the main scanning direction and the sub-scanning direction of each image reading section 28 are not orthogonal.
  • The sub-scanning direction of each image reading section 28 is the same, and it intersects both the image reading section 28A and the image reading section 28B.
  • the angle formed by the sub-scanning direction and the main scanning direction of the image reading section 28 is not limited to 45°. Further, the angle formed by the main scanning directions of the image reading section 28 is not limited to 90°.
  • It can also be said that the plurality of image reading units 28 are arranged such that their main scanning directions extend in directions different from the sub-scanning direction and intersect each other at one point.
  • When read image data is generated by an image reading section 28 whose main scanning direction and sub-scanning direction are not orthogonal, read image data in which the target area is tilted is generated.
  • correction processing is performed on a plurality of pieces of read image data before being subjected to Fourier transform processing so that target regions are positioned substantially identically on the images of each of the plurality of pieces of read image data.
  • the correction process includes a tilt correction process for correcting the tilt of the target area and a magnification correction process for correcting the magnification of the target area.
  • the tilt correction process and the magnification correction process are performed in this order.
  • The correction processing will be described with reference to FIG. 11.
  • FIG. 11 is a diagram for explaining an example of the correction processing according to the second embodiment, and shows a read image corresponding to the image reading section 28A and a read image corresponding to the image reading section 28B. In the correction process, the read image is first scanned to obtain the coordinates of its pixels.
  • the target area is tilted at an angle corresponding to the relative speed between the image reading unit 28 and the target when reading the target. The higher the relative speed, the greater the slope of the target area.
  • the tilt of the target area is calculated based on the pixel coordinates, and the pixels are shifted vertically according to the tilt.
  • the tilt of the target region is corrected along with the shift of the pixels. Note that a well-known method is used as a method for calculating the inclination of the target area from the coordinates of the pixels.
  • the target area is reduced in the horizontal direction.
  • the correction magnification which is the magnification for reducing the target area, is not particularly limited as long as the target area is positioned substantially at the same position on each read image.
  • the correction magnification may be set based on the tilt of the target area before tilt correction. Since pixel interpolation and the like involved in adjusting the magnification of the target area are well known, detailed description thereof will be omitted.
  • the target area may be enlarged instead of being reduced.
  • the magnification correction process does not have to be performed. That is, in the correction process, at least the tilt correction process may be performed out of the tilt correction process and the magnification correction process.
  • the tilt may be corrected using a well-known method.
  • a well-known method of correcting the tilt and magnification of an object such as characters, graphics, and illustrations on the image based on the positions (coordinates) of pixels forming the object may be used.
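  • A minimal sketch of the two correction steps is given below (an editorial illustration; the shear-based tilt model, the per-column integer shift, and the linear-interpolation rescaling are assumptions, since the text leaves the exact well-known methods open). Tilt correction is applied first, then magnification correction, as described above:

```python
import numpy as np

def correct_tilt(img, slope):
    # Tilt correction: shift each column vertically by slope * x pixels,
    # where slope is the tilt of the target area estimated beforehand
    # from the pixel coordinates. np.roll wraps circularly (a simplification).
    out = np.empty_like(img)
    for x in range(img.shape[1]):
        out[:, x] = np.roll(img[:, x], -int(round(slope * x)))
    return out

def correct_magnification(img, factor):
    # Magnification correction: rescale each row horizontally by factor
    # (factor < 1 reduces the target area) using linear interpolation.
    h, w = img.shape
    src = np.clip(np.arange(w) / factor, 0, w - 1)  # sample positions
    out = np.empty_like(img)
    for y in range(h):
        out[y] = np.interp(src, np.arange(w), img[y])
    return out

# corrected = correct_magnification(correct_tilt(read_image, slope), factor)
```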
  • The case where the main scanning directions of the image reading section 28A and the image reading section 28B are orthogonal to each other has been described as an example, but as long as the main scanning directions extend in directions different from the sub-scanning direction and intersect each other at one point, the number of image reading units 28 may be three or more. Depending on the angles and the number of image reading units 28, the correction processing may be applied to at least one of the read image data.
  • FIG. 12 is a diagram showing an example of the memory map 200 of the RAM 24 of the second embodiment.
  • the control program includes a correction program 201f.
  • the correction program 201f also includes a tilt correction program 201g and a magnification correction program 201h.
  • the correction program 201f is a program for performing correction processing on the read image data 202a generated by the image reading program 201a to generate corrected image data 202e.
  • The tilt correction program 201g is a program for performing tilt correction processing on the read image data 202a generated by the image reading program 201a.
  • the magnification correction program 201h is a program for applying magnification correction processing to the read image data 202a generated by the image reading program 201a.
  • The corrected image data 202e is stored in the data area 202.
  • the corrected image data 202e is data corresponding to a corrected image, which is a read image subjected to correction processing.
  • the corrected image data 202e may include a plurality of data corresponding to corrected images.
  • In the second embodiment, the spectrum generation program 201b generates the spectrum data 202b by performing Fourier transform processing on the corrected image data 202e obtained from the read image data 202a generated by the image reading program 201a.
  • FIG. 13 is a block diagram showing an example of the functional configuration of the image reading device 10 of the second embodiment.
  • The CPU 22 executes the correction program 201f, whereby the control unit 20 functions as a correction processing unit 100 that performs correction processing on the read image data 202a and generates the corrected image data 202e.
  • The CPU 22 executes the spectrum generation program 201b, whereby the control unit 20 functions as the spectrum generation processing unit 92 that performs Fourier transform processing on the corrected image data 202e and generates the spectrum data 202b.
  • FIG. 14 is a flowchart showing an example of image processing by the CPU 22 of the image reading device 10 of the second embodiment.
  • When a plurality of read image data 202a have been generated, a plurality of corrected image data 202e are generated in step S1.
  • steps S2 to S5 correspond to steps S1 to S4 in the flow chart shown in FIG. 9, and redundant description will be omitted.
  • According to the second embodiment, even if the target area is tilted in at least some of the plurality of read image data, the target area can be positioned substantially at the same position in each read image data; that is, the image reading sections 28 can be arranged so that the main scanning direction and the sub-scanning direction are not perpendicular to each other.
  • According to the second embodiment, a plurality of read image data can be generated simply by moving the object or the plurality of image reading units 28 in the sub-scanning direction, that is, with only a single operation. A plurality of read image data can therefore be obtained efficiently.
  • the main scanning direction and the sub-scanning direction of each image reading section 28 are not orthogonal.
  • The plurality of image reading units 28 may be arranged side by side in a direction intersecting the sub-scanning direction. Specifically, groups of image reading units 28, each group consisting of units whose main scanning directions extend in directions different from the sub-scanning direction and which are arranged side by side in the sub-scanning direction, may be placed side by side in a direction orthogonal to the sub-scanning direction.
  • FIGS. 15 to 19 are schematic diagrams for explaining such arrangements of the image reading sections 28 of the second embodiment.
  • the image reading section 28A and the image reading section 28B are arranged so that the main scanning directions are orthogonal to each other and the main scanning direction intersects the sub-scanning direction at an angle of 45°.
  • the image reading section 28A and the image reading section 28B are arranged symmetrically with respect to the direction perpendicular to the sub-scanning direction as an axis (axis of symmetry).
  • the image reading section 28C and the image reading section 28D are arranged in the same manner as the image reading section 28A and the image reading section 28B.
  • the axis of symmetry for the image reading units 28A and 28B is the same as the axis of symmetry for the image reading units 28C and 28D.
  • a group including the image reading section 28C and the image reading section 28D is arranged side by side with a group including the image reading section 28A and the image reading section 28B in the direction orthogonal to the sub-scanning direction.
  • The image reading section 28A and the image reading section 28C are arranged so that their main scanning directions are parallel to each other, and furthermore so that they partially overlap each other in the sub-scanning direction. This also applies to the image reading section 28B and the image reading section 28D.
  • Alternatively, the image reading section 28A and the image reading section 28C are arranged so that their main scanning directions intersect each other and are symmetrical with respect to the sub-scanning direction. In that case, the image reading section 28A and the image reading section 28C do not partially overlap each other in the sub-scanning direction. This also applies to the image reading section 28B and the image reading section 28D.
  • the axis of symmetry for the image reading units 28A and 28B is different from the axis of symmetry for the image reading units 28C and 28D. That is, the image reading section 28A and the image reading section 28C are asymmetrical, and the image reading section 28B and the image reading section 28D are also asymmetrical. The image reading section 28A and the image reading section 28C partially overlap each other in the sub-scanning direction. This also applies to the image reading section 28B and the image reading section 28D.
  • the image reading section 28A and the image reading section 28C are asymmetrical and partly overlap each other in the sub-scanning direction. This also applies to the image reading section 28B and the image reading section 28D.
  • an image reading section 28E is provided in which the main scanning direction is perpendicular to the sub-scanning direction.
  • the image reading section 28E is provided along the axis of symmetry between the image reading section 28A and the image reading section 28B.
  • the image reading section 28A and the image reading section 28B are arranged so that the main scanning direction intersects the sub-scanning direction at an angle of 60°.
  • the image reading section 28C and the image reading section 28D are arranged such that the main scanning direction intersects the sub-scanning direction at an angle of 60°.
  • the image reading section 28A, the image reading section 28C, and the image reading section 28E are arranged in an equilateral triangle with each other, and the image reading section 28B, the image reading section 28D, and the image reading section 28E are arranged in an equilateral triangle with each other. placed.
  • With these arrangements, the object T to be read can be divided and read by the plurality of image reading units 28 arranged side by side in a direction intersecting the sub-scanning direction.
  • Each image reading unit 28 has the same length, so the arrangement can be realized simply by preparing a plurality of identical image reading units 28 and arranging them appropriately.
  • the corrected read image may be synthesized as necessary.
  • In the second embodiment, the imaging elements 32 (32A, 32B) described later (see FIGS. 21 and 22) are not inclined with respect to the longitudinal direction of the image reading section 28, and the main scanning direction of the image reading section 28 coincides with its longitudinal direction.
  • FIG. 20 is a schematic diagram for explaining the arrangement of the image reading section 28 of the third embodiment.
  • FIG. 21 is a schematic diagram for explaining the configuration of the image reading sections 28 of the third embodiment, and is an enlarged view of the periphery of the image reading sections 28 in FIG. 20.
  • each of the image reading units 28 has an elongated shape along the longitudinal direction intersecting the sub-scanning direction.
  • each of the image reading units 28 is arranged so that its longitudinal direction is orthogonal to the sub-scanning direction.
  • In one of the two image reading units 28, the image reading unit 28A, a plurality of first imaging elements 32A inclined with respect to the longitudinal direction of the image reading unit 28A are arranged side by side in the longitudinal direction.
  • In the other image reading unit 28B, a plurality of second imaging elements 32B inclined with respect to the longitudinal direction of the image reading section 28B at an angle different from that of the first imaging elements 32A are arranged side by side in the longitudinal direction.
  • Each of the first imaging element 32A and the second imaging element 32B is a light receiving IC chip, and is configured by arranging a plurality of photoelectric conversion elements such as photodiodes in a straight line.
  • For example, the first imaging elements 32A are inclined at an angle of +45° with respect to the longitudinal direction of the image reading section 28A, and the second imaging elements 32B are inclined at an angle of -45° with respect to the longitudinal direction of the image reading section 28B.
  • the first imaging element 32A and the second imaging element 32B are arranged symmetrically with respect to the direction orthogonal to the sub-scanning direction. That is, in the example shown in FIG. 21, the main scanning directions of the image reading units 28 are perpendicular to each other.
  • Thereby, two read image data with different main scanning directions can be acquired by the two image reading units 28 in a single reading operation. In addition, the arrangement area of each image reading section 28 in the sub-scanning direction can be narrowed to save space.
  • FIG. 22 is a schematic diagram for explaining a modification of the configuration of the image reading section 28 of the third embodiment.
  • this modification only one image reading section 28 is used.
  • In this image reading unit 28, a plurality of first imaging elements 32A inclined with respect to the longitudinal direction of the image reading unit 28 are arranged side by side in the longitudinal direction, and a plurality of second imaging elements 32B inclined at a different angle are likewise arranged side by side in the longitudinal direction.
  • For example, the first imaging elements 32A are inclined at an angle of +45° with respect to the longitudinal direction of the image reading section 28, and the second imaging elements 32B are inclined at an angle of -45°.
  • The first imaging elements 32A and the second imaging elements 32B are arranged symmetrically with respect to the direction orthogonal to the sub-scanning direction. Therefore, in the example shown in FIG. 22, the image reading section 28 has two main scanning directions, and these main scanning directions are orthogonal to each other.
  • With this configuration as well, the arrangement area of the image reading section 28 in the sub-scanning direction can be narrowed to save space.
  • The fourth embodiment is the same as the first and second embodiments except that the configuration of the image reading section 28 is changed. That is, in the fourth embodiment, the longitudinal direction of the image reading section 28 coincides with the main scanning direction.
  • FIG. 23 is a cross-sectional view showing part of an example of the image reading section 28 of the fourth embodiment.
  • FIG. 23 is a cross-sectional view of the image reading section 28 as seen from the main scanning direction.
  • The image reading section 28 of the fourth embodiment includes a slit member 34.
  • The slit member 34 extends in the main scanning direction and is arranged between the object to be read T and the linear light source 44B.
  • An elongated slit is formed in the slit member 34 along the main scanning direction, and only the light that has passed through the slit irradiates the object to be read T.
  • The width of the slit perpendicular to the main scanning direction may be about the same as or less than the width of each photoelectric conversion element of the imaging element.
  • The linear light source 44B may be composed of LEDs, a halogen lamp, or the like, and may emit light in the near-infrared range of 750 nm to 2500 nm.
  • The slit member 34 is preferably made of a material, and formed in a shape, that withstand high-power near-infrared light and have excellent dimensional stability. The slit member 34 may also be movable toward and away from the object to be read T.
  • Because part of the light L2 directed at the object to be read T is blocked in the direction perpendicular to the main scanning direction, blurring of the contour on the image caused by light wrapping around the object in the sub-scanning direction can be suppressed. As a result, output image data with a sharper outline of the area corresponding to the object to be read T can be generated.
  • A general-purpose line laser light source may be used as the linear light source 44B.
  • The line laser light source includes a laser light source and a lens, and irradiates laser light in a line.
  • The width, perpendicular to the main scanning direction, of the linear laser light emitted from the line laser light source may be about the same as or less than the width of each photoelectric conversion element of the imaging element.
  • The line laser light source may emit near-infrared light of 750 nm to 2500 nm.
  • Line laser light sources include, for example, telecentric laser light sources.
  • When a line laser light source is used as the linear light source 44B, the radial spread of light in the direction perpendicular to the main scanning direction is suppressed, so the same effect as providing the slit member 34 can be obtained. That is, using a line laser light source as the linear light source 44B suppresses blurring of the contour on the image caused by light wrapping around the object to be read T in the sub-scanning direction, so output image data with a sharper outline of the area corresponding to the object to be read T can be generated.
  • Next, the concrete structure of the conveying device 36 will be described.
  • In the embodiments described above, the image reading device 10 also includes a conveying device, but a detailed description thereof has been omitted.
  • The conveying device 36 conveys the object to be read T along a conveying path extending in a direction intersecting the main scanning direction of the image reading section 28.
  • In other words, the conveying device 36 conveys the object to be read T along a conveying path extending in a direction intersecting the longitudinal direction of the image reading section 28.
  • FIG. 24 is a schematic diagram for explaining the configuration around the image reading unit 28 of the fifth embodiment.
  • In the fifth embodiment, a plurality of conveying devices 36 are provided and are arranged at intervals along the conveying path. The plurality of conveying devices 36 convey the object to be read T along a conveying path extending in a direction perpendicular to the main scanning direction.
  • A belt conveyor is used as each conveying device 36, but the conveying device is not limited to this.
  • The linear light source 44B of the image reading section 28 irradiates the object to be read T conveyed along the conveying path with light through the gap between adjacent conveying devices 36.
  • Alternatively, the light irradiated onto the object to be read T conveyed along the conveying path may be made to enter the gap between adjacent conveying devices 36.
  • When the linear light source 44B is arranged between the conveying devices 36 in this way, the light emitted from the linear light source 44B can be prevented from being absorbed or reflected before it reaches the object to be read T. It is therefore possible to prevent the outline of the area corresponding to the object to be read T from blurring in the read image data.
  • In FIG. 24, two conveying devices 36 are provided, but three or more conveying devices 36 may be provided, with the linear light source 44B arranged between each pair of adjacent conveying devices 36.
  • FIG. 25 is a schematic diagram for explaining a modification of the configuration around the image reading unit 28 of the fifth embodiment.
  • In this modification, the light emitted from the linear light source 44B of the image reading section 28 passes through the conveying device 36.
  • Specifically, the light emitted from the linear light source 44B passes through the belt of the conveying device 36, which is constituted by a belt conveyor. Because the belt is made of a transparent material, the conveying surface 36A forming the conveying path is light transmissive, and the light from the linear light source 44B passes through the conveying surface 36A and irradiates the object to be read T. In this case, as shown in FIG. 25, the image reading section 28 is arranged such that the conveying surface 36A is positioned between the linear light source 44B and the object to be read T.
  • Because the conveying surface 36A is light transmissive, the light emitted from the linear light source 44B can be prevented from being absorbed or reflected before it reaches the object to be read T, so the outline of the area corresponding to the object to be read T can be prevented from blurring in the read image data.
  • In FIG. 25, the linear light source 44B is arranged outside the conveying device 36 (below the belt), but it is not limited to this and may be arranged inside the conveying device 36 (inside the belt).
  • The image reading device 10 may function as an image processing device by omitting the image reading section 28.
  • In that case, an external image reading unit capable of functioning as the image reading section 28 of each embodiment is communicably connected to the image processing device.
  • The image reading device 10 described in each embodiment may be implemented as an inspection device for detecting foreign matter, scratches, defects, and the like.
  • ZNCC: Zero-mean Normalized Cross-Correlation
  • 10 image reading device; 20 control unit; 22 CPU; 24 RAM; 26 storage unit; 28 image reading section; 32 imaging element; 34 slit member; 36 conveying device; 44 light source; 90 image processing unit; 92 spectrum generation processing unit; 94 comparison processing unit; 96 synthetic spectrum generation processing unit; 98 image generation processing unit; 100 correction processing unit; 202a read image data; 202b spectral data; 202c synthetic spectral data; 202d output image data; 202e corrected image data

Abstract

A spectrum generation processing unit 92 subjects a plurality of items of image data individually to Fourier transform processing, thereby generating a plurality of items of spectrum data 202b. Then, a comparison processing unit 94 compares the individual components of the plurality of items of spectrum data 202b, corresponding to the same frequency, with each other. On the basis of the result of comparison by the comparison processing unit 94, a synthetic spectrum generation processing unit 96 selects one of the individual components of the plurality of items of spectrum data 202b, corresponding to the same frequency, thereby generating synthetic spectrum data 202c. An image generation processing unit 98 generates output image data 202d by subjecting the synthetic spectrum data 202c to inverse Fourier transform processing.

Description

Image processing device, image processing method, image processing program, and image reading device
The present invention relates to an image processing device, an image processing method, an image processing program, and an image reading device that process read image data obtained by reading an object in lines.
The systems conventionally adopted in surface inspection apparatuses that mainly use light sources in the visible range are typified by line sensors, contact image sensors, or combinations of a scanning optical system using a laser beam with a light receiving optical system that includes photoelectric conversion elements (photomultipliers, avalanche photodiodes, CCD sensors, CMOS sensors, etc.) and a light guide. Most of them are reflective types that receive reflected light or fluorescence from scratches, irregularities, defects, missing portions, adhering foreign matter, and the like on the inspection object (reading object). Objects to be inspected include paper products and foods as well as, for industrial use, resin moldings such as films and sheets.
In contrast, in the transmissive type, in which the light receiving system and the illumination system are arranged facing each other across the inspection object, the inspection object is usually transparent, thin, and highly transmissive. Systems that detect target objects such as foreign matter, scratches, and defects contained in a thick inspection object are rare.
X-ray inspection apparatuses, well known as non-destructive inspection apparatuses with excellent penetrating power, use X-rays, which are radiation; a radiation controlled area must therefore be established, and human exposure to radiation must be managed. In other words, the hurdles in deciding on an installation site are high. Moreover, because such apparatuses are large and heavy, adding one to an existing factory production line is not easy. In addition, because they are expensive, many inspection points cannot be provided.
Furthermore, in an X-ray inspection apparatus the good penetrating power of the X-rays themselves works against it: the X-rays also pass through foreign matter, defects, and scratches, which in many cases therefore cannot be distinguished.
Patent Document 1: WO 2015/186566
Patent Document 1 discloses an example of a conventional image reading device including a line light source and a line sensor. In a conventional transmissive image reading device, the light emitted from the line light source is scattered by foreign matter, defects, scratches, and the like contained in the inspection object.
In particular, when the inspection object is thick, and the image reading device reads the object in lines and generates an image in data format representing the object, the outline of the target area, that is, the area on the image corresponding to the object, tends to blur. The target area is, specifically, the area of pixels constituting the object on the image.
Focusing on this blurring of the outline of the target area on the image, the present inventors conducted intensive studies and came to the conclusion that the reason sharp output image data cannot be obtained is that the blurring of the outline of the target area caused by wraparound light is removed in the sub-scanning direction but not in the main scanning direction.
The present invention has been made in view of the above circumstances, and an object thereof is to provide an image processing device, an image processing method, an image processing program, and an image reading device capable of generating output image data with a sharp outline of the target area by performing image processing on a plurality of read image data obtained by reading an object in lines.
An image processing device according to the present invention performs image processing on a plurality of read image data obtained by reading the same object in lines along main scanning directions extending in mutually different directions. The image processing device includes a spectrum generation processing unit, a comparison processing unit, a synthetic spectrum generation processing unit, and an image generation processing unit. The spectrum generation processing unit generates a plurality of spectral data by performing Fourier transform processing on each of the plurality of read image data. The comparison processing unit compares the components of the plurality of spectral data corresponding to the same frequency with one another. The synthetic spectrum generation processing unit generates synthetic spectral data by selecting, based on the comparison result of the comparison processing unit, one of the components of the plurality of spectral data corresponding to the same frequency. The image generation processing unit generates output image data by performing inverse Fourier transform processing on the synthetic spectral data.
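To make this processing flow concrete, the following is a minimal sketch in Python/NumPy under stated assumptions: the read images are already aligned, same-shaped 2-D arrays; the amplitude (absolute value) is used for the per-frequency comparison, as the description recommends; and the full complex coefficient of the winning spectrum is kept so the inverse transform retains phase information. The function name and all details are illustrative, not the disclosed implementation.

```python
import numpy as np

def synthesize_sharp_image(read_images):
    """Fuse scans of one object taken along different main scanning
    directions into a single image with a sharper target outline."""
    # Spectrum generation: 2-D Fourier transform of each read image.
    spectra = np.stack([np.fft.fft2(img) for img in read_images])

    # Comparison: amplitude of each spectrum at every spatial frequency.
    amplitudes = np.abs(spectra)                 # shape (n, H, W)

    # Synthetic spectrum generation: at each frequency, keep the
    # component of whichever spectrum has the largest amplitude there.
    winner = np.argmax(amplitudes, axis=0)       # shape (H, W)
    synthetic = np.take_along_axis(spectra, winner[None], axis=0)[0]

    # Image generation: 2-D inverse Fourier transform.
    return np.real(np.fft.ifft2(synthetic))
```

The intuition is that blur along one scanning direction attenuates a direction-dependent set of spatial frequencies, so across differently oriented scans the least-attenuated component at each frequency is the best available estimate of the unblurred spectrum.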
With such a configuration, output image data with a sharp outline of the area corresponding to the object can be generated.
The synthetic spectrum generation processing unit may select the largest component from among the components of the plurality of spectral data corresponding to the same frequency.
With such a configuration, output image data with an even sharper outline of the area corresponding to the object can be generated. Moreover, with such a configuration, the larger the number of read image data, the sharper the outline of the area corresponding to the object is expected to become.
The image processing device may further include a correction processing unit that performs correction processing on at least one of the plurality of read image data before the Fourier transform processing so that the areas corresponding to the object on the images of the plurality of read image data are positioned substantially identically. In this case, the spectrum generation processing unit may perform the Fourier transform processing on each of the plurality of read image data that have been corrected by the correction processing unit.
With such a configuration, the areas corresponding to the object can be positioned substantially identically in the respective read image data.
The correction processing unit may correct, for at least one of the plurality of read image data before the Fourier transform processing, the tilt or the magnification of the area corresponding to the object on the image of that read image data.
With such a configuration, correcting the tilt or the magnification of the area corresponding to the object in the read image data allows the areas corresponding to the object to be positioned substantially identically in the respective read image data.
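As an illustration of such a correction, the sketch below rotates and rescales one read image so that the target area lines up with a reference scan before the Fourier transform. The use of SciPy's ndimage routines, and the assumption that the tilt angle and magnification are already known (how they are estimated, for example with a similarity measure such as ZNCC, is outside this sketch), are illustrative choices, not part of the disclosure.

```python
import numpy as np
from scipy import ndimage

def correct_read_image(img, tilt_deg=0.0, magnification=1.0):
    """Align one read image with a reference scan by undoing a known
    tilt and magnification difference of the target area."""
    # Undo the tilt; reshape=False keeps the array dimensions fixed.
    out = ndimage.rotate(img, -tilt_deg, reshape=False, order=1)
    # Undo the magnification difference (this changes the array shape,
    # so the result would then be cropped or padded to a common size
    # before the scans are stacked for the spectral comparison).
    out = ndimage.zoom(out, 1.0 / magnification, order=1)
    return out
```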
An image reading device according to the present invention includes a plurality of line sensors and the image processing device described above. Each of the plurality of line sensors can read an image in a line along its main scanning direction, and the main scanning directions extend in mutually different directions with respect to the sub-scanning direction. The image processing device performs image processing on a plurality of read image data obtained by reading the same object with the plurality of line sensors.
With such a configuration, a plurality of read image data can be generated simply by moving the object or the plurality of line sensors in the sub-scanning direction, that is, in a single operation. In other words, with such a configuration, a plurality of read image data can be obtained efficiently.
The plurality of line sensors may include two line sensors arranged so that their main scanning directions are orthogonal to each other.
With such a configuration, a plurality of read image data can be obtained efficiently using two line sensors.
The two line sensors may be arranged so that their respective main scanning directions intersect the sub-scanning direction at an angle of 45°.
With such a configuration, a plurality of read image data can be obtained even more efficiently using two line sensors.
The plurality of line sensors may be arranged side by side in a direction intersecting the sub-scanning direction.
With such a configuration, the object can be read in divided portions by the plurality of line sensors arranged side by side in the direction intersecting the sub-scanning direction. This makes it possible to narrow the arrangement area of the line sensors in the sub-scanning direction and save space.
An image reading device of the present invention includes at least one line sensor and the image processing device described above. The at least one line sensor has an elongated shape along a longitudinal direction intersecting the sub-scanning direction. The image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor. The at least one line sensor includes a plurality of first imaging elements that are inclined with respect to the longitudinal direction and arranged side by side in the longitudinal direction, and a plurality of second imaging elements that are inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements and arranged side by side in the longitudinal direction.
With such a configuration, the arrangement area of each line sensor in the sub-scanning direction can be narrowed to save space.
The at least one line sensor may include two line sensors, where one of the two line sensors has the plurality of first imaging elements, inclined with respect to the longitudinal direction, arranged side by side in the longitudinal direction, and the other has the plurality of second imaging elements, inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements, arranged side by side in the longitudinal direction.
With such a configuration, compared to a configuration in which a plurality of first imaging elements or a plurality of second imaging elements are arranged side by side along the longitudinal direction of each line sensor, the arrangement area of each line sensor in the sub-scanning direction can be narrowed to save space.
In one of the at least one line sensor, the plurality of first imaging elements inclined with respect to the longitudinal direction may be arranged side by side in the longitudinal direction, and the plurality of second imaging elements inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements may also be arranged side by side in the longitudinal direction.
With such a configuration, compared to a configuration that provides two line sensors in each of which a plurality of first imaging elements or a plurality of second imaging elements are arranged side by side along the longitudinal direction, the two line sensors are integrated into one, and the arrangement area of the line sensor in the sub-scanning direction can be narrowed to save space.
An image reading device of the present invention includes at least one line sensor and the image processing device described above. The at least one line sensor can read an image in a line along the main scanning direction. The image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor. The line sensor has a light source and a slit member. The light source extends in the main scanning direction and irradiates the object with light. The slit member extends in the main scanning direction and is arranged between the object and the light source.
With such a configuration, blurring of the contour on the image caused by light wrapping around the object in the sub-scanning direction can be suppressed, so output image data with a sharper outline of the area corresponding to the object can be generated.
An image reading device of the present invention includes at least one line sensor and the image processing device described above. The at least one line sensor can read an image in a line along the main scanning direction. The image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor. The line sensor has a line laser light source. The line laser light source irradiates the object with light.
With such a configuration, using a line laser light source in the line sensor suppresses blurring of the contour on the image caused by light wrapping around the object in the sub-scanning direction, so output image data with a sharper outline of the area corresponding to the object can be generated without increasing the number of components.
An image reading device of the present invention includes at least one line sensor, the image processing device described above, and a plurality of conveying devices. The at least one line sensor can read an image in a line along the main scanning direction. The image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor. The plurality of conveying devices convey the object along a conveying path extending in a direction intersecting the main scanning direction. The line sensor has a light source, and the light source irradiates the object with light. The plurality of conveying devices are arranged with gaps between them along the conveying path, and the light source irradiates the object conveyed along the conveying path with light through a gap, or causes the light irradiated onto the object conveyed along the conveying path to enter a gap.
With such a configuration, the light emitted from the light source can be prevented from being absorbed or reflected before it reaches the object, so the outline of the area corresponding to the object can be prevented from blurring in the read image data.
An image reading device of the present invention includes at least one line sensor, the image processing device described above, and a conveying device. The at least one line sensor can read an image in a line along the main scanning direction. The image processing device performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor. The conveying device conveys the object along a conveying path extending in a direction intersecting the main scanning direction. The line sensor has a light source, and the light source irradiates the object with light. The conveying device includes a conveying surface that forms the conveying path, and because the conveying surface is light transmissive, the light from the light source passes through the conveying surface and irradiates the object.
With such a configuration, the light emitted from the light source can be prevented from being absorbed or reflected before it reaches the object, so the outline of the area corresponding to the object can be prevented from blurring in the read image data.
An image processing method according to the present invention performs image processing on a plurality of read image data obtained by reading the same object in lines along main scanning directions extending in mutually different directions. The image processing method includes a spectrum generation step, a comparison step, a synthetic spectrum generation step, and an image generation step. The spectrum generation step generates a plurality of spectral data by performing Fourier transform processing on each of the plurality of read image data. The comparison step compares the components of the plurality of spectral data corresponding to the same frequency with one another. The synthetic spectrum generation step generates synthetic spectral data by selecting, based on the comparison result of the comparison step, one of the components of the plurality of spectral data corresponding to the same frequency. The image generation step generates output image data by performing inverse Fourier transform processing on the synthetic spectral data.
With such a configuration, an image processing method capable of generating output image data with a sharp outline of the area corresponding to the object can be provided.
An image processing program according to the present invention performs image processing on a plurality of read image data obtained by reading the same object in lines along main scanning directions extending in mutually different directions. The image processing program causes a computer to execute a spectrum generation step, a comparison step, a synthetic spectrum generation step, and an image generation step. The spectrum generation step generates a plurality of spectral data by performing Fourier transform processing on each of the plurality of read image data. The comparison step compares the components of the plurality of spectral data corresponding to the same frequency with one another. The synthetic spectrum generation step generates synthetic spectral data by selecting, based on the comparison result of the comparison step, one of the components of the plurality of spectral data corresponding to the same frequency. The image generation step generates output image data by performing inverse Fourier transform processing on the synthetic spectral data.
With such a configuration, an image processing program capable of generating output image data with a sharp outline of the area corresponding to the object can be provided.
According to the present invention, by performing image processing on a plurality of read image data obtained by reading an object in lines, output image data with a sharp outline of the area corresponding to the object can be generated.
FIG. 1 is a block diagram showing an example of the electrical configuration of the image reading device of the first embodiment.
FIG. 2 is a cross-sectional view showing an example of the image reading section of the first embodiment.
FIG. 3 is an exploded perspective view schematically showing an example of the appearance of the linear light source in the image reading section of the first embodiment.
FIG. 4 is a diagram showing an example of a read image of the first embodiment.
FIG. 5 is a diagram showing another example of a read image of the first embodiment.
FIG. 6 is a diagram showing an example of an output image of the first embodiment.
FIG. 7 is a diagram showing an example of a memory map of the RAM of the first embodiment.
FIG. 8 is a block diagram showing an example of the functional configuration of the image reading device of the first embodiment.
FIG. 9 is a flowchart showing an example of image processing by the CPU of the image reading device of the first embodiment.
FIG. 10 is a schematic diagram showing an example of the periphery of the plurality of image reading sections of the second embodiment.
FIG. 11 is a diagram for explaining an example of the correction processing of the second embodiment.
FIG. 12 is a diagram showing an example of a memory map of the RAM of the second embodiment.
FIG. 13 is a block diagram showing an example of the functional configuration of the image reading device of the second embodiment.
FIG. 14 is a flowchart showing an example of image processing by the CPU of the image reading device of the second embodiment.
FIGS. 15 to 19 are schematic diagrams for explaining arrangements of the image reading sections of the second embodiment.
FIG. 20 is a schematic diagram for explaining the arrangement of the image reading sections of the third embodiment.
FIG. 21 is a schematic diagram for explaining the configuration of the image reading section of the third embodiment.
FIG. 22 is a schematic diagram for explaining a modification of the configuration of the image reading section of the third embodiment.
FIG. 23 is a cross-sectional view showing part of an example of the image reading section of the fourth embodiment.
FIG. 24 is a schematic diagram for explaining the configuration around the image reading section of the fifth embodiment.
FIG. 25 is a schematic diagram for explaining a modification of the configuration around the image reading section of the fifth embodiment.
1. First Embodiment
FIG. 1 is a block diagram showing an example of the electrical configuration of the image reading device 10 of the first embodiment. As shown in FIG. 1, the image reading device 10 includes a control unit 20 and an image reading section 28, each of which is electrically connected via a bus 30.
The control unit 20 includes a CPU (Central Processing Unit) 22, a RAM (Random Access Memory) 24, and a storage unit 26.
The CPU 22 is in charge of the overall control of the image reading device 10. The RAM 24 is used as a work area and a buffer area for the CPU 22.
The storage unit 26 is the main storage device of the image reading device 10, and a non-volatile memory such as an HDD (Hard Disk Drive) or an EEPROM (Electrically Erasable Programmable Read Only Memory) is used for it. The storage unit 26 may also be configured to include the RAM 24.
The storage unit 26 stores data of control programs with which the CPU 22 controls the operation of each component of the image reading device 10, data of various images, execution data necessary for executing the control programs, and the like.
The image reading section 28 includes at least an image sensor and reads an object in a line along the main scanning direction while scanning in the sub-scanning direction. The image sensor is a sensor for reading an image by photoelectrically converting light into an electrical output.
The type and number of image sensors are not particularly limited as long as they can read the object in a line along the main scanning direction while scanning in the sub-scanning direction.
As the image reading section 28, for example, a general-purpose line sensor, a general-purpose camera, or a CIS (Contact Image Sensor) can be used. In the first embodiment, a case where a CIS is used as the image reading section 28 will be described as an example.
When a general-purpose camera such as an area camera, that is, a camera with an area sensor, is used as the image reading section 28, only one row of the area sensor is used. A narrower camera viewing angle is preferable. The emission width and emission angle of the light source are also preferably narrow, and the emission angle of the light source is preferably close to zero.
The electrical configuration of the image reading device 10 of the first embodiment is only an example. For instance, the image reading device 10 may be provided with a plurality of image reading sections 28.
FIG. 2 is a cross-sectional view showing an example of the image reading section 28 of the first embodiment. FIG. 3 is an exploded perspective view schematically showing an example of the appearance of the linear light sources 44 (44A, 44B) in the image reading section 28 of the first embodiment. As shown in FIG. 2, the image reading section 28 includes two housings 42 (42A, 42B) arranged facing each other across a focal plane 40, and various components provided in the housings 42.
A linear light source 44 (44A, 44B) for illuminating an object on the focal plane 40 is provided in each housing 42. The linear light source 44 is a component that emits light toward the object to be read (inspection object) T on the focal plane 40. In FIG. 2, the light emitted from the linear light source 44A toward the focal plane 40 is denoted L1, and the light emitted from the linear light source 44B toward the focal plane 40 is denoted L2.
As shown in FIG. 3, the linear light source 44 includes a transparent light guide 46 extending along the longitudinal direction D, a first light source section 48 provided on one end face in the longitudinal direction D, a second light source section 50 provided on the other end face in the longitudinal direction D, and a cover member 51 for holding the outer side surfaces of the light guide 46. The linear light source 44 is not limited to a guide type in which light is guided by the light guide 46. In FIG. 3, the first light source section 48 and the second light source section 50 are shown separated from the light guide 46.
The linear light source 44 also has a light diffusion pattern P. The light diffusion pattern P is formed on the bottom surface 52 of the light guide 46 so as to extend along the longitudinal direction D. When light enters the light guide 46 from the first light source section 48 and the second light source section 50, the light diffusion pattern P diffuses and refracts the light traveling through the light guide 46 and causes it to exit from the light exit side surface 54 of the light guide 46.
The light exit side surface 54 is formed in a smooth, outwardly convex curved shape so as to have the light-condensing effect of a lens.
As shown in FIG. 2, the housing 42A is provided with a substrate 56 for fixing the linear light source 44A, and the housing 42B is provided with a substrate 58B for fixing the linear light source 44B.
As shown in FIG. 3, the first light source section 48 and the second light source section 50 are provided with terminals 60. The linear light source 44A is fixed to the substrate 56 by inserting the terminals 60 into the substrate 56 and soldering them, and the linear light source 44B is likewise fixed to the substrate 58B by inserting the terminals 60 into the substrate 58B and soldering them.
As shown in FIG. 2, a lens array 62 is provided inside the housing 42A. The lens array 62 is an optical element that forms an image of the light reflected by or transmitted through the object to be read T on a light receiving section 64 described later; a rod lens array such as a SELFOC lens array (registered trademark of Nippon Sheet Glass) can be used.
Furthermore, an ultraviolet light blocking filter (UV cut filter) 67 that blocks ultraviolet light from entering the light receiving section 64 is provided at an arbitrary position between the focal plane 40 and the light receiving section 64. A color filter 68 that passes visible light in a specific wavelength range is provided between the light receiving section 64 and the ultraviolet light blocking filter 67.
The light receiving section 64 includes an image sensor and is mounted on the substrate 58A inside the housing 42A. Each housing 42 is provided with a protective glass 66 (66A, 66B) to protect the linear light source 44 from scattered dust and damage during use.
In the image reading section 28 configured in this way, the light L1 emitted from the linear light source 44A becomes linear illumination light that illuminates the object to be read T, and the light reflected from the object to be read T is guided to the light receiving section 64. The light L2 emitted from the linear light source 44B becomes linear illumination light that passes through the object to be read T, and the transmitted light is guided to the light receiving section 64. The object to be read T can be exemplified by food, but is not limited to this. For example, the object to be read T may be food of any kind (solid, liquid, or otherwise, including items characterized by density (bread, oil, etc.), color (ketchup, curry, etc.), or shape (noodles, etc.)), a resin molding such as a film or sheet, a textile product such as cloth, clothing, or non-woven fabric, or a paper product such as a sheet, a roll, or a structure (an envelope or a box). The object to be read T itself may be the target object, or the target object may be a portion that is contained in, mixed with, or covered by the object to be read T and is read at the same time. Examples of such portions include metal pieces, resin pieces (plastic, rubber, etc.), mineral pieces (stone, sand, etc.), animal or plant pieces (branches and leaves, bones, insects, etc.), composites thereof, and structural defects inside the object to be read T (scratches, irregularities, air bubbles, adhering foreign matter, etc.).
Accordingly, the image reading section 28 can, for example, read the object to be read T in a line along the main scanning direction (the Y direction in FIG. 2) while scanning in the sub-scanning direction (the X direction in FIG. 2). However, the linear light source 44A may be omitted so that only the light transmitted through the object to be read T is received by the light receiving section 64.
Although not shown, the substrates 58 (58A, 58B) are provided with a configuration for electrical connection to the control unit 20, such as a connector.
In the first embodiment, the image reading section 28 is used to read the same object in lines along main scanning directions extending in mutually different directions, thereby generating a plurality of read image data. For example, if the object is conveyed on the focal plane 40 of the fixed image reading section 28 in a plurality of different directions intersecting the Y direction, and the object is read along the main scanning direction during each conveyance, the same object can be read along main scanning directions extending in different directions. Alternatively, the image reading section 28 may be moved relative to an object at rest on the focal plane 40.
In this specification, image data refers to an image in data format. Therefore, read image data may be referred to as a read image, and a read image may be referred to as read image data. The same applies to the spectra described later.
In the first embodiment, when a plurality of read image data have been generated, Fourier transform processing is applied to each of the plurality of read image data. When Fourier transform processing has been applied to each of the plurality of read image data, a plurality of spectral data are generated.
Specifically, in the Fourier transform processing, a two-dimensional Fourier transform is applied to the read image. The spectral data is a frequency spectrum in data format. A frequency spectrum is a spectrum having a component corresponding to each frequency; specifically, it is a spatial frequency spectrum.
In the first embodiment, the frequency spectrum corresponds to one of an amplitude spectrum, a phase spectrum, and a power spectrum. For example, if the frequency spectrum corresponds to the amplitude spectrum, the component corresponding to each frequency indicates the magnitude of the amplitude. The amplitude spectrum is preferably used as the frequency spectrum.
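For reference, the three spectrum types named here differ only in how each complex Fourier coefficient is reduced to a scalar; a short illustrative snippet (the random array is just a stand-in for one read image):

```python
import numpy as np

img = np.random.rand(64, 64)    # stand-in for one read image
G = np.fft.fft2(img)            # complex spatial-frequency spectrum

amplitude = np.abs(G)           # amplitude spectrum: |G|
phase = np.angle(G)             # phase spectrum: arg(G)
power = np.abs(G) ** 2          # power spectrum: |G|^2
```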
In the first embodiment, when the plurality of spectral data have been generated, the spectral data are compared with one another. Specifically, the components of the plurality of spectral data corresponding to the same frequency are compared.
When the components corresponding to the same frequency have been compared, one of the components of the plurality of spectral data corresponding to that frequency is selected as a synthesis component based on the comparison result. In the first embodiment, the largest component among the components of the plurality of spectral data corresponding to the same frequency is selected as the synthesis component.
When a synthesis component has been selected as the component corresponding to each frequency, synthetic spectral data is generated from those synthesis components. The synthetic spectral data is a synthetic spectrum in data format, that is, a spectrum whose component at each frequency is the selected synthesis component.
In the first embodiment, when the synthetic spectral data has been generated, inverse Fourier transform processing is applied to the synthetic spectral data, and output image data is generated. In the inverse Fourier transform processing, a two-dimensional inverse Fourier transform is applied to the synthetic spectral data.
FIG. 4 is a diagram showing an example of a read image of the first embodiment, and FIG. 5 is a diagram showing another example of a read image of the first embodiment. The read images shown in FIGS. 4 and 5 are generated by reading the same object to be read T in lines. When the read images shown in FIGS. 4 and 5 are generated, the main scanning direction and the sub-scanning direction of the image reading section 28 are orthogonal. Furthermore, the sub-scanning direction of the image reading section 28 when the read image in FIG. 4 is generated is orthogonal to the sub-scanning direction when the read image in FIG. 5 is generated. Specifically, in FIG. 4 the vertical direction is the main scanning direction and the horizontal direction is the sub-scanning direction, whereas in FIG. 5 the horizontal direction is the main scanning direction and the vertical direction is the sub-scanning direction.
Because the environment in which the object is scanned, specifically the main scanning direction, differs when the read images shown in FIGS. 4 and 5 are generated, the blurred portions of the outline of the target area, that is, the area on the image corresponding to the object, differ between these read images. In other words, the sharp portions of the outline of the target area differ.
The target area is, specifically, the area of pixels constituting the object on the image; in other words, the target area is the object as it appears on the image.
FIG. 6 is a diagram showing an example of an output image of the first embodiment. The output image in FIG. 6 corresponds to the read images shown in FIGS. 4 and 5. In the output image, the target area is displayed as though the sharp portions of the target area in each read image were combined; that is, the outline of the target area is displayed sharply.
Although FIG. 6 shows an output image corresponding to two read images, the larger the number of read images, the more sharply the entire outline of the target area is displayed in the output image.
In the first embodiment, for example, when $g_\theta$ denotes read image data in which the outline of the target area is unclear, $g_\theta$ is expressed by equation (1) below, where $f$ is read image data in which the outline of the target area is sharp and $h_\theta$ is a PSF (Point Spread Function), that is, the factor responsible for the blurring of the outline of the target area. The PSF depends on the angle of the main scanning direction.

$$g_\theta = h_\theta * f \tag{1}$$
Applying Fourier transform processing to both sides of equation (1), as shown in equation (2), yields equation (3). In equation (3), $G_\theta$, $F$, and $H_\theta$ are the Fourier transforms of $g_\theta$, $f$, and $h_\theta$, respectively; that is, $G_\theta$ is spectral data.

$$\mathcal{F}[g_\theta] = \mathcal{F}[h_\theta * f] \tag{2}$$

$$G_\theta = H_\theta \, F \tag{3}$$
As described above, when the synthesized spectral data is generated, the largest component is selected from among the components of the plurality of spectral data corresponding to the same frequency. The synthesized spectral data is therefore generated according to equation (4), in which the left side is the synthesized spectral data and the right side represents the components for synthesis.

$$\hat{G}(u, v) = \max_{\theta} G_\theta(u, v) \tag{4}$$
Since inverse Fourier transform processing is applied to the synthesized spectral data, the inverse Fourier transform is applied to equation (4) as shown in equation (5), in which the left side is the output image data.

$$\hat{f} = \mathcal{F}^{-1}[\hat{G}] \tag{5}$$
FIG. 7 is a diagram showing an example of a memory map 200 of the RAM 24 of the first embodiment. As shown in FIG. 7, the RAM 24 includes a program area 201 and a data area 202, and the program area 201 stores a control program read in advance from the storage section 26.
The control program includes an image reading program 201a, a spectrum generation program 201b, a comparison program 201c, a synthesized spectrum generation program 201d, an image generation program 201e, and the like.
The image reading program 201a is a program for controlling the image reading section 28 to generate read image data 202a.
The spectrum generation program 201b is a program for generating spectral data 202b by applying Fourier transform processing to the read image data 202a generated by the image reading program 201a.
The comparison program 201c is a program for comparing the components corresponding to the same frequency among the plurality of spectral data 202b generated by the spectrum generation program 201b.
The synthesized spectrum generation program 201d is a program for selecting the largest component from among the components corresponding to the same frequency of the plurality of spectral data 202b, based on the comparison result of the comparison program 201c, and generating synthesized spectral data 202c.
The image generation program 201e is a program for generating output image data 202d by applying inverse Fourier transform processing to the synthesized spectral data 202c.
Although not shown, the program area 201 also stores control programs other than the image reading program 201a, the spectrum generation program 201b, and the like.
The data area 202 stores data read in advance from the storage section 26. In the example shown in FIG. 7, the data area 202 stores read image data 202a, spectral data 202b, synthesized spectral data 202c, output image data 202d, and the like.
The read image data 202a is data corresponding to a read image. The data area 202 may store a plurality of pieces of read image data 202a.
The spectral data 202b is data corresponding to a frequency spectrum. The data area 202 may store a plurality of pieces of spectral data 202b.
The synthesized spectral data 202c is data corresponding to a synthesized spectrum. The output image data 202d is data corresponding to an output image.
The data area 202 also stores, for example, execution data, and is provided with timers (counters) and registers necessary for executing the control program.
FIG. 8 is a block diagram showing an example of the functional configuration of the image reading device 10 of the first embodiment. In the control section 20 including the CPU 22, the CPU 22 executes the image reading program 201a, whereby the control section 20 functions as an image reading processing section 90 that controls the image reading section 28 to generate the read image data 202a.
Further, the CPU 22 executes the spectrum generation program 201b, whereby the control section 20 functions as a spectrum generation processing section 92 that applies Fourier transform processing to the read image data 202a to generate the spectral data 202b.
Furthermore, the CPU 22 executes the comparison program 201c, whereby the control section 20 functions as a comparison processing section 94 that compares the components corresponding to the same frequency of the plurality of spectral data 202b with each other.
Furthermore, the CPU 22 executes the synthesized spectrum generation program 201d, whereby the control section 20 functions as a synthesized spectrum generation processing section 96 that, based on the comparison result of the comparison processing section 94, selects the largest component from among the components corresponding to the same frequency of the plurality of spectral data 202b and generates the synthesized spectral data.
Also, the CPU 22 executes the image generation program 201e, whereby the control section 20 functions as an image generation processing section 98 that applies inverse Fourier transform processing to the synthesized spectral data 202c to generate the output image data 202d.
In the example shown in FIG. 8, for example, when the image reading processing section 90 generates a plurality of pieces of read image data 202a, the spectrum generation processing section 92 generates a plurality of pieces of spectral data 202b. When the plurality of pieces of spectral data 202b have been generated, the comparison processing section 94 compares the components corresponding to the same frequency of the plurality of spectral data 202b with each other, and the synthesized spectrum generation processing section 96, based on the comparison result of the comparison processing section 94, selects the largest component from among the components corresponding to the same frequency of the plurality of spectral data 202b and generates the synthesized spectral data 202c. When the synthesized spectral data 202c has been generated, the image generation processing section 98 generates the output image data 202d.
FIG. 9 is a flowchart showing an example of image processing performed by the CPU 22 of the image reading device 10 of the first embodiment. The CPU 22 starts the image processing when, for example, a plurality of pieces of read image data 202a have been generated.
In step S1, Fourier transform processing is applied to each piece of read image data 202a to generate a plurality of pieces of spectral data 202b.
In step S2, the components corresponding to the same frequency of the plurality of spectral data 202b are compared with each other.
In step S3, based on the comparison results of the components corresponding to the same frequency, the largest component is selected from among the components corresponding to the same frequency of the plurality of spectral data 202b to generate the synthesized spectral data 202c.
In step S4, inverse Fourier transform processing is applied to the synthesized spectral data 202c to generate the output image data 202d. When the output image data 202d has been generated in step S4, the image processing ends.
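For reference, steps S1 to S4 can be summarized in a few lines of code. The sketch below (Python with NumPy) is illustrative only, not the claimed implementation: it assumes the read images are grayscale arrays of identical shape with the target areas already aligned, and it interprets the "largest component" of step S3 as the complex coefficient with the largest magnitude, which the description leaves open.

```python
import numpy as np

def generate_output_image(read_images):
    # S1: apply a Fourier transform to each piece of read image data
    spectra = np.stack([np.fft.fft2(img) for img in read_images])
    # S2: compare the components corresponding to the same frequency
    # (compared here by magnitude, which is an assumption)
    largest = np.argmax(np.abs(spectra), axis=0)
    # S3: select the largest component per frequency (synthesized spectral data)
    synthesized = np.take_along_axis(spectra, largest[None, ...], axis=0)[0]
    # S4: apply an inverse Fourier transform (output image data)
    return np.real(np.fft.ifft2(synthesized))
```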
According to the first embodiment, output image data in which the outline of the target area is sharp can be generated.
Also, according to the first embodiment, the larger the number of pieces of read image data, the sharper the outline of the target area in the generated output image data is expected to be.
Note that the configuration is not limited to one in which the largest component among the components corresponding to the same frequency of the plurality of spectral data is selected as the component for synthesis; the component for synthesis may be selected as appropriate.
For example, among the plurality of spectral data, the largest component may be selected as the component for synthesis for components corresponding to some frequencies, while the component for synthesis is selected as appropriate for components corresponding to the other frequencies.
Furthermore, when the component for synthesis is selected from among the components corresponding to the same frequency of the plurality of spectral data, any component between the smallest component and the largest component may be used as the component for synthesis.
2. Second Embodiment
In the second embodiment, a plurality of image reading sections 28 are used to read the object. Also, in the second embodiment, before Fourier transform processing is applied to the plurality of pieces of read image data, processing for positioning the target areas substantially identically in the respective pieces of read image data is performed. Since the series of processes from the Fourier transform processing onward is the same as in the first embodiment, duplicate description is omitted.
In the second embodiment, the plurality of image reading sections 28 are arranged such that their main scanning directions intersect each other. Each image reading section 28 is arranged such that its main scanning direction intersects the sub-scanning direction at a predetermined angle. However, the sub-scanning direction of each image reading section 28 in the second embodiment (the conveying direction of the object, or the moving direction of each image reading section 28) is the same direction and is a direction that intersects all of the image reading sections 28.
FIG. 10 is a schematic diagram showing an example of the periphery of the plurality of image reading sections 28 of the second embodiment. FIG. 10 is also a schematic diagram showing an example of the periphery of the plurality of image reading sections 28 when viewed from a direction orthogonal to the focal plane 40.
In the example shown in FIG. 10, the main scanning directions of the image reading section 28A and the image reading section 28B are orthogonal to each other, and the image reading section 28A and the image reading section 28B are arranged such that their main scanning directions intersect the sub-scanning direction at an angle of 45°. That is, in the example shown in FIG. 10, the main scanning direction and the sub-scanning direction of each image reading section 28 are not orthogonal.
The sub-scanning direction of each image reading section 28 is the same direction, and the sub-scanning direction intersects the image reading section 28A and the image reading section 28B. However, the angle formed by the sub-scanning direction and the main scanning direction of the image reading section 28 is not limited to 45°, and the angle formed between the main scanning directions of the image reading sections 28 is not limited to 90°.
In the second embodiment, it can also be said that the plurality of image reading sections 28 are arranged such that their main scanning directions extend in respectively different directions with respect to the sub-scanning direction and intersect each other at one point.
In the second embodiment, since the read image data is generated by an image reading section 28 whose main scanning direction and sub-scanning direction are not orthogonal, read image data in which the target area is tilted is generated.
In the second embodiment, correction processing is applied to the plurality of pieces of read image data before the Fourier transform processing so that the target areas are positioned substantially identically on the images of the respective pieces of read image data.
The correction processing includes tilt correction processing for correcting the tilt of the target area and magnification correction processing for correcting the magnification of the target area, and is performed, for example, in the order of tilt correction processing and then magnification correction processing. The correction processing will be described with reference to FIG. 11. FIG. 11 is a diagram for explaining an example of the correction processing of the second embodiment, and shows a read image corresponding to the image reading section 28A and a read image corresponding to the image reading section 28B. In the correction processing, the read image is first scanned and the coordinates of the pixels are acquired.
The target area is tilted at an angle corresponding to the relative speed between the image reading section 28 and the object when the object is read. The higher the relative speed, the greater the tilt of the target area.
In the tilt correction processing, the tilt of the target area is calculated based on the coordinates of the pixels, and the pixels are shifted vertically according to that tilt. In this way, the tilt of the target area is corrected as the pixels are shifted. A well-known method is used to calculate the tilt of the target area from the coordinates of the pixels.
In the magnification correction processing, the target area is reduced in the horizontal direction. The correction magnification, which is the magnification at which the target area is reduced, is not particularly limited as long as the target areas are positioned substantially identically on the respective read images. For example, the correction magnification may be set based on the tilt of the target area before the tilt correction. Since pixel interpolation and the like accompanying the adjustment of the magnification of the target area are well known, detailed description thereof is omitted. In the magnification correction processing, however, the target area may be enlarged instead of reduced.
Since the tilt correction processing alone can position the target areas substantially identically in the respective pieces of read image data, the magnification correction processing need not be performed. That is, in the correction processing, at least the tilt correction processing of the tilt correction processing and the magnification correction processing may be performed.
Furthermore, in the correction processing, at least the tilt of the tilt and the magnification of the target area may be corrected using a well-known method. For example, the correction processing may use a well-known method of correcting the tilt and magnification of an object such as a character, figure, or illustration on an image based on the positions (coordinates) of the pixels constituting that object.
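A minimal sketch of the tilt correction and magnification correction described above is given below (Python with NumPy). The tilt estimate slope (vertical pixel shift per column) and the correction magnification scale are assumed to have been obtained already by one of the well-known methods mentioned; nearest-neighbor resampling stands in for the omitted interpolation details.

```python
import numpy as np

def correct_image(img, slope, scale):
    h, w = img.shape
    # Tilt correction: shift each pixel column vertically according to the tilt
    corrected = np.empty_like(img)
    for x in range(w):
        corrected[:, x] = np.roll(img[:, x], -int(round(slope * x)))
    # Magnification correction: rescale horizontally by the correction magnification
    new_w = max(1, int(round(w * scale)))
    src = np.clip(np.round(np.arange(new_w) / scale).astype(int), 0, w - 1)
    return corrected[:, src]
```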
Furthermore, although the second embodiment has been described taking as an example the case where the image reading section 28A and the image reading section 28B are arranged with their main scanning directions orthogonal to each other, the number of image reading sections 28 may be three or more as long as their main scanning directions extend in respectively different directions with respect to the sub-scanning direction and intersect each other at one point. Also, depending on the angle and number of the image reading sections 28 arranged, the correction processing may need to be applied to as few as one piece of read image data.
FIG. 12 is a diagram showing an example of the memory map 200 of the RAM 24 of the second embodiment. In the second embodiment, the control program includes a correction program 201f. The correction program 201f includes a tilt correction program 201g and a magnification correction program 201h.
The correction program 201f is a program for applying correction processing to the read image data 202a generated by the image reading program 201a to generate corrected image data 202e.
The tilt correction program 201g is a program for applying tilt correction processing to the read image data 202a generated by the image reading program 201a. The magnification correction program 201h is a program for applying magnification correction processing to the read image data 202a generated by the image reading program 201a.
Also, in the second embodiment, the corrected image data 202e is stored in the data area 202. The corrected image data 202e is data corresponding to a corrected image, that is, a read image that has undergone the correction processing. The corrected image data 202e may include a plurality of pieces of data corresponding to corrected images.
Furthermore, in the second embodiment, the spectrum generation program 201b is a program for generating the spectral data 202b by applying Fourier transform processing to the read image data 202a generated by the image reading program 201a, specifically to the corrected image data 202e.
FIG. 13 is a block diagram showing an example of the functional configuration of the image reading device 10 of the second embodiment. In the second embodiment, the CPU 22 executes the correction program 201f, whereby the control section 20 functions as a correction processing section 100 that applies correction processing to the read image data 202a to generate the corrected image data 202e. Also, in the second embodiment, the CPU 22 executes the spectrum generation program 201b, whereby the control section 20 functions as the spectrum generation processing section 92 that applies Fourier transform processing to the corrected image data 202e to generate the spectral data 202b.
FIG. 14 is a flowchart showing an example of image processing performed by the CPU 22 of the image reading device 10 of the second embodiment. In the second embodiment, when a plurality of pieces of read image data 202a have been generated, a plurality of pieces of corrected image data 202e are generated in step S1. Since steps S2 to S5 correspond to steps S1 to S4 in the flowchart shown in FIG. 9, duplicate description is omitted.
In the second embodiment, as described above, the number of image reading sections 28 may be three or more, and depending on the angle and number of the image reading sections 28 arranged, the correction processing may need to be applied to as few as one piece of read image data. Therefore, according to the second embodiment, even if the target area is tilted in at least some of the plurality of pieces of read image data, the target areas can be positioned substantially identically in the respective pieces of read image data. In other words, the image reading sections 28 can be arranged such that the main scanning direction and the sub-scanning direction are not orthogonal.
Also, according to the second embodiment, a plurality of pieces of read image data can be generated simply by moving the object or the plurality of image reading sections 28 in the sub-scanning direction, that is, with only a single operation. Therefore, according to the second embodiment, a plurality of pieces of read image data can be obtained efficiently. In this case, however, the main scanning direction and the sub-scanning direction of each image reading section 28 are not orthogonal.
Note that in the second embodiment, as shown in FIGS. 15 to 19, the plurality of image reading sections 28 may be arranged side by side in a direction intersecting the sub-scanning direction. Specifically, groups of image reading sections 28, each group consisting of sections whose main scanning directions extend in respectively different directions with respect to the sub-scanning direction and which are arranged side by side in the sub-scanning direction, may be arranged side by side in a direction orthogonal to the sub-scanning direction.
FIGS. 15 to 19 are schematic diagrams for explaining arrangements of the image reading sections 28 of the second embodiment. In the example shown in FIG. 15, the image reading section 28A and the image reading section 28B are arranged such that their main scanning directions are orthogonal to each other and intersect the sub-scanning direction at an angle of 45°. The image reading section 28A and the image reading section 28B are arranged symmetrically about an axis (axis of symmetry) in the direction orthogonal to the sub-scanning direction. The image reading section 28C and the image reading section 28D are arranged in the same manner as the image reading section 28A and the image reading section 28B.
The axis of symmetry of the image reading sections 28A and 28B is the same as the axis of symmetry of the image reading sections 28C and 28D. The group including the image reading sections 28C and 28D is arranged side by side with the group including the image reading sections 28A and 28B in the direction orthogonal to the sub-scanning direction.
In the example shown in FIG. 15, the image reading section 28A and the image reading section 28C are arranged such that their main scanning directions are parallel to each other and such that they partially overlap each other in the sub-scanning direction. The same applies to the image reading section 28B and the image reading section 28D.
In the example shown in FIG. 16, unlike the example shown in FIG. 15, the image reading section 28A and the image reading section 28C are arranged such that their main scanning directions intersect each other and such that they are symmetrical about the sub-scanning direction as an axis. That is, the image reading section 28A and the image reading section 28C do not partially overlap each other in the sub-scanning direction. The same applies to the image reading section 28B and the image reading section 28D.
In the example shown in FIG. 17, unlike the example shown in FIG. 16, the axis of symmetry of the image reading sections 28A and 28B differs from the axis of symmetry of the image reading sections 28C and 28D. That is, the image reading section 28A and the image reading section 28C are asymmetrical, and the image reading section 28B and the image reading section 28D are also asymmetrical. The image reading section 28A and the image reading section 28C partially overlap each other in the sub-scanning direction. The same applies to the image reading section 28B and the image reading section 28D.
In the example shown in FIG. 18, unlike the example shown in FIG. 16, the image reading section 28A and the image reading section 28C are asymmetrical and partially overlap each other in the sub-scanning direction. The same applies to the image reading section 28B and the image reading section 28D.
In the example shown in FIG. 19, unlike the example shown in FIG. 16, an image reading section 28E whose main scanning direction is orthogonal to the sub-scanning direction is provided. The image reading section 28E is provided along the axis of symmetry of the image reading sections 28A and 28B. In the example shown in FIG. 19, the image reading sections 28A and 28B are arranged such that their main scanning directions intersect the sub-scanning direction at an angle of 60°. Similarly, the image reading sections 28C and 28D are arranged such that their main scanning directions intersect the sub-scanning direction at an angle of 60°. As a result, the image reading sections 28A, 28C, and 28E are arranged in an equilateral triangle with respect to one another, and the image reading sections 28B, 28D, and 28E are likewise arranged in an equilateral triangle with respect to one another.
As in the examples of FIGS. 15 to 19, when groups of image reading sections 28 are arranged side by side in a direction intersecting the sub-scanning direction, the reading object T can be read in divided portions by the plurality of image reading sections 28 arranged side by side in the direction intersecting the sub-scanning direction, even if the length of each image reading section 28 in the main scanning direction is short.
This makes it possible to narrow the arrangement area of the image reading sections 28 in the sub-scanning direction and save space. In particular, in the examples of FIGS. 15 to 19, the image reading sections 28 all have the same length, and it is only necessary to prepare a plurality of identical image reading sections 28 and arrange them appropriately, so that space saving can be achieved at low cost. When groups of image reading sections 28 are arranged side by side in a direction intersecting the sub-scanning direction in this way, the corrected read images may be combined as necessary.
Note that in the first and second embodiments, since the imaging elements 32 (32A, 32B) described later (see FIGS. 21 and 22) are arranged side by side in the longitudinal direction of the image reading section 28 without being inclined with respect to that longitudinal direction, the main scanning direction of the image reading section 28 coincides with the longitudinal direction of the image reading section 28.
3. Third Embodiment
The third embodiment is the same as the first and second embodiments except that the configuration and arrangement of the image reading sections 28 are changed. FIG. 20 is a schematic diagram for explaining the arrangement of the image reading sections 28 of the third embodiment. FIG. 21 is a schematic diagram for explaining the configuration of the image reading sections 28 of the third embodiment, and is also an enlarged view of the periphery of the image reading sections 28 in FIG. 20.
In the third embodiment, two image reading sections 28 are used. Each image reading section 28 has an elongated shape along a longitudinal direction intersecting the sub-scanning direction. In the example shown in FIG. 21, each image reading section 28 is arranged such that its longitudinal direction is orthogonal to the sub-scanning direction.
In the third embodiment, in one image reading section 28A of the two image reading sections 28, first imaging elements 32A inclined with respect to the longitudinal direction of the image reading section 28A are arranged side by side in that longitudinal direction. In the other image reading section 28B, second imaging elements 32B inclined with respect to the longitudinal direction of the image reading section 28B at an angle different from that of the first imaging elements 32A are arranged side by side in that longitudinal direction. The first imaging elements 32A and the second imaging elements 32B are each light-receiving IC chips, and are each configured by arranging a plurality of photoelectric conversion elements such as photodiodes in a straight line.
In the example shown in FIG. 21, the first imaging elements 32A are inclined at an angle of +45° with respect to the longitudinal direction of the image reading section 28A, and the second imaging elements 32B are inclined at an angle of −45° with respect to the longitudinal direction of the image reading section 28B. The first imaging elements 32A and the second imaging elements 32B are arranged symmetrically about an axis in the direction orthogonal to the sub-scanning direction. That is, in the example shown in FIG. 21, the main scanning directions of the image reading sections 28 are orthogonal to each other.
When the first imaging elements 32A and the second imaging elements 32B are arranged in this way, two pieces of read image data can be acquired by the two image reading sections 28 having different main scanning directions even if the two image reading sections 28 are arranged in parallel. Therefore, compared with a configuration in which a plurality of first imaging elements 32A or a plurality of second imaging elements 32B are arranged side by side along the longitudinal direction of each image reading section 28 as in FIG. 10, the arrangement area of the image reading sections 28 in the sub-scanning direction can be narrowed to save space.
FIG. 22 is a schematic diagram for explaining a modification of the configuration of the image reading section 28 of the third embodiment. In this modification, only one image reading section 28 is used. In this case, on the same substrate of the single image reading section 28, a plurality of first imaging elements 32A are arranged in parallel side by side in the longitudinal direction of the image reading section 28, and a plurality of second imaging elements 32B are likewise arranged in parallel side by side in the longitudinal direction of the image reading section 28.
In the example shown in FIG. 22, the first imaging elements 32A are inclined at an angle of +45° with respect to the longitudinal direction of the image reading section 28, and the second imaging elements 32B are inclined at an angle of −45° with respect to the longitudinal direction of the image reading section 28. The first imaging elements 32A and the second imaging elements 32B are arranged symmetrically about an axis in the direction orthogonal to the sub-scanning direction. Therefore, in the example shown in FIG. 22, the image reading section 28 has two main scanning directions, and these main scanning directions are orthogonal to each other.
Even when the first imaging elements 32A and the second imaging elements 32B are arranged in a single image reading section 28 in this way, the arrangement area of the image reading section 28 in the sub-scanning direction can be narrowed to save space, compared with a configuration in which a plurality of first imaging elements 32A or a plurality of second imaging elements 32B are arranged side by side along the longitudinal direction of each image reading section 28 as in FIG. 10.
4. Fourth Embodiment
The fourth embodiment is the same as the first and second embodiments except that the configuration of the image reading section 28 is changed. That is, in the fourth embodiment, the longitudinal direction of the image reading section 28 coincides with the main scanning direction.
FIG. 23 is a cross-sectional view showing part of an example of the image reading section 28 of the fourth embodiment, viewed from the main scanning direction. As shown in FIG. 23, the image reading section 28 of the fourth embodiment includes a slit member 34. The slit member 34 extends in the main scanning direction and is arranged between the reading object T and a linear light source 44B. An elongated slit is formed in the slit member 34 along the main scanning direction, and only light that has passed through the slit irradiates the reading object T. The width of the slit orthogonal to the main scanning direction may be approximately equal to or less than the width of each photoelectric conversion element of the imaging element.
The linear light source 44B may be composed of an LED, a halogen lamp, or the like, and may emit light in the near-infrared range of 750 nm to 2500 nm. In this case, the slit member 34 is preferably formed of a material and in a shape that are resistant to high-power near-infrared light and excellent in dimensional stability. The slit member 34 may also be movable in a direction toward or away from the reading object T.
According to the slit member 34, part of the light L2 irradiating the reading object T is blocked in the direction orthogonal to the main scanning direction, so that blurring of the outline on the image caused by light wrapping around the reading object T in the sub-scanning direction can be suppressed. As a result, output image data in which the outline of the area corresponding to the reading object T is sharper can be generated.
Although not shown, a general-purpose line laser light source may be used as the linear light source 44B. The line laser light source includes a laser light source and a lens, and emits laser light in a line. The width, orthogonal to the main scanning direction, of the linear laser light emitted from the line laser light source may be approximately equal to or less than the width of each photoelectric conversion element of the imaging element. The line laser light source may emit light in the near-infrared range of 750 nm to 2500 nm. Examples of the line laser light source include a telecentric laser light source.
When a line laser light source is used as the linear light source 44B, the light is prevented from spreading radially in the direction orthogonal to the main scanning direction, so that the same effect as when the slit member 34 is provided is obtained. That is, by using a line laser light source as the linear light source 44B, blurring of the outline on the image caused by light wrapping around the reading object T in the sub-scanning direction can be suppressed, so that output image data in which the outline of the area corresponding to the reading object T is sharper can be generated without increasing the number of members.
5. Fifth Embodiment
In the fifth embodiment, a specific configuration of a conveying device 36 will be described. Although the image reading device 10 also includes a conveying device in the first to fourth embodiments, its specific description was omitted.
The conveying device 36 conveys the reading object T along a conveying path extending in a direction intersecting the main scanning direction of the image reading section 28. When the fifth embodiment is combined with the fourth embodiment, the conveying device 36 conveys the reading object T along a conveying path extending in a direction intersecting the longitudinal direction of the image reading section 28.
FIG. 24 is a schematic diagram for explaining the configuration around the image reading section 28 of the fifth embodiment. In the example shown in FIG. 24, a plurality of conveying devices 36 are provided, and these conveying devices 36 are arranged with a gap between one another along the conveying path. The plurality of conveying devices 36 convey the reading object T along a conveying path extending in a direction orthogonal to the main scanning direction. A belt conveyor is used as each conveying device 36, but the conveying device is not limited to this.
Furthermore, the linear light source 44B of the image reading section 28 irradiates the reading object T conveyed along the conveying path with light through the gap between the adjacent conveying devices 36. However, in a configuration in which the reading object T is conveyed on the linear light source 44B side of the conveying devices 36, the light irradiating the reading object T conveyed along the conveying path may enter the gap between the adjacent conveying devices 36.
When the linear light source 44B is arranged between the conveying devices 36 in this way, the light emitted from the linear light source 44B can be prevented from being absorbed or reflected before reaching the reading object T, so that blurring of the outline of the area corresponding to the reading object T in the read image data can be prevented. Although two conveying devices 36 are provided in the example of FIG. 24, three or more conveying devices 36 may be provided, with a linear light source 44B arranged between each pair of adjacent conveying devices 36.
FIG. 25 is a schematic diagram for explaining a modification of the configuration around the image reading section 28 of the fifth embodiment. In this example, the light emitted from the linear light source 44B of the image reading section 28 passes through the conveying device 36. Specifically, the light emitted from the linear light source 44B of the image reading section 28 passes through the belt of the conveying device 36, which is a belt conveyor. Since the belt is formed of a transparent material, the conveying surface 36A forming the conveying path is light-transmissive, and the light from the linear light source 44B passes through the conveying surface 36A to irradiate the reading object T. In this case, the image reading section 28 is arranged such that the conveying surface 36A is positioned between the linear light source 44B and the reading object T.
If the conveying surface 36A is light-transmissive in this way, the light emitted from the linear light source 44B can be prevented from being absorbed or reflected before reaching the reading object T, so that blurring of the outline of the area corresponding to the reading object T in the read image data can be prevented. In FIG. 25, the linear light source 44B is arranged outside the conveying device 36 (below the belt), but it is not limited to this and may be arranged inside the conveying device 36 (inside the belt).
Furthermore, the specific configurations and the like given in the above embodiments are examples and can be changed as appropriate according to the actual product. For example, by omitting the image reading section 28 of the image reading device 10, the device may function as an image processing device. In this case, however, an external image reading section capable of functioning as the image reading section 28 of each embodiment is communicably connected to the image processing device. Also, for example, the image reading device 10 described in each embodiment may be implemented as an inspection device for detecting foreign matter, scratches, defects, and the like.
Furthermore, the order in which the steps of the flowcharts shown in the above embodiments are processed can be changed as appropriate as long as the same result is obtained.
Note that ZNCC (Zero-mean Normalized Cross-Correlation) can be used as an evaluation index of the similarity of the output image data to read image data in which the outline of the target area is sharp.
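For reference, ZNCC between an output image and a reference image with a sharp outline can be computed as follows (a sketch assuming two equal-sized grayscale arrays; values close to 1 indicate high similarity):

```python
import numpy as np

def zncc(a, b):
    # Subtract each image's mean, then take the normalized inner product
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```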
10   image reading device
20   control section
22   CPU
24   RAM
26   storage section
28   image reading section
32   imaging element
34   slit member
36   conveying device
44   light source
90   image reading processing section
92   spectrum generation processing section
94   comparison processing section
96   synthesized spectrum generation processing section
98   image generation processing section
100  correction processing section
202a read image data
202b spectral data
202c synthesized spectral data
202d output image data
202e corrected image data

Claims (17)

1.  An image processing device that performs image processing on a plurality of pieces of read image data obtained by reading the same object in a line along main scanning directions extending in respectively different directions, the image processing device comprising:
     a spectrum generation processing section that generates a plurality of spectral data by applying Fourier transform processing to each of the plurality of pieces of read image data;
     a comparison processing section that compares components corresponding to the same frequency of the plurality of spectral data with each other;
     a synthesized spectrum generation processing section that generates synthesized spectral data by selecting one of the components corresponding to the same frequency of the plurality of spectral data based on a comparison result of the comparison processing section; and
     an image generation processing section that generates output image data by applying inverse Fourier transform processing to the synthesized spectral data.
2.  The image processing device according to claim 1, wherein the synthesized spectrum generation processing section selects the largest component from among the components corresponding to the same frequency of the plurality of spectral data.
3.  The image processing device according to claim 1 or 2, further comprising a correction processing section that applies correction processing to at least one of the plurality of pieces of read image data before the Fourier transform processing is applied, such that areas corresponding to the object on the images of the respective pieces of read image data are positioned substantially identically,
     wherein the spectrum generation processing section applies Fourier transform processing to each of the plurality of pieces of read image data subjected to the correction processing by the correction processing section.
4.  The image processing device according to claim 3, wherein the correction processing section corrects, for at least one of the plurality of pieces of read image data before the Fourier transform processing is applied, a tilt or a magnification of the area corresponding to the object on the image of that read image data.
5.  An image reading device comprising:
     a plurality of line sensors each capable of reading an image in a line along a main scanning direction, the main scanning directions extending in respectively different directions with respect to a sub-scanning direction; and
     the image processing device according to any one of claims 1 to 4, which performs image processing on a plurality of pieces of read image data obtained by reading the same object with the plurality of line sensors.
6.  The image reading device according to claim 5, wherein the plurality of line sensors include two line sensors arranged such that their main scanning directions are orthogonal to each other.
7.  The image reading device according to claim 6, wherein the two line sensors are arranged such that their respective main scanning directions intersect the sub-scanning direction at an angle of 45°.
8.  The image reading device according to any one of claims 5 to 7, wherein the plurality of line sensors are arranged side by side in a direction intersecting the sub-scanning direction.
9.  An image reading device comprising:
     at least one line sensor having an elongated shape along a longitudinal direction intersecting a sub-scanning direction; and
     the image processing device according to any one of claims 1 to 4, which performs image processing on a plurality of pieces of read image data obtained by reading the same object with the at least one line sensor,
     wherein the at least one line sensor includes a plurality of first imaging elements inclined with respect to the longitudinal direction and arranged side by side in the longitudinal direction, and a plurality of second imaging elements inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements and arranged side by side in the longitudinal direction.
10.  The image reading device according to claim 9, wherein the at least one line sensor includes two line sensors,
     in one of the two line sensors, the plurality of first imaging elements inclined with respect to the longitudinal direction are arranged side by side in the longitudinal direction, and
     in the other of the two line sensors, the plurality of second imaging elements inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements are arranged side by side in the longitudinal direction.
11.  The image reading device according to claim 9, wherein, in one of the at least one line sensor, the plurality of first imaging elements inclined with respect to the longitudinal direction are arranged side by side in the longitudinal direction, and the plurality of second imaging elements inclined with respect to the longitudinal direction at an angle different from that of the first imaging elements are arranged side by side in the longitudinal direction.
  12.  An image reading device comprising:
     at least one line sensor capable of reading an image in a line along a main scanning direction; and
     the image processing device according to any one of claims 1 to 4, which performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor,
     wherein the line sensor has:
     a light source that extends in the main scanning direction and irradiates the object with light; and
     a slit member that extends in the main scanning direction and is arranged between the object and the light source.
  13.  An image reading device comprising:
     at least one line sensor capable of reading an image in a line along a main scanning direction; and
     the image processing device according to any one of claims 1 to 4, which performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor,
     wherein the line sensor has a line laser light source that irradiates the object with light.
  14.  An image reading device comprising:
     at least one line sensor capable of reading an image in a line along a main scanning direction;
     the image processing device according to any one of claims 1 to 4, which performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor; and
     a plurality of conveying devices that convey the object along a conveying path extending in a direction intersecting the main scanning direction,
     wherein the line sensor has a light source that irradiates the object with light,
     the plurality of conveying devices are arranged along the conveying path with gaps between one another, and
     the light source either irradiates the object conveyed along the conveying path with light through the gap, or causes light irradiated onto the object conveyed along the conveying path to enter the gap.
  15.  An image reading device comprising:
     at least one line sensor capable of reading an image in a line along a main scanning direction;
     the image processing device according to any one of claims 1 to 4, which performs image processing on a plurality of read image data obtained by reading the same object with the at least one line sensor; and
     a conveying device that conveys the object along a conveying path extending in a direction intersecting the main scanning direction,
     wherein the line sensor has a light source that irradiates the object with light, and
     the conveying device includes a conveying surface that forms the conveying path, the conveying surface being light-transmissive so that light from the light source passes through the conveying surface and irradiates the object.
  16.  An image processing method for performing image processing on a plurality of read image data obtained by reading the same object in lines along main scanning directions extending in mutually different directions, the method comprising:
     a spectrum generating step of generating a plurality of spectral data by performing Fourier transform processing on each of the plurality of read image data;
     a comparing step of comparing, between the plurality of spectral data, the components that correspond to the same frequency;
     a synthesized spectrum generating step of generating synthesized spectral data by selecting, based on the comparison result of the comparing step, one of the components of the plurality of spectral data that correspond to the same frequency; and
     an image generating step of generating output image data by performing inverse Fourier transform processing on the synthesized spectral data.
  17.  An image processing program for performing image processing on a plurality of read image data obtained by reading the same object in lines along main scanning directions extending in mutually different directions, the program causing a computer to execute:
     a spectrum generating step of generating a plurality of spectral data by performing Fourier transform processing on each of the plurality of read image data;
     a comparing step of comparing, between the plurality of spectral data, the components that correspond to the same frequency;
     a synthesized spectrum generating step of generating synthesized spectral data by selecting, based on the comparison result of the comparing step, one of the components of the plurality of spectral data that correspond to the same frequency; and
     an image generating step of generating output image data by performing inverse Fourier transform processing on the synthesized spectral data.
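
Read together, claims 16 and 17 describe four operations: Fourier-transform each read image, compare the spectra frequency by frequency, keep exactly one component per frequency, and inverse-transform the result. The Python/NumPy sketch below illustrates that flow under stated assumptions; it is not the patent's reference implementation. The claims do not fix a particular selection criterion, so the magnitude-based rule, the assumption that the read images are already registered to a common pixel grid, and all function and variable names are illustrative only.

    import numpy as np

    def synthesize(images):
        # Spectrum generating step: Fourier-transform each read image.
        # `images` is a list of equally sized 2-D arrays, assumed here to be
        # pre-registered to a common coordinate system.
        spectra = np.stack([np.fft.fft2(img) for img in images])   # (n, H, W)

        # Comparing step: compare the components at the same frequency.
        # Here the comparison is by magnitude (an assumed criterion).
        best = np.argmax(np.abs(spectra), axis=0)                  # (H, W)

        # Synthesized spectrum generating step: select one component per
        # frequency according to the comparison result.
        synthesized = np.take_along_axis(spectra, best[None, ...], axis=0)[0]

        # Image generating step: inverse Fourier transform; np.real discards
        # any residual imaginary part if the selection breaks Hermitian symmetry.
        return np.real(np.fft.ifft2(synthesized))

As a usage illustration tied to the geometry of claims 6 and 7, two reads of one object can be simulated by blurring it along main scanning directions at ±45°: a blur along a given direction attenuates the spectral components whose frequency vector lies along that direction, so the two reads lose complementary parts of the spectrum, and the per-frequency selection retains the better-preserved measurement at each frequency. The Gaussian blur model and the parameter values are likewise assumptions made for this sketch.

    def directional_blur(img, angle_deg, sigma):
        # Gaussian blur along one direction, applied via its frequency response.
        H, W = img.shape
        ky = np.fft.fftfreq(H)[:, None]
        kx = np.fft.fftfreq(W)[None, :]
        a = np.deg2rad(angle_deg)
        k_par = kx * np.cos(a) + ky * np.sin(a)   # frequency along the blur direction
        mtf = np.exp(-2.0 * (np.pi * sigma * k_par) ** 2)
        return np.real(np.fft.ifft2(np.fft.fft2(img) * mtf))

    rng = np.random.default_rng(0)
    obj = rng.random((256, 256))
    reads = [directional_blur(obj, a, sigma=4.0) for a in (+45.0, -45.0)]
    out = synthesize(reads)

    # The synthesized image deviates from the object no more than either
    # single read does, since the selected component is never the weaker one.
    print([float(np.mean((r - obj) ** 2)) for r in reads])
    print(float(np.mean((out - obj) ** 2)))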
PCT/JP2022/012653 2021-03-23 2022-03-18 Image processing device, image processing method, image processing program, and image reading device WO2022202671A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023509127A JPWO2022202671A1 (en) 2021-03-23 2022-03-18

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021048810 2021-03-23
JP2021-048810 2021-03-23

Publications (1)

Publication Number Publication Date
WO2022202671A1 2022-09-29

Family

ID=83397320

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/012653 WO2022202671A1 (en) 2021-03-23 2022-03-18 Image processing device, image processing method, image processing program, and image reading device

Country Status (2)

Country Link
JP (1) JPWO2022202671A1 (en)
WO (1) WO2022202671A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006221365A (en) * 2005-02-09 2006-08-24 Bitstrong:Kk Line scan type image processing unit
JP2007281956A (en) * 2006-04-07 2007-10-25 Fuji Xerox Co Ltd Resolution evaluation device, resolution evaluation method, and program
JP2008176645A (en) * 2007-01-19 2008-07-31 Konica Minolta Holdings Inc Three-dimensional shape processing apparatus, control method of three-dimensional shape processing apparatus, and control program of three-dimensional shape processing apparatus

Also Published As

Publication number Publication date
JPWO2022202671A1 (en) 2022-09-29

Similar Documents

Publication Publication Date Title
KR100996335B1 (en) Apparatus and methods for inspecting a composite structure for inconsistencies
KR101915498B1 (en) Appearance inspection apparatus
TW201100788A (en) Polycrystalline wafer inspection method
US9727961B2 (en) Method of operating a radiographic inspection system with a modular conveyor chain
US10887500B2 (en) Optical inspection system
US20180238812A1 (en) Optical inspecting apparatus with an optical screening device
US20190178812A1 (en) Surface inspection system and inspection method
JP5687748B2 (en) Inspection device
JP2015203586A (en) inspection method
WO2022202671A1 (en) Image processing device, image processing method, image processing program, and image reading device
JP6679942B2 (en) Sheet defect inspection device
CN112703393B (en) Illumination for defect inspection of sheet, defect inspection device for sheet, and defect inspection method for sheet
CN112791983B (en) Finished goods scanning assembly
JPWO2018147454A1 (en) Scanning optical system and laser radar device
KR20220165784A (en) Foreign material/defect inspection apparatus, image generating apparatus in foreign material/defect inspection, and foreign material/defect inspection method
KR20230021745A (en) Foreign material/defect inspection apparatus, image generating apparatus in foreign material/defect inspection, and foreign material/defect inspection method
AU716024B2 (en) Surface topography enhancement
JP7307618B2 (en) Inspection system and light irradiation device
JP2010008468A (en) Radiographic image reading apparatus
JP6086277B2 (en) Pattern inspection apparatus and illumination optical system used therefor
JP2023163365A (en) Imaging apparatus, and inspection apparatus using imaging apparatus
JP7370023B1 (en) Inspection equipment and inspection method
JP7199945B2 (en) optical measuring device
JP2023092933A (en) Inspection device and inspection method
JP2018017675A (en) Optical inspection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22775459

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023509127

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22775459

Country of ref document: EP

Kind code of ref document: A1