WO2014119772A1 - Image generation device, defect inspection device, and defect inspection method - Google Patents

Image generation device, defect inspection device, and defect inspection method

Info

Publication number
WO2014119772A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
defect map
defect
map image
Prior art date
Application number
PCT/JP2014/052371
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
麻耶 尾崎
Original Assignee
住友化学株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 住友化学株式会社
Priority to CN201480006156.9A (CN104956210B)
Priority to KR1020157017615A (KR102168143B1)
Priority to JP2014559794A (JP6191627B2)
Publication of WO2014119772A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/89 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/892 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles, characterised by the flaw, defect or object feature examined
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/8861 Determining coordinates of flaws
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30124 Fabrics; Textile; Paper

Definitions

  • the present invention relates to an image generation apparatus that generates image data for inspecting defects in a sheet-like molded body such as a polarizing film and a retardation film, a defect inspection apparatus including the image generation apparatus, and a defect inspection method.
  • FIGS. 12A and 12B are diagrams for explaining the operation when the defect map image L is generated using the one-dimensional images K1 to K19 obtained by the line sensor in the first conventional defect inspection apparatus.
  • In the first conventional defect inspection apparatus, a sheet-like molded body is illuminated with a linear light source such as a fluorescent tube, and the surface of the sheet-like molded body is scanned with a line sensor along the longitudinal direction from one end to the other, whereby a plurality of one-dimensional images (still images) K1 to K19 as shown in FIG. 12A are acquired.
  • the one-dimensional images K1 to K19 shown in FIG. 12A have undergone processing (for example, image processing such as binarization) for emphasizing a defective portion on the image captured by the line sensor.
  • the black portion represents a portion having no defect
  • the white portion represents a portion having a defect.
  • The first conventional defect inspection apparatus generates a defect map image L, which is a two-dimensional image, by arranging the plurality of one-dimensional images K1 to K19 in order of acquisition time, and inspects defects of the sheet-like molded body based on the defect map image L.
  • the black portion represents a portion without a defect
  • the white portion represents a portion with a defect.
  • In some cases, the defect map image L is created by arranging the one-dimensional images K1 to K19 before the defect enhancement processing (the raw images acquired by the line sensor) in order of acquisition time, and the processing for emphasizing defective portions is then performed on the defect map image L.
  • the region observed by the line sensor usually includes a linear light source image.
  • When the linear light source and the line sensor are arranged on the same side of the sheet-like molded body, the linear light source image is an image of light that is emitted from the linear light source, specularly reflected by the sheet-like molded body, and reaches the line sensor. When the sheet-like molded body is arranged between the linear light source and the line sensor, the linear light source image is an image of light that is emitted from the linear light source, transmitted through the sheet-like molded body, and reaches the line sensor.
  • a plurality of line sensors are arranged in the width direction so that the entire area in the width direction of the sheet-like molded body can be inspected.
  • The first conventional defect inspection apparatus inspects defects of the sheet-like molded body based on the defect map image L, a two-dimensional image generated by arranging the plurality of one-dimensional images K1 to K19. Therefore, the positional relationship between an inspection target pixel and the linear light source image in each of the one-dimensional images K1 to K19 constituting the defect map image L is fixed to one particular positional relationship.
  • Depending on the type of defect, the defect may appear in the one-dimensional images K1 to K19 only when the positional relationship between the inspection target pixel and the linear light source image is a specific positional relationship. For example, bubbles, which are one type of defect, often appear in the one-dimensional images K1 to K19 only when they are at or near the periphery of the linear light source image.
  • Consequently, depending on its position, a defect may not be detected. The first conventional defect inspection apparatus, which inspects defects of the sheet-like molded body using the defect map image L, a two-dimensional image composed of the plurality of one-dimensional images K1 to K19 acquired by the line sensor, therefore has a limited defect detection capability.
  • In a second conventional technique, a defect inspection apparatus illuminates a sheet-like molded body with a linear light source such as a fluorescent tube, acquires two-dimensional images (a moving image) with an area sensor while the sheet-like molded body is continuously conveyed in a predetermined conveyance direction, and inspects defects of the sheet-like molded body based on the two-dimensional images.
  • In the defect inspection apparatus of the second prior art, whether or not there is a defect can be determined based on a plurality of two-dimensional images having different positional relationships between the inspection target pixel and the linear light source image, so a defect can be detected more reliably than with the first conventional defect inspection apparatus using a line sensor. Therefore, the defect inspection apparatus of the second prior art using an area sensor has a higher defect detection capability than the defect inspection apparatus of the first prior art using a line sensor.
  • FIGS. 13A and 13B are diagrams for explaining the operation when the defect map image N is generated using the two-dimensional images M1 to M6 acquired by the area sensor in the second conventional defect inspection apparatus.
  • In the second conventional defect inspection apparatus, the area sensor performs imaging operations at a predetermined time interval on the continuously conveyed sheet-like molded body and, as shown in FIG. 13A, a plurality of two-dimensional images M1 to M6 that at least partially overlap one another are acquired.
  • the two-dimensional images M1 to M6 shown in FIG. 13A have undergone processing (for example, image processing such as binarization) for emphasizing a defective portion on the image captured by the area sensor.
  • the black portion represents a portion having no defect
  • the white portion represents a portion having a defect.
  • The two-dimensional images M1 to M6 acquired by the area sensor have overlapping portions that partially overlap between the two-dimensional images M1 and M2, between M2 and M3, between M3 and M4, between M4 and M5, and between M5 and M6.
  • Since the second conventional defect inspection apparatus generates the defect map image N by sequentially arranging the two-dimensional images M1 to M6 in order of acquisition time, one defect map image N contains, as shown in FIG. 13B, a plurality of defective pixels (for example, the defective pixel N1 in FIG. 13B) that show the same defect.
  • An object of the present invention is to provide an image generation apparatus that generates an image for inspecting defects of a sheet-shaped molded body, as well as a defect inspection apparatus and a defect inspection method, that can accurately inspect the position of a defect in the sheet-shaped molded body with high detection capability and can prevent duplicate detection of the same defect.
  • The present invention provides an image generation apparatus that generates an image for inspecting defects of a sheet-like molded body, comprising: a transport unit that transports the sheet-like molded body in the longitudinal direction at a predetermined transport speed; a light irradiation unit that irradiates the transported sheet-like molded body with light; an imaging unit that is arranged to face the surface of the transported sheet-like molded body and images a part of the surface of the sheet-like molded body at a predetermined time interval to generate a plurality of two-dimensional images, the time interval being set so that the imaging regions captured by two consecutive imaging operations partially overlap; a feature amount calculation unit that calculates, by a predetermined algorithm process, a feature amount of each pixel constituting each two-dimensional image based on the luminance value of that pixel; a processed image data generation unit that distinguishes each pixel constituting each two-dimensional image into defective pixels whose feature amount is equal to or greater than a predetermined threshold value and remaining pixels whose feature amount is less than the threshold value, and generates, corresponding to each two-dimensional image, a processed image in which a gradation value corresponding to the feature amount is assigned to each defective pixel and a gradation value of zero is assigned to each remaining pixel; and a defect map image generation unit that generates a defect map image representing the distribution of defects in the sheet-like molded body by combining the plurality of processed images generated by the processed image data generation unit.
  • The defect map image generation unit includes: a defect map image coordinate value calculation unit that calculates the coordinate value of each pixel constituting the defect map image based on the coordinate value of each pixel constituting each processed image, the transport speed, and the time interval; an integration unit that performs either or both of (1) counting, for each pixel of the defect map image, the number of defective pixels among the corresponding pixels in the processed images, and (2) calculating, for each pixel of the defect map image, the sum of the gradation values assigned to the corresponding pixels in the processed images; and a luminance value setting unit that sets, as the luminance value of each pixel of the defect map image, a value calculated based on the number of defective pixels obtained in (1) and/or the sum of gradation values obtained in (2).
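  • The integration options (1) and (2) and the luminance value setting described in this configuration can be illustrated with a minimal Python sketch (the function names, array layout, and weighting choices below are illustrative assumptions, not taken from the claims):

```python
import numpy as np

def integrate_processed_images(processed_images, row_offsets, map_shape):
    """Accumulate processed images into a defect map grid.

    processed_images : list of 2-D arrays of gradation values (0 = remaining pixel)
    row_offsets      : for each processed image, the defect-map row where its first
                       row lands (derived from transport speed and imaging interval)
    map_shape        : (rows, cols) of the defect map
    """
    count_map = np.zeros(map_shape, dtype=np.int32)   # option (1): defective-pixel counts
    sum_map = np.zeros(map_shape, dtype=np.float64)   # option (2): gradation-value sums
    for img, r0 in zip(processed_images, row_offsets):
        h, w = img.shape
        count_map[r0:r0 + h, :w] += (img > 0)
        sum_map[r0:r0 + h, :w] += img
    return count_map, sum_map

def set_luminance(count_map, sum_map, mode="sum"):
    """Set each defect-map pixel's luminance from (1) and/or (2). The specific
    choices below are assumptions; the claim only requires a value calculated
    based on the count and/or the sum."""
    if mode == "count":
        return count_map.astype(np.float64)
    if mode == "sum":
        return sum_map
    # "both": average gradation per observation of the pixel
    return sum_map / np.maximum(count_map, 1)

# Example: two overlapping 3x4 processed images, the second shifted down by one row.
imgs = [np.zeros((3, 4)), np.zeros((3, 4))]
imgs[0][1, 2] = 50.0   # the same physical defect seen in both shots...
imgs[1][0, 2] = 60.0   # ...one row higher in the later shot
count_map, sum_map = integrate_processed_images(imgs, row_offsets=[0, 1], map_shape=(4, 4))
print(count_map[1, 2], sum_map[1, 2])   # 2 110.0 -> one defect-map pixel, not two
```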
  • In the image generation apparatus of the present invention, the time interval is preferably set so that the length, in the longitudinal direction, of the partially overlapping imaging regions is 1/2 or more of the length of the two-dimensional images in the longitudinal direction.
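  • As a numerical illustration of this preferred setting (the 200 mm field of view and 10 m/min speed are example values, not requirements of the specification), a small helper can compute the largest permissible imaging interval:

```python
def max_imaging_interval(image_length_mm, transport_speed_m_per_min, overlap_fraction=0.5):
    """Largest imaging interval (seconds) that keeps the overlap at least
    `overlap_fraction` of the image length in the transport direction
    (illustrative helper, not part of the specification)."""
    speed_mm_per_s = transport_speed_m_per_min * 1000.0 / 60.0
    max_travel_mm = image_length_mm * (1.0 - overlap_fraction)
    return max_travel_mm / speed_mm_per_s

# Example: a 200 mm field of view at 10 m/min allows at most 100 mm of travel
# between exposures, i.e. an imaging interval of at most 0.6 s.
print(max_imaging_interval(200, 10))  # 0.6
```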
  • The present invention also provides a defect inspection apparatus comprising the image generation apparatus and a display unit that displays the defect map image generated by the defect map image generation unit of the image generation apparatus.
  • The present invention further provides a defect inspection method for inspecting defects of a sheet-like molded body, comprising: a transport step of transporting the sheet-like molded body in the longitudinal direction at a predetermined transport speed by a transport unit; a light irradiation step of irradiating the transported sheet-like molded body with light; an imaging step of generating a plurality of two-dimensional images by imaging a part of the surface of the sheet-like molded body at a predetermined time interval with an imaging unit arranged to face the surface of the transported sheet-like molded body, the time interval being set so that the imaging regions captured by two consecutive imaging operations partially overlap; a feature amount calculation step of calculating, by a predetermined algorithm process, a feature amount of each pixel constituting each two-dimensional image based on the luminance value of that pixel; a processed image data generation step of distinguishing each pixel constituting each two-dimensional image into defective pixels whose feature amount is equal to or greater than a predetermined threshold value and remaining pixels whose feature amount is less than the threshold value, and generating, corresponding to each two-dimensional image, a processed image in which a gradation value corresponding to the feature amount is assigned to each defective pixel and a gradation value of zero is assigned to each remaining pixel; a defect map image generation step of generating a defect map image by combining the generated processed images; and a display step of displaying the defect map image.
  • The image generation apparatus according to the present invention is an apparatus that generates an image for inspecting defects of a sheet-like molded body, and includes a transport unit, a light irradiation unit, an imaging unit, a feature amount calculation unit, a processed image data generation unit, and a defect map image generation unit.
  • the imaging unit generates a plurality of two-dimensional images by imaging the surface of the sheet-like molded body conveyed by the conveyance unit while being irradiated with light by the light irradiation unit at predetermined time intervals.
  • the time interval is set so that the imaging areas captured by two consecutive imaging operations partially overlap.
  • In the plurality of two-dimensional images generated in this way, the two two-dimensional images generated by two consecutive imaging operations partially overlap each other in the direction parallel to the longitudinal direction of the sheet-like molded body.
  • the feature amount calculation unit calculates a feature amount based on a luminance value of each pixel constituting each two-dimensional image by processing each two-dimensional image with a predetermined algorithm.
  • The processed image data generation unit distinguishes each pixel constituting each two-dimensional image into defective pixels whose feature amount is greater than or equal to a predetermined threshold value and remaining pixels whose feature amount is less than the threshold value, and generates, corresponding to each two-dimensional image, a processed image in which a gradation value corresponding to the feature amount is assigned to each defective pixel and a gradation value of zero is assigned to each remaining pixel.
  • The defect map image generation unit generates a defect map image by combining the plurality of processed images generated by the processed image data generation unit, and includes a defect map image coordinate value calculation unit, an integration unit, and a luminance value setting unit.
  • The defect map image coordinate value calculation unit calculates the coordinate value of each pixel constituting the defect map image based on the coordinate value of each pixel constituting each processed image, the conveyance speed of the sheet-like molded body, and the time interval set in the imaging unit.
  • The integration unit performs either (1) or (2) below, or both. (1) For each pixel of the defect map image, the number of defective pixels among the corresponding pixels in the processed images is counted. (2) For each pixel of the defect map image, the sum of the gradation values assigned to the corresponding pixels in the processed images is calculated.
  • The luminance value setting unit generates the defect map image by setting, as the luminance value of each pixel of the defect map image, a value calculated based on the number of defective pixels obtained in (1) and/or the sum of gradation values obtained in (2) by the integration unit.
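  • A minimal Python sketch of the coordinate value calculation described above, under the assumption that the sheet moves only in the conveyance (Y) direction and that the pixel pitch of the imaging unit on the sheet surface is known (the pixel-pitch parameter and the numbers are illustrative, not from the specification):

```python
def defect_map_row(image_index, y_in_image, transport_speed_mm_s,
                   imaging_interval_s, pixel_pitch_mm):
    """Map pixel row y of the image captured at `image_index` to its row in the
    defect map. The offset grows by (speed * interval) / pixel_pitch rows per
    imaging operation; the X (width) coordinate is left unchanged."""
    rows_per_shot = transport_speed_mm_s * imaging_interval_s / pixel_pitch_mm
    return int(round(image_index * rows_per_shot)) + y_in_image

# Example (illustrative numbers): 166.7 mm/s conveyance, 0.6 s interval, 0.5 mm/pixel
# -> each shot is offset by about 200 rows in the defect map.
print(defect_map_row(image_index=3, y_in_image=10,
                     transport_speed_mm_s=166.7, imaging_interval_s=0.6,
                     pixel_pitch_mm=0.5))  # 610
```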
  • In this way, a defect map image, which is an image for inspecting defects of the sheet-like molded body, is generated based on the two-dimensional images of the sheet-like molded body generated by the imaging unit. Therefore, a high defect detection capability can be maintained compared with, for example, a case where an image for inspecting defects is generated based on a plurality of one-dimensional images obtained by a line sensor.
  • Moreover, the coordinate value of each pixel constituting the defect map image is calculated based on the coordinate value of each pixel constituting each processed image, the conveyance speed of the sheet-shaped molded body, and the time interval set in the imaging unit, and the luminance value of each pixel at the calculated coordinate value is set based on the number of defective pixels among the pixels having the same coordinate value in the processed images and/or the sum of the gradation values of those defective pixels. Since the defect map image is generated in this way, inspecting the defects of the sheet-shaped molded body using the defect map image makes it possible to accurately inspect the position of a defect in the sheet-shaped molded body with high detection capability. Since the same defect appears at one place in the defect map image, duplicate detection of the same defect can be prevented.
  • a defect inspection apparatus includes the image generation apparatus according to the present invention and a display unit.
  • the display unit displays the defect map image generated by the defect map image generation unit of the image generation device.
  • By looking at the displayed defect map image, the position of a defect in the sheet-like molded body can be confirmed.
  • The defect inspection method according to the present invention is a method for inspecting defects of the sheet-like molded body, and includes a transport step, a light irradiation step, an imaging step, a feature amount calculation step, a processed image data generation step, a defect map image generation step, and a display step.
  • In the imaging step, a plurality of two-dimensional images are generated by imaging, at a predetermined time interval, a part of the surface of the sheet-like molded body conveyed while being irradiated with light.
  • the time interval is set so that the imaging areas captured by two consecutive imaging operations partially overlap.
  • In the plurality of two-dimensional images generated in this way, the two two-dimensional images generated by two consecutive imaging operations partially overlap each other in the direction parallel to the longitudinal direction of the sheet-like molded body.
  • In the feature amount calculation step, a feature amount based on the luminance value of each pixel constituting each two-dimensional image is calculated by processing each two-dimensional image with a predetermined algorithm.
  • In the processed image data generation step, each pixel constituting each two-dimensional image is distinguished into defective pixels whose feature amount is equal to or greater than a predetermined threshold value and remaining pixels whose feature amount is less than the threshold value, and a processed image in which a gradation value corresponding to the feature amount is assigned to each defective pixel and a gradation value of zero is assigned to each remaining pixel is generated corresponding to each two-dimensional image.
  • In the defect map image generation step, a defect map image is generated by combining the plurality of processed images generated in the processed image data generation step.
  • The defect map image generation step includes a defect map image coordinate value calculation step, an integration step, and a luminance value setting step.
  • In the defect map image coordinate value calculation step, the coordinate value of each pixel constituting the defect map image is calculated based on the coordinate value of each pixel constituting each processed image, the conveyance speed of the sheet-like molded body, and the time interval set in the imaging unit.
  • In the integration step, either (1) or (2) below, or both, are performed. (1) For each pixel of the defect map image, the number of defective pixels among the corresponding pixels in the processed images is counted.
  • (2) For each pixel of the defect map image, the sum of the gradation values assigned to the corresponding pixels in the processed images is calculated.
  • In the luminance value setting step, the defect map image is generated by setting, as the luminance value of each pixel of the defect map image, a value calculated based on the number of defective pixels obtained in (1) and/or the sum of gradation values obtained in (2) in the integration step.
  • In the display step, the defect map image generated in the defect map image generation step is displayed.
  • In this way, a defect map image, which is an image for inspecting defects of the sheet-like molded body, is generated based on the two-dimensional images of the sheet-like molded body generated in the imaging step. Therefore, a high defect detection capability can be maintained compared with, for example, a case where an image for inspecting defects is generated based on a plurality of one-dimensional images obtained by a line sensor.
  • In the defect map image generation step, the coordinate value of each pixel constituting the defect map image is calculated based on the coordinate value of each pixel constituting each processed image, the conveyance speed of the sheet-shaped molded body, and the time interval set in the imaging unit, and the luminance value of each pixel at the calculated coordinate value is set based on the number of defective pixels among the pixels having the same coordinate value in the processed images and/or the sum of the gradation values of those defective pixels.
  • Since the defect map image is generated in this way, inspecting the defects of the sheet-shaped molded body by looking at the defect map image displayed in the display step makes it possible to accurately inspect the position of a defect in the sheet-shaped molded body with high detection capability. Since the same defect appears at one place in the defect map image, duplicate detection of the same defect can be prevented.
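  • The overall flow of these steps can be illustrated with a small self-contained Python sketch (synthetic data; the image sizes, threshold, and feature calculation below are illustrative assumptions, not the patent's):

```python
import numpy as np

# Toy run of the overall flow: overlapping 2-D images of a moving sheet are
# processed and accumulated into one defect map.
rng = np.random.default_rng(0)
H, W, SHOTS, SHIFT = 40, 60, 5, 15          # image size, number of shots, rows moved per shot
sheet = rng.normal(100, 2, size=(H + SHIFT * (SHOTS - 1), W))
sheet[70, 30] = 160                          # one synthetic bright point defect

count_map = np.zeros_like(sheet)
sum_map = np.zeros_like(sheet)
for k in range(SHOTS):                        # imaging step: overlapping two-dimensional images
    frame = sheet[k * SHIFT:k * SHIFT + H]
    feature = np.abs(frame - np.median(frame))           # feature amount calculation step (toy)
    processed = np.where(feature >= 20, feature, 0.0)    # processed image data generation step
    count_map[k * SHIFT:k * SHIFT + H] += processed > 0  # integration, option (1)
    sum_map[k * SHIFT:k * SHIFT + H] += processed        # integration, option (2)

defect_map = sum_map                          # luminance value setting (here: the gradation sum)
# The defect is seen in two overlapping shots but maps to a single place.
print(int(count_map[70, 30]), float(defect_map[70, 30]) > 0)   # 2 True
```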
  • FIG. 1A is a process diagram showing a process of a defect inspection method according to an embodiment of the present invention.
  • FIG. 1B is a process diagram showing a defect map image generation process according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram showing the configuration of the defect inspection apparatus 100 according to an embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a configuration of the defect inspection apparatus 100.
  • FIG. 4A is a diagram for explaining an edge profile method which is an example of a defect detection algorithm, and is a diagram illustrating an example of a two-dimensional image A corresponding to the two-dimensional image data generated by the imaging device 5.
  • FIG. 4B is a diagram illustrating an example of the edge profile P1 created by the processed image generation unit 61.
  • FIG. 4C is a diagram illustrating an example of the differential profile P2 created by the processed image generation unit 61.
  • FIG. 5A is a diagram for explaining a peak method which is another example of the defect detection algorithm, and is a diagram illustrating an example of a two-dimensional image B corresponding to the two-dimensional image data generated by the imaging device 5.
  • FIG. 5B is a diagram illustrating an example of the luminance profile P3 created by the processed image generation unit 61.
  • FIG. 5C is a diagram for explaining the procedure, executed by the processed image generation unit 61, in which a mass point is assumed to move from one end of the data points toward the other end.
  • FIG. 5D is a diagram illustrating an example of a brightness value difference profile P4 created by the processed image generation unit 61.
  • FIG. 6A is a diagram for explaining a smoothing method that is another example of the defect detection algorithm, and is a diagram illustrating an example of a two-dimensional image C corresponding to the two-dimensional image data generated by the imaging device 5.
  • FIG. 6B is a diagram illustrating an example of the smoothing profile P5 generated by the processed image generation unit 61.
  • FIG. 7A is a diagram for explaining a second edge profile method which is another example of the defect detection algorithm, and shows an example of a two-dimensional image D corresponding to the two-dimensional image data generated by the imaging device 5.
  • FIG. 7B is a diagram illustrating an example of the edge profile P6 created by the processed image generation unit 61.
  • FIG. 7C is a diagram illustrating an example of the edge profile P7 created by the processed image generation unit 61.
  • 8A and 8B are diagrams illustrating examples of processed images E1 to E6 generated by the image processing device 6.
  • FIG. 9 is a diagram illustrating an example of the defect map image F generated by the image analysis device 7.
  • FIG. 10A is a diagram illustrating an example of processed images G1 to G13, which are other examples of processed images generated by the image processing device 6.
  • FIG. 10B is a diagram illustrating an example of a defect map image H that is another example of the defect map image generated by the image analysis device 7.
  • FIG. 11A is a diagram showing an example of processed images G1 to G13 made of one-dimensional images generated by the image processing device 6.
  • FIG. 11B is a diagram illustrating an example of a defect map image J generated by sequentially spreading the processed images G1 to G13.
  • FIG. 12A is a diagram illustrating an example of the one-dimensional images K1 to K19 acquired by the line sensor in the first conventional defect inspection apparatus.
  • FIG. 12B is a diagram illustrating an example of the defect map image L generated by spreading the one-dimensional images K1 to K19 in order of acquisition time.
  • FIG. 13A is a diagram illustrating an example of two-dimensional images M1 to M6 acquired by an area sensor in the defect inspection apparatus according to the second conventional technique.
  • FIG. 13B is a diagram illustrating an example of the defect map image N generated by spreading the two-dimensional images M1 to M6 in order of acquisition time.
  • FIG. 1A and 1B are process diagrams showing processes of a defect inspection method according to an embodiment of the present invention.
  • the defect inspection method of the present embodiment includes a transport step s1, a light irradiation step s2, an imaging step s3, a feature amount calculation step s4, a processed image data generation step s5, and a defect map image generation step s6 shown in FIG. 1A. And a display step s7.
  • the defect map image generation step s6 includes a defect map image coordinate value calculation step s6-1, an integration step s6-2, and a luminance value setting step s6-3 shown in FIG. 1B.
  • FIG. 2 is a schematic diagram showing the configuration of the defect inspection apparatus 100 according to an embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a configuration of the defect inspection apparatus 100.
  • The defect inspection apparatus 100 according to the present embodiment is an apparatus that detects defects in a sheet-like molded body 2 made of a thermoplastic resin or the like, and includes the image generation apparatus 1 according to the present invention and a display unit 21.
  • the image generation apparatus 1 of the defect inspection apparatus 100 includes a transport device 3, an illumination device 4, an imaging device 5, an image processing device 6, and an image analysis device 7.
  • the defect inspection apparatus 100 implements the defect inspection method according to the present invention.
  • The conveyance device 3 executes the conveyance step s1, the illumination device 4 executes the light irradiation step s2, the imaging device 5 executes the imaging step s3, the image processing device 6 executes the feature amount calculation step s4 and the processed image data generation step s5, the image analysis device 7 executes the defect map image generation step s6, and the display unit 21 executes the display step s7.
  • The defect inspection apparatus 100 conveys the sheet-like molded body 2, which is continuous in the longitudinal direction with a constant width, by the conveyance device 3 at a predetermined speed in a predetermined direction (the longitudinal direction, perpendicular to the width direction of the sheet-like molded body 2).
  • During this conveyance, the sheet surface illuminated by the illumination device 4 is imaged at a predetermined time interval by the imaging device 5 to generate two-dimensional images, and the image processing device 6 generates a processed image corresponding to each two-dimensional image.
  • The image analysis device 7 generates a defect map image by combining the plurality of processed images output from the image processing device 6, and the display unit 21 displays the defect map image, whereby defects in the sheet-like molded body 2 are detected.
  • The sheet-like molded body 2 to be inspected is formed, for example, by passing a thermoplastic resin extruded from an extruder through the gap between rolls to smooth its surface or impart an uneven shape, and then drawing the resin while cooling it on transport rolls.
  • Examples of thermoplastic resins applicable to the sheet-like molded body 2 of the present embodiment include methacrylic resin, methyl methacrylate-styrene copolymer (MS resin), polyolefins such as polyethylene (PE) and polypropylene (PP), polycarbonate (PC), polyvinyl chloride (PVC), polystyrene (PS), polyvinyl alcohol (PVA), and triacetyl cellulose resin (TAC).
  • Examples of defects occurring in the sheet-like molded body 2 include point-like defects (point defects) such as bubbles, fish eyes, foreign matter, tire marks, dent marks, and scratches generated during molding, and linear defects (line defects) such as so-called knicks caused by crease marks and so-called raw-fabric streaks caused by differences in thickness.
  • the conveying device 3 has a function as a conveying unit, and conveys the sheet-like molded body 2 in a certain direction (conveying direction Z) at a predetermined conveying speed.
  • the transport device 3 includes, for example, a sending roller and a receiving roller that transport the sheet-like molded body 2 in the transport direction Z, and measures a transport distance by a rotary encoder or the like.
  • the transport speed is set to about 2 to 30 m / min in the transport direction Z.
  • the illuminating device 4 has a function as a light irradiation unit, and illuminates the width direction of the sheet-like molded body 2 orthogonal to the conveyance direction Z linearly.
  • the illumination device 4 is arranged so that a linear reflection image is included in the image captured by the imaging device 5.
  • The illumination device 4 faces the surface of the sheet-like molded body 2 on the same side as the imaging device 5 with respect to the sheet-like molded body 2, and is arranged so as to illuminate the surface of the sheet-like molded body 2. The illuminated area on the surface of the sheet-like molded body 2, that is, the imaged area, may be set to 200 mm, for example.
  • The light source of the illumination device 4 is not particularly limited as long as it irradiates light that does not affect the composition or properties of the sheet-like molded body 2; examples include an LED (Light Emitting Diode), a metal halide lamp, a halogen lamp, and a fluorescent lamp.
  • The illumination device 4 may be installed on the opposite side of the sheet-like molded body 2 from the imaging device 5. In this case, the image captured by the imaging device 5 includes a transmission image of light passing through the sheet-like molded body 2.
  • Although FIG. 2 illustrates the illumination device 4 with a light source extending linearly in the width direction of the sheet-like molded body 2, the illumination device 4 is not limited to such a configuration.
  • The illumination device 4 may have various configurations according to the type of defect detection algorithm processing performed by the processed image generation unit 61 described later; for example, the illumination device 4 may have a configuration in which a slit member is arranged.
  • the defect inspection apparatus 100 includes a plurality of imaging devices 5 having a function as an imaging unit, and the imaging devices 5 are arranged at equal intervals in a direction orthogonal to the conveyance direction Z (width direction of the sheet-like molded body 2).
  • the imaging device 5 is arranged such that the direction from the imaging device 5 toward the center of the imaging region of the sheet-like molded body 2 and the conveying direction Z form an acute angle.
  • The imaging device 5 images the sheet-like molded body 2 at a predetermined time interval (imaging interval) and generates a plurality of two-dimensional images, each including a reflection image or a transmission image (hereinafter collectively referred to as an "illumination image") of the sheet-like molded body 2 produced by the illumination device 4.
  • the time interval is set so that imaging areas captured by two consecutive imaging operations partially overlap.
  • the two two-dimensional images generated in the two consecutive imaging operations are images that partially overlap each other in the direction parallel to the longitudinal direction of the sheet-like molded body 2.
  • the imaging device 5 includes a CCD (Charge Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) area sensor that captures a two-dimensional image. As shown in FIG. 2, the imaging device 5 is arranged so as to capture the entire region in the width direction orthogonal to the conveyance direction Z of the sheet-like molded body 2. In this way, by imaging the entire area in the width direction of the sheet-shaped molded body 2 and conveying the sheet-shaped molded body 2 continuous in the conveying direction Z, defects in the entire area of the sheet-shaped molded body 2 can be efficiently removed. Can be inspected.
  • the imaging interval of the imaging device 5 may be fixed, or may be changeable by the user operating the imaging device 5 itself.
  • The imaging interval of the imaging device 5 may be a fraction of a second, such as the continuous-shooting interval of a digital still camera, but a shorter time interval, for example 1/30 second, which is the imaging interval of moving image data, is preferable.
  • The imaging interval of the imaging device 5 is preferably set so that the length, in the transport direction Z, of the partially overlapping imaging region is at least 1/2 the length of the two-dimensional image in the transport direction Z, that is, so that the distance the sheet-like molded body 2 is conveyed in the time defined by the imaging interval is 1/2 or less of the length of the two-dimensional image in the transport direction Z. In other words, the length in the transport direction Z of the two-dimensional image captured by the imaging device 5 is preferably at least twice the distance the sheet-like molded body 2 is conveyed in the time from when the imaging device 5 captures one two-dimensional image until it captures the next.
  • the length of the two-dimensional image in the conveyance direction Z is set to be longer than the conveyance distance of the sheet-like molded body 2 in the time from when the imaging device 5 captures the two-dimensional image until the next two-dimensional image is captured.
  • the image processing apparatus 6 includes a processed image generation unit 61 having functions as a feature amount calculation unit and a processed image data generation unit.
  • the image processing device 6 is provided corresponding to each of the plurality of imaging devices 5.
  • The processed image generation unit 61 can be realized by hardware such as an FPGA (field-programmable gate array) or a GPGPU (general-purpose computing on graphics processing units) on an image processing board or inside the imaging device 5.
  • The processed image generation unit 61 processes each two-dimensional image output from the imaging device 5 with a predetermined algorithm (hereinafter referred to as a "defect detection algorithm") to calculate, for each pixel constituting the two-dimensional image, a feature amount based on its luminance value. Further, the processed image generation unit 61 recognizes, in each two-dimensional image, pixels whose feature amount is equal to or greater than a predetermined threshold as defective pixels, generates, corresponding to each two-dimensional image, a processed image in which a gradation value corresponding to the feature amount is assigned to each defective pixel and a gradation value of zero is assigned to the remaining pixels (pixels whose feature amount is less than the threshold), and outputs each generated processed image.
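  • A minimal Python sketch of this thresholding step, assuming the feature amounts have already been computed as a 2-D array (the function name and threshold are illustrative, not from the specification):

```python
import numpy as np

def make_processed_image(feature_map, threshold):
    """Keep the feature amount as the gradation value where it reaches the
    threshold (defective pixels); assign zero to the remaining pixels."""
    feature_map = np.asarray(feature_map, dtype=np.float64)
    return np.where(feature_map >= threshold, feature_map, 0.0)

# Example: only the pixel with feature amount 42 survives as a defective pixel.
print(make_processed_image([[3, 42], [7, 0]], threshold=10))  # [[ 0. 42.] [ 0.  0.]]
```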
  • the defect detection algorithm used in the processed image generation unit 61 will be described with reference to FIGS. 4A to 4C, FIGS. 5A to 5D, FIGS. 6A and 6B, and FIGS. 7A to 7C.
  • FIGS. 4A to 4C are diagrams for explaining the edge profile method, which is an example of a defect detection algorithm.
  • FIG. 4A shows an example of a two-dimensional image A corresponding to the two-dimensional image data generated by the imaging device 5, and the upper side of the image is the downstream side in the transport direction Z, and the lower side of the image is the upstream side in the transport direction Z. .
  • a direction parallel to the width direction of the sheet-like molded body 2 is defined as an X direction
  • a direction parallel to the longitudinal direction (direction parallel to the transport direction Z) of the sheet-shaped molded body 2 is defined as a Y direction.
  • a strip-shaped bright area located in the center in the Y direction of the two-dimensional image A and extending in the X direction is the illumination image A1
  • a dark area existing inside the illumination image A1 is the first defective pixel group A21.
  • a bright region in the vicinity of the illumination image A1 is the second defective pixel group A22.
  • the processed image generation unit 61 first divides the two-dimensional image A into data of pixel columns one by one along the Y direction. Next, the processed image generation unit 61 shifts the edge of the data of each pixel column from one end in the Y direction (the upper end of the two-dimensional image A in FIG. 4A) to the other end (the lower end of the two-dimensional image A in FIG. 4A). Perform edge determination processing to search.
  • For the data of each pixel column, the processed image generation unit 61 sets the second pixel from one end in the Y direction as the target pixel, and determines whether or not the luminance value of the target pixel is greater than the luminance value of the adjacent pixel on the one end side by at least a predetermined threshold. If so, the processed image generation unit 61 determines that the adjacent pixel is the upper limit edge A3.
  • Otherwise, the processed image generation unit 61 repeats the edge determination process while shifting the target pixel one pixel at a time toward the other end in the Y direction, until it determines that the luminance value of the target pixel is larger than the luminance value of the adjacent pixel by at least the predetermined threshold.
  • After detecting the upper limit edge A3, the processed image generation unit 61 shifts the target pixel by one pixel toward the other end in the Y direction and determines whether or not the luminance value of the target pixel is smaller than the luminance value of the adjacent pixel by at least a predetermined threshold.
  • If so, the processed image generation unit 61 determines that the adjacent pixel is the lower limit edge A4.
  • Otherwise, the processed image generation unit 61 repeats the edge determination process while shifting the target pixel one pixel at a time toward the other end in the Y direction, until it determines that the luminance value of the target pixel is smaller than the luminance value of the adjacent pixel by at least the predetermined threshold.
  • In FIG. 4A, an example of the upper limit edge A3 detected by the edge determination process of the processed image generation unit 61 and an example of the lower limit edge A4 are each indicated by a marker.
  • At positions where defective pixels exist, the difference between the Y coordinate values of the upper limit edge A3 and the lower limit edge A4 is extremely smaller than the corresponding difference at the remaining pixels other than the defective pixels.
  • In addition, at such positions the Y coordinate value of the upper limit edge A3 is clearly different from the Y coordinate value of the upper limit edge at the remaining pixels other than the defective pixels.
  • the processed image generation unit 61 creates an edge profile P1 shown in FIG. 4B.
  • a peak P11 corresponding to the Y coordinate value of the upper limit edge A3 appears corresponding to the second defective pixel group A22 in the two-dimensional image A.
  • the processed image generation unit 61 may be configured to create an edge profile based on a difference in Y coordinate values between the upper limit edge A3 and the lower limit edge A4.
  • In this case, peaks with a small difference in Y coordinate values appear in the edge profile corresponding to the first defective pixel group A21 and the second defective pixel group A22 in the two-dimensional image A.
  • the processed image generation unit 61 performs a differentiation process on the edge profile P1 to create a differentiation profile P2 shown in FIG. 4C.
  • In the differential profile P2 shown in FIG. 4C, a peak P21 having a feature amount P22 equal to or greater than a predetermined threshold (a large differential value) appears corresponding to the peak P11 in the edge profile P1, that is, corresponding to the second defective pixel group A22 in the two-dimensional image A.
  • the processed image generation unit 61 extracts, as a defective pixel, a pixel in the two-dimensional image A corresponding to the peak P21 having a feature amount P22 that is equal to or greater than a predetermined threshold based on the differential profile P2.
  • the processed image generation unit 61 extracts the second defective pixel group A22 as defective pixels.
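  • A toy Python sketch of the edge profile method described above (the thresholds, the differentiation scheme, and the handling of columns without an edge are illustrative assumptions):

```python
import numpy as np

def edge_profile_method(image, edge_threshold, feature_threshold):
    """For each pixel column along Y, find the first row whose luminance jumps
    up by at least `edge_threshold` relative to the previous row (upper limit
    edge), build the edge profile of those row indices, differentiate it along
    X, and flag columns whose |derivative| reaches `feature_threshold`."""
    image = np.asarray(image, dtype=np.float64)
    n_rows, n_cols = image.shape
    edge_profile = np.zeros(n_cols)
    for x in range(n_cols):
        jumps = np.flatnonzero(np.diff(image[:, x]) >= edge_threshold)
        edge_profile[x] = jumps[0] if jumps.size else n_rows - 1
    derivative = np.abs(np.diff(edge_profile, prepend=edge_profile[:1]))  # differential profile
    return edge_profile, derivative, derivative >= feature_threshold

# Example: a bright band whose upper edge is shifted upward at column 5 by a defect.
img = np.full((10, 8), 10.0)
img[4:7, :] = 200.0            # illumination image
img[2:7, 5] = 200.0            # defect near the illumination image at x = 5
profile, deriv, flags = edge_profile_method(img, edge_threshold=100, feature_threshold=0.8)
print(profile)   # upper-edge row per column; smaller at x = 5
print(flags)     # True at x = 5 and x = 6, where the edge position jumps
```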
  • FIGS. 5A to 5D are diagrams for explaining the peak method, which is another example of a defect detection algorithm.
  • FIG. 5A shows an example of a two-dimensional image B corresponding to the two-dimensional image data generated by the imaging device 5, wherein the upper side of the image is the downstream side in the transport direction Z and the lower side of the image is the upstream side in the transport direction Z. .
  • a direction parallel to the width direction of the sheet-shaped molded body 2 is defined as an X direction
  • a direction parallel to the longitudinal direction (direction parallel to the transport direction Z) of the sheet-shaped molded body 2 is defined as a Y direction.
  • a strip-shaped bright region located in the center in the Y direction of the two-dimensional image B and extending in the X direction is the illumination image B1
  • a dark region existing inside the illumination image B1 is the first defective pixel group B21, and a bright region in the vicinity of the illumination image B1 is the second defective pixel group B22.
  • To extract defective pixels, the processed image generation unit 61 first divides the two-dimensional image B into data of pixel columns one by one along the Y direction. Next, for the data of each pixel column, the processed image generation unit 61 plots the luminance values at positions along a straight line L parallel to the Y direction of the two-dimensional image B as points, and creates the curve connecting them as the luminance profile P3 shown in FIG. 5B.
  • When there is no defective pixel, the luminance profile P3 shows a unimodal profile in which no valley portion appears; when there is a defective pixel, it shows a bimodal profile in which a valley portion P31 appears, as shown in FIG. 5B.
  • Next, the processed image generation unit 61 assumes a mass point that moves along the luminance profile P3 from one end to the other such that the movement time between adjacent data points is constant regardless of the distance between the data points.
  • the mass point moves from the data point c to the adjacent data point b, from the data point b to the adjacent data point a, and from the data point a to the adjacent data point d.
  • the data point d is a data point corresponding to the target pixel.
  • In FIG. 5C, the processed image generation unit 61 obtains the velocity vectors and the acceleration vector of the mass point at the data points a, b, and c, through which the mass point passes immediately before the data point d. That is, the processed image generation unit 61 obtains the velocity vector of the mass point in the section from the data point b to the data point a based on the coordinates of the two data points a and b, through which the mass point passed immediately before the data point d, and the movement time. Similarly, the processed image generation unit 61 obtains the velocity vector of the mass point in the section from the data point c to the data point b based on the coordinates of the two data points b and c, through which the mass point passed immediately before the data point a, and the movement time.
  • Then, based on the velocity vector of the mass point in the section from the data point b to the data point a and the velocity vector of the mass point in the section from the data point c to the data point b, the processed image generation unit 61 obtains the acceleration vector of the mass point in the section from the data point c to the data point a.
  • The processed image generation unit 61 then predicts the coordinates of the data point d (predicted data point f) from the velocity vector of the mass point in the section from the data point b to the data point a and the acceleration vector of the mass point in the section from the data point c to the data point a.
  • The processed image generation unit 61 obtains the difference between the luminance value of the predicted data point f predicted as described above and the actual (measured) luminance value of the data point d, and creates the luminance value difference profile P4 shown in FIG. 5D.
  • In the luminance value difference profile P4 shown in FIG. 5D, a peak P41 having a feature amount P42 equal to or greater than a predetermined threshold appears corresponding to the valley portion P31 in the luminance profile P3 shown in FIG. 5B, that is, corresponding to the first defective pixel group B21 in the two-dimensional image B.
  • the processed image generation unit 61 extracts, as a defective pixel, a pixel in the two-dimensional image B corresponding to the peak P41 having a feature amount P42 equal to or greater than a predetermined threshold based on the luminance value difference profile P4.
  • the processed image generation unit 61 extracts the first defective pixel group B21 as defective pixels.
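  • A toy Python sketch of the peak method described above, applied to a one-dimensional luminance profile (the constant-time extrapolation below is one plausible reading of the mass-point model; the numbers are illustrative):

```python
import numpy as np

def peak_method(profile, feature_threshold):
    """A mass point is assumed to move along the luminance profile at one data
    point per unit time; its velocity and acceleration over the last steps are
    used to predict the next luminance value, and the difference between the
    prediction and the actual value is the feature amount."""
    p = np.asarray(profile, dtype=np.float64)
    diff = np.zeros_like(p)
    for i in range(3, len(p)):
        v_prev = p[i - 2] - p[i - 3]           # velocity over the step before last
        v_last = p[i - 1] - p[i - 2]           # velocity over the last step
        accel = v_last - v_prev                # acceleration
        predicted = p[i - 1] + v_last + accel  # predicted data point f
        diff[i] = abs(p[i] - predicted)        # luminance value difference profile
    return diff, diff >= feature_threshold

# Example: a smooth ramp with one dip (e.g. a dark defect inside the illumination image).
profile = [10, 12, 14, 16, 18, 5, 22, 24]
diff, flags = peak_method(profile, feature_threshold=10)
print(np.round(diff, 1))  # large differences at the dip and while the predictor recovers
print(flags)
```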
  • FIGS. 6A and 6B are diagrams for explaining the smoothing method, which is another example of a defect detection algorithm.
  • FIG. 6A shows an example of a two-dimensional image C corresponding to the two-dimensional image data generated by the imaging device 5, and the upper side of the image is the downstream side in the transport direction Z, and the lower side of the image is the upstream side in the transport direction Z. .
  • the direction parallel to the width direction of the sheet-like molded body 2 is defined as the X direction
  • the direction parallel to the longitudinal direction (direction parallel to the transport direction Z) of the sheet-shaped molded body 2 is defined as the Y direction.
  • a strip-shaped bright region located in the center in the Y direction of the two-dimensional image C and extending in the X direction is the illumination image C1
  • a dark region existing inside the illumination image C1 is the first defective pixel group C21.
  • a bright area in the vicinity of the illumination image C1 is the second defective pixel group C22.
  • the processed image generation unit 61 first divides the two-dimensional image C into data of pixel columns one by one along the Y direction. Next, the processed image generation unit 61 creates a kernel C31 of several pixels in the X direction and the Y direction (for example, 5 pixels in the X direction and 1 pixel in the Y direction).
  • Next, for each position along the straight line L parallel to the Y direction of the two-dimensional image C, the processed image generation unit 61 plots, as a point, the difference between the luminance value of the central pixel of the kernel C31 and the average luminance value of all the pixels in the kernel C31, and creates the curve connecting these points as the smoothing profile P5 shown in FIG. 6B.
  • In the smoothing profile P5 shown in FIG. 6B, a peak P51 having a feature amount P52 equal to or greater than a predetermined threshold (a large luminance value difference) appears corresponding to the first defective pixel group C21 in the two-dimensional image C.
  • the processed image generation unit 61 extracts, as a defective pixel, a pixel in the two-dimensional image C that corresponds to the peak P51 having a feature amount P52 that is equal to or greater than a predetermined threshold based on the smoothing profile P5.
  • the processed image generation unit 61 extracts the first defective pixel group C21 as defective pixels.
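  • A toy Python sketch of the smoothing method described above, using the 5 x 1 kernel mentioned in the text (the threshold and example data are illustrative assumptions):

```python
import numpy as np

def smoothing_method(image, kernel_width=5, feature_threshold=30):
    """For each pixel, take the difference between its luminance and the mean
    luminance of a small kernel around it (here kernel_width x 1 pixels along
    X); pixels whose |difference| reaches the threshold are flagged."""
    image = np.asarray(image, dtype=np.float64)
    pad = kernel_width // 2
    padded = np.pad(image, ((0, 0), (pad, pad)), mode="edge")
    # sliding mean of the kernel along the X direction
    kernel_mean = np.stack([padded[:, k:k + image.shape[1]]
                            for k in range(kernel_width)]).mean(axis=0)
    smoothing_profile = np.abs(image - kernel_mean)
    return smoothing_profile, smoothing_profile >= feature_threshold

# Example: one dark pixel inside a bright illumination band stands out against
# the local average and is flagged; uniform areas are not.
img = np.full((3, 9), 200.0)
img[1, 4] = 60.0
profile, flags = smoothing_method(img)
print(np.round(profile[1], 1))   # peak of 112 at the defect, ~28 beside it
print(flags[1])                  # only the defect pixel exceeds the threshold
```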
  • FIGS. 7A to 7C are diagrams for explaining the second edge profile method, which is another example of a defect detection algorithm.
  • FIG. 7A shows an example of a two-dimensional image D corresponding to the two-dimensional image data generated by the imaging device 5, wherein the upper side of the image is the downstream side in the transport direction Z and the lower side of the image is the upstream side in the transport direction Z. .
  • the direction parallel to the width direction of the sheet-like molded body 2 is defined as the X direction
  • the direction parallel to the longitudinal direction (direction parallel to the transport direction Z) of the sheet-shaped molded body 2 is defined as the Y direction.
  • a band-like bright region located in the center in the Y direction of the two-dimensional image D and extending in the X direction is the illumination image D1
  • a dark region existing inside the illumination image D1 is the first defective pixel group D21.
  • the bright region existing in the vicinity of the illumination image D1 is the second defective pixel group D22.
  • the processed image generation unit 61 first divides the two-dimensional image D into data of pixel columns one by one along the Y direction. Next, the processed image generation unit 61 shifts the edge of the data of each pixel column from one end in the Y direction (the upper end of the two-dimensional image D in FIG. 7A) to the other end (the lower end of the two-dimensional image D in FIG. 7A). Perform edge determination processing to search.
  • For the data of each pixel column, the processed image generation unit 61 sets the second pixel from one end in the Y direction as the target pixel, and determines whether or not the luminance value of the target pixel is greater than the luminance value of the adjacent pixel on the one end side by at least a predetermined threshold. If so, the processed image generation unit 61 determines that the adjacent pixel is the edge D3.
  • Otherwise, the processed image generation unit 61 repeats the edge determination process while shifting the target pixel one pixel at a time toward the other end in the Y direction, until it determines that the luminance value of the target pixel is larger than the luminance value of the adjacent pixel by at least the predetermined threshold.
  • In FIG. 7A, an example of the edge D3 detected by the edge determination process of the processed image generation unit 61 is indicated by a marker.
  • At positions where defective pixels exist, the coordinate value (Y coordinate value) of the edge D3 in the Y direction changes sharply.
  • the processed image generation unit 61 creates an edge profile P6 corresponding to the edge D3 in the two-dimensional image D.
  • the edge profile P6 corresponding to the edge D3 in the vicinity of the second defective pixel group D22 of the two-dimensional image D is shown enlarged.
  • the Y coordinate value changes extremely corresponding to the second defective pixel group D22 in the two-dimensional image D.
  • The processed image generation unit 61 selects two arbitrary points P61 and P62 on the created edge profile P6, and calculates, as a feature amount, the area of the region P63 enclosed by the straight line connecting the points P61 and P62 and the curve of the edge profile P6.
  • the processed image generation unit 61 extracts a pixel in the two-dimensional image D corresponding to a profile portion having a feature amount (area of the region P63) equal to or greater than a predetermined threshold as a defective pixel.
  • the processed image generation unit 61 creates an edge profile P7 corresponding to the edge D3 in the two-dimensional image D.
  • the edge profile P7 corresponding to the edge D3 in the vicinity of the second defective pixel group D22 of the two-dimensional image D is shown enlarged.
  • the Y coordinate value changes extremely corresponding to the second defective pixel group D22 in the two-dimensional image D.
  • the processed image generation unit 61 selects a point P71 and a point P72 that are arbitrary two points on the created edge profile P7, a tangent line P711 of the edge profile P7 at the point P71, and a tangent line P721 of the edge profile P7 at the point P72. Create Next, the processed image generation unit 61 calculates an angle ⁇ 1 formed between the virtual straight line P73 parallel to the X axis and the tangent line P711 and an angle ⁇ 2 formed between the virtual straight line P73 and the tangent line P721, and the calculated angle ⁇ 1. An angle ⁇ 3 that is a difference from the angle ⁇ 2 is obtained.
  • The processed image generation unit 61 uses the length of the arc P74 between the points P71 and P72 on the edge profile P7 and the angle θ3 to calculate, as a feature amount, the radius of curvature R of the arc P74 between the points P71 and P72 on the edge profile P7. Based on the edge profile P7, the processed image generation unit 61 extracts, as defective pixels, the pixels in the two-dimensional image D corresponding to the portions of the profile whose feature amount (curvature radius R) falls within a predetermined threshold range.
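  • The radius of curvature can be estimated from the arc length and the tangent-angle difference because, for a circular arc, s = R·θ. Below is a short Python sketch under the same assumptions as before (edge profile stored as Y coordinates indexed by X; function name illustrative).

```python
import numpy as np

def curvature_radius_feature(profile: np.ndarray, i1: int, i2: int) -> float:
    """Approximate radius of curvature R of the arc P74 of the edge profile
    between the two selected points, using R = s / theta3, where s is the arc
    length between the points and theta3 is the difference between the tangent
    angles at the two points (each measured against a line parallel to the X axis)."""
    slope = np.gradient(profile)                 # local slope dY/dX of the profile
    theta1 = np.arctan(slope[i1])                # tangent angle at P71
    theta2 = np.arctan(slope[i2])                # tangent angle at P72
    theta3 = abs(theta1 - theta2)                # angle between the two tangents
    dy = np.diff(profile[i1:i2 + 1])
    s = float(np.sum(np.sqrt(1.0 + dy ** 2)))    # arc length with unit X spacing
    return s / theta3 if theta3 > 1e-9 else float("inf")
```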
  • The defects that occur in the sheet-like molded body 2 include defects such as bubbles, fish eyes, foreign matter, tire marks, dents, scratches and so-called nicks, and line defects such as so-called raw-fabric streaks caused by differences in thickness.
  • the type of defect that can be extracted differs depending on the type of defect detection algorithm used when the processed image is generated by the processed image generation unit 61.
  • The edge profile method, which is an example of a defect detection algorithm, can extract defects such as foreign matter, tire marks, and scratches with high extraction ability.
  • the peak method can extract defects such as foreign matters, dents, and scratches with high extraction ability.
  • the smoothing method can extract defects such as bubbles, fish eyes, and dents with high extraction ability.
  • Defects such as raw-fabric streaks and nicks can also be extracted with high extraction ability.
  • Exploiting these differences in defect extraction ability among the types of defect detection algorithm, the processed image generation unit 61 calculates feature amounts by processing with a plurality of defect detection algorithms and extracts the defective pixels in the two-dimensional image using the calculated feature amounts, which makes it possible to distinguish the defect type of a defect area in the two-dimensional image generated by the imaging device 5.
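  • As an illustration only, the combination of algorithms that flag a given region could be mapped to a coarse defect type, loosely following the extraction abilities listed above; the rule table and algorithm names in the Python sketch below are hypothetical assumptions, not classification rules defined in this disclosure.

```python
# Hypothetical mapping from "which algorithms flagged this region" to a likely
# defect type, loosely based on the extraction abilities listed above.
DEFECT_TYPE_RULES = [
    ({"edge_profile", "peak"}, "foreign matter / scratch"),
    ({"peak", "smoothing"}, "dent"),
    ({"smoothing"}, "bubble / fish eye"),
]

def classify_defect(flagged_by: set[str]) -> str:
    """Return a coarse defect-type label for a defect region, given the set of
    defect detection algorithms whose feature amounts exceeded their thresholds."""
    for required, label in DEFECT_TYPE_RULES:
        if required <= flagged_by:
            return label
    return "unclassified"

print(classify_defect({"edge_profile", "peak"}))  # -> "foreign matter / scratch"
```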
  • FIGS. 8A and 8B are diagrams illustrating examples of the processed images E1 to E6 generated by the image processing device 6.
  • The processed image generation unit 61 of the image processing device 6 processes each two-dimensional image output from the imaging device 5 with the above-described defect detection algorithms to extract defective pixels, and then generates processed images E1 to E6 as shown in FIGS. 8A and 8B, one corresponding to each two-dimensional image.
  • In the processed images E1 to E6, the black portions represent residual pixels, which are portions without defects, and the white portions represent defective pixels, which are portions with defects.
  • The direction parallel to the width direction of the sheet-like molded body 2 is defined as the X direction, and the direction parallel to the longitudinal direction of the sheet-like molded body 2 (the direction parallel to the conveying direction Z) is defined as the Y direction.
  • The processed images E1 to E6 shown in FIGS. 8A and 8B are images composed of m pixels arranged in the X direction in the order 0, 1, 2, ..., m−2, m−1 from one end in the X direction (the left end of each processed image in FIGS. 8A and 8B) to the other end (the right end of each processed image in FIGS. 8A and 8B), and n pixels arranged in the Y direction in the order 0, 1, 2, ..., n−2, n−1 from one end in the Y direction (the upper end of each processed image in FIGS. 8A and 8B) to the other end (the lower end of each processed image in FIGS. 8A and 8B).
  • The processed image generation unit 61 sequentially generates the processed images, one corresponding to each two-dimensional image captured by the imaging device 5 at a predetermined time interval, in the imaging order: processed image E1, processed image E2, processed image E3, processed image E4, processed image E5, and processed image E6.
  • The size and shape of the processed images E1 to E6 generated by the processed image generation unit 61 are the same as the size and shape of each two-dimensional image, and the processed image position coordinates representing the position of each pixel constituting the processed images E1 to E6 match the coordinate values representing the position, in the corresponding two-dimensional image, of each pixel constituting that two-dimensional image.
  • The processed images E1 to E6 generated by the processed image generation unit 61 have overlapping portions in which two processed images whose generation order is consecutive at least partially overlap: between the processed image E1 and the processed image E2, between the processed image E2 and the processed image E3, between the processed image E3 and the processed image E4, between the processed image E4 and the processed image E5, and between the processed image E5 and the processed image E6.
  • the processed images E1 to E6 generated by the processed image generation unit 61 are input to the image analysis device 7.
  • The image analysis apparatus 7 provided in the defect inspection apparatus 100 of the present embodiment combines the plurality of processed images E1 to E6 generated by the processed image generation unit 61 to generate a defect map image F, as shown in FIG. 9, which expresses the distribution of defects in the sheet-like molded body 2.
  • FIG. 9 is a diagram illustrating an example of the defect map image F generated by the image analysis device 7.
  • In the defect map image F, the black portions represent residual pixels, which are portions without defects, and the white portions represent defective pixels, which are portions with defects.
  • The direction parallel to the width direction of the sheet-like molded body 2 is defined as the X direction, and the direction parallel to the longitudinal direction of the sheet-like molded body 2 (the direction parallel to the transport direction Z) is defined as the Y direction.
  • The defect map image F shown in FIG. 9 is an image composed of t pixels arranged in the X direction in the order 0, 1, 2, ..., t−2, t−1 from one end in the X direction (the left end of the defect map image F in FIG. 9) to the other end (the right end of the defect map image F in FIG. 9), and u pixels arranged in the Y direction in the order 0, 1, 2, ..., u−2, u−1 from one end in the Y direction (the upper end of the defect map image F in FIG. 9) to the other end (the lower end of the defect map image F in FIG. 9).
  • the image analysis device 7 includes a processed image input unit 71, a defect map image generation unit 72, and a control unit 73.
  • the processed image input unit 71 inputs the processed images E1 to E6 output from the processed image generation unit 61 of the image processing device 6.
  • the defect map image generation unit 72 is a part that generates the defect map image F, and includes a coordinate value calculation unit 721 that is a defect map image coordinate value calculation unit, an integration unit 722, and a luminance value setting unit 723.
  • Based on the coordinate values (hereinafter referred to as “processed image position coordinates”) of the pixels constituting the processed images E1 to E6 sequentially generated by the processed image generation unit 61, the coordinate value calculation unit 721 calculates the coordinate value (hereinafter referred to as “defect map image position coordinates”) of each pixel constituting the defect map image F.
  • Specifically, the coordinate value calculation unit 721 calculates the defect map image position coordinates (X_t, Y_u) corresponding to the pixel whose processed image position coordinates are (X_N, Y_N) according to the following formula (3); a hypothetical sketch of this kind of transformation is given after the variable definitions below.
  • N indicates the generation order of processed images
  • LS indicates the conveyance speed (mm / second) of the sheet-like molded body 2 by the conveyance device 3
  • FR indicates the imaging operation by the imaging device 5.
  • RS indicates the resolution (mm / pixel) of the imaging device 5.
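  • Formula (3) itself appears only as an image in the original publication and is not reproduced in this text. The Python sketch below is therefore only a hypothetical stand-in that combines the variables listed above in a dimensionally consistent way (treating FR as the time between captures in seconds, which is an assumption, since its exact definition is truncated above); it is not the actual formula (3).

```python
def to_defect_map_coords(x_n: int, y_n: int, n: int,
                         ls: float, fr: float, rs: float) -> tuple[int, int]:
    """Hypothetical mapping from the processed image position coordinates
    (x_n, y_n) of the N-th processed image to defect map image position
    coordinates (x_t, y_u): the X coordinate is kept and the Y coordinate is
    shifted by the distance the sheet travels between captures, in pixels.
    ls: conveyance speed (mm/s), fr: assumed capture interval (s),
    rs: resolution (mm/pixel)."""
    offset_px = round(n * ls * fr / rs)  # sheet travel over n intervals, in pixels
    return x_n, y_n + offset_px
```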
  • the accumulating unit 722 performs either (1) or (2) below or both (1) and (2) below.
  • (1) For each pixel of the defect map image F, the number of defective pixels among the corresponding pixels in the processed images is counted.
  • (2) For each pixel of the defect map image F, the sum of the gradation values assigned to the corresponding pixels in the processed images is calculated.
  • As described above, the processed images E1 to E6 generated by the processed image generation unit 61 have overlapping portions in which two processed images whose generation order is consecutive at least partially overlap. Therefore, the processing in the coordinate value calculation unit 721 may calculate the same defect map image position coordinates from pixels of a plurality of processed images.
  • Note that the same defect map image position coordinates are not necessarily calculated from two or more processed images for every pixel of the defect map image F; that is, for each pixel of the defect map image F, there exist one or more corresponding pixels in the processed images.
  • In the process (1), the number of defective pixels among these one or more corresponding pixels in the processed images is counted.
  • As described above, a processed image has defective pixels and residual pixels; each defective pixel is assigned a gradation value corresponding to its feature amount, and each residual pixel is assigned a gradation value of zero.
  • In the process (2), the sum of the gradation values assigned to these one or more corresponding pixels in the processed images is calculated.
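  • A minimal Python sketch of the accumulation in (1) and (2) follows, assuming each processed image is available as a list of (x_n, y_n, gradation) tuples with gradation 0 for residual pixels, and that a coordinate mapping function (for example, a wrapper around the hypothetical sketch given earlier) is supplied; the data layout and names are assumptions.

```python
from collections import defaultdict

def accumulate(processed_images, to_map_coords):
    """For every defect map image position coordinate, count the defective
    pixels among the corresponding processed-image pixels (process (1)) and
    sum their gradation values (process (2))."""
    defect_count = defaultdict(int)     # result of process (1)
    gradation_sum = defaultdict(float)  # result of process (2)
    for n, image in enumerate(processed_images):
        for x_n, y_n, gradation in image:
            key = to_map_coords(x_n, y_n, n)
            if gradation > 0:           # defective pixel (residual pixels hold 0)
                defect_count[key] += 1
            gradation_sum[key] += gradation
    return defect_count, gradation_sum
```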
  • As the luminance value of each pixel of the defect map image F represented by the defect map image position coordinates calculated by the coordinate value calculation unit 721, the luminance value setting unit 723 sets a value calculated based on the number of defective pixels obtained in (1) by the accumulating unit 722 and/or the sum of the gradation values obtained in (2).
  • For example, the luminance value setting unit 723 sets, as the luminance value of each pixel of the defect map image F, a value obtained by multiplying the average of the luminance values of the pixels in the processed images E1 to E6 corresponding to that pixel of the defect map image F by the number of defective pixels obtained in (1).
  • Alternatively, the luminance value setting unit 723 may set the sum of the gradation values obtained in (2), or the number of defective pixels obtained in (1), as the luminance value of each pixel of the defect map image F.
  • When the luminance value setting unit 723 sets the luminance value of each pixel constituting the defect map image F as described above, the difference in luminance value between the defective pixels and the residual pixels becomes large, and as a result the defective pixels become clearer in the defect map image F. Further, in the defect map image F, the luminance value of a defective pixel becomes larger at defect map image position coordinates for which the number of defective pixels obtained in (1) and the sum of the gradation values obtained in (2) are larger, so the degree of sharpness can be made to differ between defective pixels.
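  • Continuing the accumulation sketch above, the luminance value of each defect map image pixel can then be set in any of the variants just described; the mode names and the `mean_luminance` lookup in the Python sketch below are illustrative assumptions.

```python
def set_luminance(defect_count, gradation_sum, mean_luminance, mode="count_x_mean"):
    """Set the luminance value of each defect map image pixel from the
    accumulated results: the average corresponding luminance multiplied by the
    defective-pixel count, the gradation-value sum, or the count itself."""
    luminance = {}
    for key in set(defect_count) | set(gradation_sum):
        if mode == "count_x_mean":
            luminance[key] = mean_luminance.get(key, 0.0) * defect_count.get(key, 0)
        elif mode == "gradation_sum":
            luminance[key] = gradation_sum.get(key, 0.0)
        else:  # mode == "count"
            luminance[key] = defect_count.get(key, 0)
    return luminance
```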
  • the defect map image F generated by the defect map image generation unit 72 is input to the control unit 73.
  • the control unit 73 outputs the input defect map image F to the display unit 21.
  • the display unit 21 is, for example, a liquid crystal display, an EL (Electroluminescence) display, a plasma display, or the like.
  • the display unit 21 displays the defect map image F generated by the defect map image generation unit 72 on the display screen.
  • In the defect inspection apparatus 100, the defect map image F used to inspect defects in the sheet-like molded body 2 is generated based on the two-dimensional images of the sheet-like molded body 2 generated by the imaging device 5, so a high defect detection capability can be maintained compared with the case where, for example, an image for inspecting defects is generated based on a plurality of one-dimensional images from a line sensor.
  • In addition, the defect map image position coordinates representing the position of each pixel in the defect map image F are calculated from the coordinate values of the pixels of each of the processed images E1 to E6 according to the above formula (3), and the luminance value of each pixel of the defect map image F is set based on the number of defective pixels, among the plurality of processed images E1 to E6, for which the same defect map image position coordinates are calculated, and on the sum of the gradation values of those defective pixels. Therefore, by using the defect map image F to inspect the sheet-like molded body 2 for defects, the position of a defect in the sheet-like molded body 2 can be inspected accurately with a high detection capability.
  • Further, since the display unit 21 displays the defect map image F generated by the defect map image generation unit 72, a user can confirm the position of defects in the sheet-like molded body 2 by looking at the defect map image F displayed by the display unit 21.
  • FIGS. 10A and 10B show processed images G1 to G13, which are other examples of processed images generated by the image processing device 6, and a defect map image H, which is another example of the defect map images generated by the image analyzing device 7.
  • FIGS. 11A and 11B are diagrams illustrating the operation of the image analysis device 7 when the defect map image J is generated by sequentially laying out the processed images G1 to G13, which consist of one-dimensional images generated by the image processing device 6.
  • In the processed images G1 to G13 shown in FIGS. 10A and 11A, the defect map image H shown in FIG. 10B, and the defect map image J shown in FIG. 11B, the black portions represent residual pixels, which are portions without defects, and the white portions represent defective pixels, which are portions with defects.
  • In the above description, the processed image generation unit 61 generates the processed images E1 to E6, which have the same size and shape as the two-dimensional images generated by the imaging device 5, one corresponding to each two-dimensional image; however, the present invention is not limited to this configuration.
  • For example, the processed image generation unit 61 may extract the boundary region portion between the bright part and the dark part of the illumination image in each two-dimensional image generated by the imaging device 5 and generate processed images G1 to G13 each consisting of a one-dimensional image, as illustrated in FIG. 10A. The defective pixels may be extracted from the two-dimensional images generated by the imaging device 5 using the defect detection algorithms to generate these processed images G1 to G13 consisting of one-dimensional images.
  • Each pixel constituting the processed images G1 to G13 generated by the processed image generation unit 61 is composed of a bit string in which a coordinate information storage bit string, which stores information on the processed image position coordinates, is appended to a luminance value information storage bit string, which stores luminance value information representing the luminance value.
  • In the coordinate information storage bit string of each pixel constituting the processed images G1 to G13, information on the coordinate values corresponding to the coordinates of the corresponding pixel constituting the two-dimensional image generated by the imaging device 5 is stored as the processed image position coordinate information.
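  • A small Python sketch of such a per-pixel bit string is shown below; the field widths (8 bits of luminance, 16 bits for each coordinate) are illustrative assumptions, since the disclosure does not specify a bit layout.

```python
def pack_pixel(luminance: int, x_n: int, y_n: int) -> int:
    """Append a coordinate information storage bit string (x_n, y_n) to a
    luminance value information storage bit string, producing one integer."""
    assert 0 <= luminance < (1 << 8) and 0 <= x_n < (1 << 16) and 0 <= y_n < (1 << 16)
    return (luminance << 32) | (x_n << 16) | y_n

def unpack_pixel(bits: int) -> tuple[int, int, int]:
    """Recover (luminance, x_n, y_n) from the packed bit string."""
    return bits >> 32, (bits >> 16) & 0xFFFF, bits & 0xFFFF
```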
  • If the defect map image generation unit 72 were to generate a defect map image by sequentially laying out the plurality of processed images G1 to G13 generated by the processed image generation unit 61 in the order of generation, a defect map image J as shown in FIG. 11B would be generated, in which a plurality of defective pixels indicating the same defect appear in the single defect map image J.
  • When such a defect map image J is used to inspect the sheet-like molded body 2 for defects, it is difficult to accurately grasp the position of a defect in the sheet-like molded body 2, and moreover the same defect is detected repeatedly.
  • Therefore, the defect map image generation unit 72 combines the plurality of processed images G1 to G13 generated by the processed image generation unit 61 to generate a defect map image H as shown in FIG. 10B.
  • The coordinate value calculation unit 721 of the defect map image generation unit 72 calculates, according to the above formula (3), the defect map image position coordinates representing the position, in the defect map image H, of each pixel constituting the defect map image H, based on the processed image position coordinate information stored in the coordinate information storage bit string of each pixel constituting the processed images G1 to G13.
  • The accumulating unit 722 then obtains the number of defective pixels and/or the sum of the gradation values among the pixels of the plurality of processed images G1 to G13 for which the coordinate value calculation unit 721 has calculated the same defect map image position coordinates. The luminance value setting unit 723 then calculates and sets the luminance value of each pixel of the defect map image H represented by the defect map image position coordinates calculated by the coordinate value calculation unit 721, based on the number of defective pixels and/or the sum of the gradation values obtained by the accumulating unit 722.
  • In this way, since the defect map image position coordinates indicating the position of each pixel in the defect map image H are calculated based on the processed image position coordinate information stored in the coordinate information storage bit string of each pixel of the processed images G1 to G13, the sheet-like molded body 2 can be inspected for defects using the defect map image H, and the position of a defect in the sheet-like molded body 2 can be inspected accurately with a high detection capability. Moreover, since the same defect appears at only one place in the defect map image H, duplicate detection of the same defect can be prevented.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Textile Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
PCT/JP2014/052371 2013-01-30 2014-01-28 画像生成装置、欠陥検査装置および欠陥検査方法 WO2014119772A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201480006156.9A CN104956210B (zh) 2013-01-30 2014-01-28 图像生成装置、缺陷检查装置以及缺陷检查方法
KR1020157017615A KR102168143B1 (ko) 2013-01-30 2014-01-28 화상 생성 장치, 결함 검사 장치 및 결함 검사 방법
JP2014559794A JP6191627B2 (ja) 2013-01-30 2014-01-28 画像生成装置、欠陥検査装置および欠陥検査方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013015641 2013-01-30
JP2013-015641 2013-01-30

Publications (1)

Publication Number Publication Date
WO2014119772A1 true WO2014119772A1 (ja) 2014-08-07

Family

ID=51262469

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/052371 WO2014119772A1 (ja) 2013-01-30 2014-01-28 画像生成装置、欠陥検査装置および欠陥検査方法

Country Status (5)

Country Link
JP (1) JP6191627B2 (ko)
KR (1) KR102168143B1 (ko)
CN (1) CN104956210B (ko)
TW (1) TWI608230B (ko)
WO (1) WO2014119772A1 (ko)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106503724A (zh) * 2015-09-04 2017-03-15 佳能株式会社 分类器生成装置、有缺陷/无缺陷确定装置和方法
WO2017043270A1 (ja) * 2015-09-09 2017-03-16 住友電装株式会社 端子付電線の検査方法及び端子付電線検査装置
JP2017215277A (ja) * 2016-06-02 2017-12-07 住友化学株式会社 欠陥検査システム、フィルム製造装置及び欠陥検査方法
JP2019074354A (ja) * 2017-10-13 2019-05-16 王子ホールディングス株式会社 衛生用紙の製造方法及び欠陥検査装置
JP7141772B1 (ja) 2021-12-02 2022-09-26 株式会社岩崎電機製作所 画像検査装置、画像検査方法、および、画像検査プログラム

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6358351B1 (ja) * 2017-03-21 2018-07-18 Jfeスチール株式会社 表面欠陥検査方法及び表面欠陥検査装置
JP6676573B2 (ja) * 2017-03-29 2020-04-08 Ckd株式会社 検査装置及び巻回装置
JP6970549B2 (ja) * 2017-07-24 2021-11-24 住友化学株式会社 欠陥検査システム及び欠陥検査方法
JP6970550B2 (ja) * 2017-07-24 2021-11-24 住友化学株式会社 欠陥検査システム及び欠陥検査方法
JP7067321B2 (ja) * 2018-06-29 2022-05-16 オムロン株式会社 検査結果提示装置、検査結果提示方法及び検査結果提示プログラム
CN109030499B (zh) * 2018-07-27 2021-08-24 江苏理工学院 一种适用于目标缺陷连续在线检测防止缺陷数目重复计数的装置及方法
JP7007324B2 (ja) * 2019-04-25 2022-01-24 ファナック株式会社 画像処理装置、画像処理方法、及びロボットシステム
CN111047561A (zh) * 2019-11-22 2020-04-21 国网江西省电力有限公司电力科学研究院 一种复合绝缘子伞裙龟裂纹的识别方法
CN111044522B (zh) * 2019-12-14 2022-03-11 中国科学院深圳先进技术研究院 缺陷检测方法、装置及终端设备
CN111692998B (zh) * 2020-06-11 2022-02-11 西格迈股份有限公司 一种活塞杆表面粗糙度检测系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH095255A (ja) * 1995-06-26 1997-01-10 Hitachi Ltd 欠陥検査方法およびその装置並びに薄膜磁気ヘッド用の素子の製造方法
JP2003098106A (ja) * 2001-09-27 2003-04-03 Dainippon Printing Co Ltd 印刷欠陥表示方法および装置
JP2006337167A (ja) * 2005-06-01 2006-12-14 Fast:Kk 周期性ノイズ下での低コントラスト欠陥検査方法、繰返しパターン下での低コントラスト欠陥検査方法
JP2007218629A (ja) * 2006-02-14 2007-08-30 Sumitomo Chemical Co Ltd 欠陥検査装置及び欠陥検査方法
WO2010058557A1 (ja) * 2008-11-21 2010-05-27 住友化学株式会社 成形シートの欠陥検査装置
WO2012035852A1 (ja) * 2010-09-15 2012-03-22 株式会社日立ハイテクノロジーズ 欠陥検査方法及びその装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09159622A (ja) * 1995-12-05 1997-06-20 Kawasaki Steel Corp 表面欠陥検査装置
JP5415709B2 (ja) * 2008-03-31 2014-02-12 住友化学株式会社 偏光フィルムの仕分けシステム
JP4726983B2 (ja) * 2009-10-30 2011-07-20 住友化学株式会社 欠陥検査システム、並びに、それに用いる、欠陥検査用撮影装置、欠陥検査用画像処理装置、欠陥検査用画像処理プログラム、記録媒体、および欠陥検査用画像処理方法
JP2011163852A (ja) * 2010-02-08 2011-08-25 Kobe Steel Ltd 外観検査装置

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106503724A (zh) * 2015-09-04 2017-03-15 佳能株式会社 分类器生成装置、有缺陷/无缺陷确定装置和方法
WO2017043270A1 (ja) * 2015-09-09 2017-03-16 住友電装株式会社 端子付電線の検査方法及び端子付電線検査装置
JP2017215277A (ja) * 2016-06-02 2017-12-07 住友化学株式会社 欠陥検査システム、フィルム製造装置及び欠陥検査方法
JP2019074354A (ja) * 2017-10-13 2019-05-16 王子ホールディングス株式会社 衛生用紙の製造方法及び欠陥検査装置
JP7141772B1 (ja) 2021-12-02 2022-09-26 株式会社岩崎電機製作所 画像検査装置、画像検査方法、および、画像検査プログラム
JP2023082341A (ja) * 2021-12-02 2023-06-14 株式会社岩崎電機製作所 画像検査装置、画像検査方法、および、画像検査プログラム

Also Published As

Publication number Publication date
KR20150114464A (ko) 2015-10-12
CN104956210A (zh) 2015-09-30
JP6191627B2 (ja) 2017-09-06
CN104956210B (zh) 2017-04-19
JPWO2014119772A1 (ja) 2017-01-26
KR102168143B1 (ko) 2020-10-20
TWI608230B (zh) 2017-12-11
TW201435334A (zh) 2014-09-16

Similar Documents

Publication Publication Date Title
JP6191627B2 (ja) 画像生成装置、欠陥検査装置および欠陥検査方法
JP6191622B2 (ja) 画像生成装置、欠陥検査装置および欠陥検査方法
JP5619348B2 (ja) 成形シートの欠陥検査装置
JP5643918B2 (ja) 欠陥検査装置および欠陥検査方法
JP5006551B2 (ja) 欠陥検査装置及び欠陥検査方法
KR101593251B1 (ko) 편광 필름 분류 시스템 및 분류 방법
JP2013140050A (ja) 欠陥検査装置および欠陥検査方法
JP4796860B2 (ja) オブジェクト検出装置及びオブジェクト検出方法
JP6191623B2 (ja) 画像生成装置、欠陥検査装置および欠陥検査方法
JP5796430B2 (ja) 板ガラス検査装置、板ガラス検査方法、板ガラス製造装置、及び板ガラス製造方法
JP4707055B2 (ja) 枚葉フィルム断裁配置評価装置及び枚葉フィルム断裁配置決定方法
JP4748572B2 (ja) 欠陥マーキング装置及び欠陥マーキング方法、並びに欠陥マーキング評価装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14745445

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014559794

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20157017615

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14745445

Country of ref document: EP

Kind code of ref document: A1