WO2011052332A1 - Image Processing Device for Defect Inspection and Image Processing Method for Defect Inspection - Google Patents
Image processing device for defect inspection and image processing method for defect inspection
- Publication number
- WO2011052332A1 (PCT application PCT/JP2010/066934)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- line
- data
- image
- inspection object
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/89—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
- G01N21/892—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles characterised by the flaw, defect or object feature examined
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/89—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
- G01N21/8901—Optical details; Scanning details
- G01N21/8903—Optical details; Scanning details using a multiple detector array
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8822—Dark field detection
- G01N2021/8825—Separate detection of dark field and bright field
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
Definitions
- The present invention relates to a defect inspection system for inspecting defects of an inspection object such as a sheet, and to a defect inspection imaging device, a defect inspection image processing device, a defect inspection image processing program, a computer-readable recording medium on which the program is recorded, and a defect inspection image processing method used therein.
- The arrangement of the optical system shown in FIG. 15(a) is called the regular transmission method, and the arrangement shown in FIG. 15(b) is called the transmission scattering method.
- A method of measuring transmitted light, such as the regular transmission method or the transmission scattering method, is used when inspecting an inspection object 502 having high light transmittance.
- The arrangement of the optical system shown in FIG. 15(c) is called the regular reflection method, and the arrangement shown in FIG. 15(d) is called the reflection/scattering method.
- A method of measuring reflected light, such as the regular reflection method or the reflection/scattering method, is used when inspecting an inspection object 502 having low light transmittance.
- A method in which the line sensor 501 is disposed on the optical axis of the light emitted from the light source 503 and the non-scattered light (regularly transmitted light or regularly reflected light) from the inspection object 502 is measured by the line sensor 501 is also called a bright field method.
- In another arrangement, the line sensor 501 is shifted off the optical axis of the light emitted from the light source 503 so that non-scattered light from the inspection object 502 does not directly enter the line sensor 501.
- Alternatively, a light shield (knife edge) 504 is disposed between the light source 503 and the inspection object 502 so that the line sensor 501 is focused on the edge of the light shield 504, and the scattered light (scattered transmitted light or scattered reflected light) from the inspection object 502 is detected.
- A method of measuring such scattered light with the line sensor 501 is also referred to as a dark field method or an optical-axis-shifting method.
- The light shield 504 may be omitted, with the line sensor 501 simply arranged so that non-scattered light from the inspection object 502 does not directly enter it.
- In the bright field method, the amount of non-scattered light received by the line sensor 501 decreases when the light from the light source 503 is scattered by a defect of the inspection object 502.
- The presence or absence of a defect in the inspection object 502 is determined from the amount of this decrease (change) in the light received by the line sensor 501.
- Because its detection sensitivity is low, the bright field method is suited to detecting relatively large defects that cause a large decrease in received light.
- On the other hand, since the optical system can be arranged more easily than in the dark field method, operation is stable and practical application is simple.
- In the dark field method, the line sensor 501 receives light scattered by a defect of the inspection object 502, and the presence or absence of a defect is determined from the amount of received light.
- The dark field method has higher defect detection sensitivity than the bright field method and can detect minute irregularities (defects).
- However, because the optical system (line sensor 501, light source 503, and light shield 504) must be arranged with high accuracy, its practical application is limited.
- Nevertheless, the dark field method is often used in defect inspection apparatuses.
- The dark field method thus has the problem that it is difficult to inspect defects of an inspection object with high accuracy, because the optical system is difficult to arrange in practice as described above.
- A technique for addressing this problem is disclosed in Patent Document 1, which defines the size of a light shield appropriate for the arrangement of the optical system.
- JP 2007-333563 A, published December 27, 2007
- JP 2008-292171 A, published December 4, 2008
- Patent Document 1 also describes that, in order to detect a defect causing small optical distortion, the line-shaped transmission illumination device and the light receiving means must be brought close to each other (paragraph [0009]). The conventional technology described above therefore has the problem that it is difficult to inspect, with sufficient accuracy and in a single pass, various types of defects that cause different changes in the ray path.
- In a defect inspection apparatus using the dark field method, the optical system must be arranged with high accuracy as described above, so it is difficult in practice to change the arrangement of the optical system and the size of the light shield according to the type of defect. In conventional dark field inspection apparatuses, therefore, an arrangement of the optical system and a size of the light shield capable of detecting a specific, relatively conspicuous type of defect are selected and used, with the result that other types of defects sometimes cannot be detected with sufficient accuracy.
- The present invention has been made in view of the above problems, and its object is to realize a defect inspection system capable of detecting, with sufficient accuracy and at one time, various types of defects that cause different changes in the ray path, as well as a defect inspection imaging apparatus, a defect inspection image processing apparatus, a defect inspection image processing program, a computer-readable recording medium on which the program is recorded, and a defect inspection image processing method.
- The image processing apparatus for defect inspection according to the present invention processes image data of two-dimensional images of an inspection object captured continuously in time by an imaging unit while the inspection object and the imaging unit are moved relative to each other, thereby generating defect inspection image data for inspecting defects of the inspection object.
- The apparatus includes same-line extracting means for extracting, from each of a plurality of different image data, one line of data at the same position on the image data, and line synthesizing means for arranging the line data extracted by the same-line extracting means in time series to generate line composite image data composed of a plurality of lines.
- The same-line extracting means extracts the line data for each of a plurality of different positions on the image data, and the line synthesizing means generates a plurality of different line composite image data by arranging, for each position on the image data, the extracted line data in time series.
- The apparatus further includes operator calculation means for performing, on each of the plurality of line composite image data, a calculation using an operator that emphasizes luminance changes, thereby generating a plurality of emphasized image data of one or more lines each, and integrating means for integrating, pixel by pixel, the luminance values of the plurality of emphasized image data indicating the same portion of the inspection object to generate the defect inspection image data.
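The same-line extraction and line synthesis described above can be sketched as follows (a minimal illustration in Python/NumPy; the function name, array shapes, and the use of image rows as "lines" are assumptions for illustration, not taken from the patent):

```python
import numpy as np

def line_composite_images(frames, line_positions):
    """Same-line extraction + line synthesis (illustrative sketch).

    frames: array of shape (T, H, W) -- T two-dimensional images captured
        continuously in time while the object and camera move relatively.
    line_positions: row indices ("same position on the image data").

    Returns {row index: array of shape (T, W)} -- for each position, the
    same image row extracted from every frame and stacked in time order,
    i.e. one line composite image per position.
    """
    frames = np.asarray(frames)
    composites = {}
    for y in line_positions:
        # Same-line extraction: take row y from every frame; stacking the
        # T extracted lines in time series forms the composite image.
        composites[y] = frames[:, y, :].copy()
    return composites

# Tiny demonstration: 4 frames of a 3x5 image.
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(4, 3, 5))
lci = line_composite_images(frames, line_positions=[0, 1, 2])
print(lci[1].shape)  # (4, 5): four extracted lines stacked in time order
```

Each dictionary entry plays the role of one "line composite image data"; the subsequent operator calculation and integration operate on these arrays.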
- The image processing method for defect inspection according to the present invention likewise processes image data of two-dimensional images of the inspection object captured continuously in time by the imaging unit while the inspection object and the imaging unit are moved relative to each other.
- The method includes a same-line extracting step of extracting, from each of a plurality of different image data, one line of data at the same position on the image data, and a line synthesizing step of arranging the line data extracted in the same-line extracting step in time series to generate line composite image data composed of a plurality of lines.
- Line data is extracted for each of a plurality of different positions on the image data, and in the line synthesizing step the extracted line data is arranged in time series for each position to generate a plurality of different line composite image data. The method further includes an operator calculation step of performing, on each of the plurality of line composite image data, a calculation using an operator that emphasizes luminance changes to generate a plurality of emphasized image data of one or more lines each, and an integrating step of integrating, pixel by pixel, the luminance values of the plurality of emphasized image data indicating the same portion of the inspection object to generate the defect inspection image data.
- According to the above configuration, one line of data at the same position on the image data is extracted from each of a plurality of different two-dimensional images of the inspection object captured continuously in time by the imaging unit, and this extraction is performed in the same way for a plurality of different positions on the image data.
- The extracted line data is arranged in time series for each position on the image data, generating a plurality of different line composite image data each composed of a plurality of lines. Because the inspection object and the imaging unit are moved relative to each other, these different line composite image data correspond to image data captured at different shooting angles with respect to the inspection object.
- By generating the line composite image data, a plurality of image data captured at different shooting angles with respect to the inspection object can therefore be obtained without changing the shooting angle of the imaging unit. Line composite image data captured at the shooting angles optimal for inspecting each of various types of defects that cause different changes in the ray path can thus be obtained, and by referring to these data, such defects can be detected at one time with sufficient accuracy. Moreover, even if the optical system is not arranged with high accuracy, some of the obtained line composite image data is equivalent to the image data that would be obtained with a highly accurate arrangement, so defects can still be detected with high accuracy.
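The correspondence between line composite images and shooting angles can be seen in a toy model (an assumption for illustration: the sheet advances exactly one image line per frame; a real system synchronizes frame rate and conveyance speed):

```python
import numpy as np

# In this model, row y of frame t images object line t + y, so the
# composite built from a fixed row y re-images the whole sheet at the
# viewing angle corresponding to that row.
num_frames, height, width = 6, 3, 4
object_lines = np.arange(num_frames + height)   # labels of the sheet's lines

frames = np.empty((num_frames, height, width), dtype=int)
for t in range(num_frames):
    for y in range(height):
        frames[t, y, :] = object_lines[t + y]   # frame t, row y sees line t+y

composite_row0 = frames[:, 0, 0]   # composite from row 0
composite_row2 = frames[:, 2, 0]   # composite from row 2 (a different angle)
print(composite_row0)  # [0 1 2 3 4 5] -- sheet lines seen at the row-0 angle
print(composite_row2)  # [2 3 4 5 6 7] -- same sheet, seen at the row-2 angle
```

Both composites sweep over the sheet, but each does so at the fixed viewing geometry of its row, which is exactly why the different composites behave like images taken at different shooting angles.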
- The operator calculation means performs, on each of the plurality of line composite image data, a calculation using an operator that emphasizes luminance changes, generating emphasized image data of one or more lines for each. Because the luminance change at each pixel is emphasized, minute defects, thin defects, light defects, and the like become easy to detect.
- The defect inspection image data is generated by integrating, pixel by pixel, the luminance values of the plurality of emphasized image data indicating the same portion of the inspection object; this integration also reduces noise.
- The method for obtaining the plurality of emphasized image data indicating the same portion of the inspection object is not particularly limited. For example: (1) before the same-line extraction, line data indicating the same location is identified in each of the image data and given an identifier indicating that location; after the operator calculation and before the integration, the emphasized image data indicating the same portion of the inspection object is then extracted from the plurality of emphasized image data based on the identifier.
- (2) Before the same-line extraction, line data indicating the same location is identified and given an identifier as above; after the line synthesis and before the operator calculation, the line composite image data indicating the same portion of the inspection object is extracted based on the identifier, and the operator calculation is performed on the extracted data to generate the plurality of emphasized image data indicating the same portion.
- (3) After the operator calculation and before the integration, the emphasized image data indicating the same location is identified and extracted from the plurality of different emphasized image data.
- (4) After the line synthesis and before the operator calculation, the line composite image data indicating the same location is identified and extracted from the plurality of different line composite image data, and the operator calculation is performed on the extracted data to generate the plurality of emphasized image data indicating the same portion of the inspection object.
- Preferably, the operator calculation means performs a calculation using a differential operator on each of the plurality of line composite image data: at each pixel in the center line of the line composite image data, the gradient of the luminance value along the direction orthogonal to the center line is calculated, and the luminance value of each pixel in the center line is replaced with the absolute value of that gradient, generating new one-line emphasized image data.
- According to the above configuration, the operator calculation means performs the calculation using the differential operator on the line composite image data: the gradient of the luminance value along the direction orthogonal to the center line is calculated, and the luminance value of each pixel in the center line is replaced with the absolute value of that gradient, so that new one-line emphasized image data is generated.
- By taking the absolute value, the data can be processed as data indicating a defect regardless of whether the gradient of the luminance value is positive or negative.
- Since defects appearing on the bright side and defects appearing on the dark side can thus be handled in the same manner, defects can be detected with high accuracy even if the optical system is not arranged with high accuracy. In addition, because the defect inspection image data is generated by adding the luminance values of the plurality of emphasized image data pixel by pixel, the data indicating defects are added without cancelling each other. It is therefore possible to detect even defects that appear on the bright side or on the dark side depending on the arrangement of the optical system (experience shows that many such defects actually exist).
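The differential-operator step can be sketched as follows (the patent does not fix a particular kernel, so the central difference below is one plausible choice; all names are illustrative):

```python
import numpy as np

def enhance_center_line(lci):
    """Operator calculation (sketch): gradient orthogonal to the center line.

    lci: line composite image of shape (L, W) with L odd. The luminance
    gradient along the direction orthogonal to the center line (i.e.
    across the stacked lines) is computed by a central difference, and
    the center line is replaced by the gradient's absolute value.
    """
    c = lci.shape[0] // 2                        # index of the center line
    grad = (lci[c + 1].astype(float) - lci[c - 1].astype(float)) / 2.0
    return np.abs(grad)                          # one-line emphasized image

lci = np.array([[10, 10, 10],
                [10, 50, 10],
                [10, 10, 90]])
print(enhance_center_line(lci))  # [ 0.  0. 40.]
```

Taking the absolute value makes bright-side and dark-side defects (positive and negative gradients) contribute identically, as described above.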
- Preferably, the integrating means integrates, pixel by pixel and for each portion of the inspection object, the luminance values of the plurality of emphasized image data indicating that portion, generating a plurality of defect inspection image data respectively indicating a plurality of portions of the inspection object, and the apparatus includes image generation means for arranging these defect inspection image data in correspondence with the locations of the inspection object and synthesizing them into new defect inspection image data.
- According to the above configuration, the image generation means arranges the image data in correspondence with the locations of the inspection object and synthesizes new defect inspection image data. Since the position within the synthesized defect inspection image data corresponds to the location on the inspection object, it is easy to determine at which position on the inspection object a defect exists.
- Preferably, every time the imaging unit captures an image, the integrating means integrates, pixel by pixel and in order from the leading location of the inspection object, the luminance values of the plurality of emphasized image data indicating the same location, generating a plurality of defect inspection image data respectively indicating a plurality of locations of the inspection object.
- According to the above configuration, every time the imaging unit captures an image, the integrating means integrates, pixel by pixel and in order from the leading location of the inspection object, the luminance values of the plurality of emphasized image data indicating the same location, generating defect inspection image data for each location.
- Defect inspection image data can therefore be generated from the emphasized image data each time the imaging unit captures an image, so an image for identifying the presence or absence of a defect can be output frame by frame and the defect inspection can be performed in real time.
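The integration step adds the luminance values pixel by pixel, which, as noted above, lets defect signals accumulate without cancelling (a sketch; the three input lines are hypothetical values):

```python
import numpy as np

def integrate_enhanced(enhanced_stack):
    """Integrating means (sketch): per-pixel sum of the emphasized image
    data that show the same portion of the inspection object."""
    return np.sum(np.asarray(enhanced_stack, dtype=float), axis=0)

# Three one-line emphasized images of the same portion, obtained from
# line composite images at (effectively) different shooting angles.
e1 = np.array([0.0, 4.0, 0.0])
e2 = np.array([1.0, 5.0, 0.0])
e3 = np.array([0.0, 6.0, 1.0])
inspection_line = integrate_enhanced([e1, e2, e3])
print(inspection_line)  # [ 1. 15.  1.] -- the defect at the middle pixel adds up
```

Because every input is already an absolute value, the defect signal grows with each added view while uncorrelated noise grows more slowly, which is the noise-reduction effect described above.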
- The defect inspection imaging apparatus according to the present invention includes the above defect inspection image processing apparatus and an imaging unit that captures two-dimensional images of the inspection object continuously in time while the inspection object and the imaging unit are moved relative to each other.
- Since it includes the defect inspection image processing apparatus, this imaging apparatus can likewise detect, with sufficient accuracy, various types of defects that cause different changes in the ray path.
- The defect inspection system according to the present invention is a system for inspecting defects of an inspection object, and includes the above defect inspection imaging apparatus and moving means for moving the inspection object and the imaging unit relative to each other.
- Since it includes the defect inspection imaging apparatus, this system can detect, with sufficient accuracy, various types of defects that cause different changes in the ray path.
- Preferably, the defect inspection system includes a light source that irradiates the inspection object with light and a light shield that blocks part of the light transmitted through or reflected by the inspection object from entering the imaging unit, and inspects the inspection object for defects using the dark field method.
- The image processing apparatus for defect inspection may be realized by a computer.
- In that case, a control program that realizes the apparatus by operating the computer as each of its means, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
- As described above, according to the present invention, a plurality of data captured at different shooting angles with respect to the same portion of the inspection object can be obtained, and by referring to these data, various types of defects on the inspection object can be detected.
- FIG. 1 is a diagram showing the process of line synthesis: (a) is a conceptual diagram showing 480 images captured by the area camera, (b) shows the 480 image data (#1 to #480) arranged side by side in order from the left, and (c) shows the Nth line extracted from each of the 480 image data and arranged in sequence.
- (a) is a diagram showing an image captured by the area camera, and (b) is a diagram showing an image composed of the lines near the knife edge extracted from the 480 image data captured by the area camera.
- FIG. 4 shows (a) an example of an image captured by the area camera and (b) an example of an RT-LCI image obtained by the defect inspection system according to one embodiment of the present invention.
- FIG. is a diagram illustrating an example: (a) shows an example of an image captured by the area camera, (b) an example of a line composite image, and (c) an example of the RT-LCI image obtained by the defect inspection system according to one embodiment of the present invention.
- FIG. 15 shows arrangements of the optical system of a conventional defect inspection apparatus: (a) the regular transmission arrangement, (b) the transmission scattering arrangement, (c) the regular reflection arrangement, and (d) the reflection/scattering arrangement.
- the defect inspection system detects a defect of a molded sheet.
- The defect inspection system according to this embodiment is suitable for inspecting a light-transmitting molded sheet, particularly a molded sheet made of a resin such as a thermoplastic resin.
- Such a sheet is formed, for example, by passing the thermoplastic resin extruded from an extruder through the gaps between rolls, which imparts smoothness and gloss to the surface.
- Examples of the thermoplastic resin include methacrylic resin, methyl methacrylate-styrene copolymer, polyolefins such as polyethylene and polypropylene, polycarbonate, polyvinyl chloride, polystyrene, polyvinyl alcohol, and triacetyl cellulose resin.
- the molded sheet may consist of only one of these thermoplastic resins, or may be a laminate (laminated sheet) obtained by laminating a plurality of types of these thermoplastic resins.
- the defect inspection system according to the present embodiment is suitable for inspection of optical films such as polarizing films and retardation films, in particular, long optical films that are wound and stored and transported in a web shape.
- The molded sheet may have any thickness: it may be relatively thin, as in what is generally called a film, or relatively thick, as in what is generally called a plate.
- Defects in molded sheets include point defects such as bubbles (e.g., those formed during molding), fish eyes, foreign matter, tire marks, dents, and scratches, as well as nicks, streaks (e.g., those caused by thickness differences), and the like.
- To capture images at slightly different angles with a line sensor, one could conceivably move the line sensor within the fluctuation tolerance of its imaging line (generally about several tens to several hundreds of μm).
- In practice, however, since the optical system must be arranged with high accuracy, it is difficult to detect defects under identical conditions while gradually changing the imaging line (imaging angle) of the line sensor.
- A method of photographing a plurality of imaging lines at the same time by arranging a plurality of line sensors is also conceivable, but arranging multiple line sensors complicates the apparatus and requires the optical system to be arranged with even higher accuracy.
- FIG. 2 is a diagram showing the positional relationship of the defect inspection optical system including the area camera 5, the linear light source 4 and the knife edge 7.
- The area camera 5 is arranged above the linear light source 4 so that the center of the linear light source 4 coincides with the center of the shooting range, and the knife edge 7 is arranged so that, as seen from the area camera 5, it hides half of the linear light source 4.
- FIG. 2(b) is a view seen from the X-axis direction of FIG. 2(a).
- The area camera 5 includes a CCD (Charge Coupled Device) 51 and a lens 52. If the half angle θ of the field of view captured by the CCD 51 through the lens 52 is about ±0.1 degrees, the variation in shooting distance (1 − cos θ) is negligible over the range captured by the CCD 51.
- In this embodiment, the focal length f of the lens 52 is 35 mm, the distance between the lens 52 and the inspection object is about 300 mm, and the area camera 5 is used centered on the X axis with a resolution of 70 μm/pixel.
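The quoted optical figures are mutually consistent under a thin-lens model, as the following check shows (the sensor pixel pitch is inferred here for illustration and is not stated in the text):

```python
import math

# Figures from the text: f = 35 mm, ~300 mm lens-to-object distance,
# 70 um/pixel object-side resolution.
f_mm = 35.0
object_distance_mm = 300.0

# Thin-lens magnification (image size / object size).
magnification = f_mm / (object_distance_mm - f_mm)
pixel_pitch_um = 70.0 * magnification     # sensor pixel matching 70 um on the object
print(round(pixel_pitch_um, 1))           # 9.2 -- a typical CCD pixel pitch in um

# With a half angle of about 0.1 degrees, the shooting-distance variation
# (1 - cos(theta)) is indeed negligible:
theta = math.radians(0.1)
print(1 - math.cos(theta) < 1e-5)         # True
```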
- FIG. 3 is a schematic diagram showing an outline of the defect inspection system 1.
- the defect inspection system 1 includes a conveyor (moving means) 3, a linear light source 4, an area camera (imaging unit) 5, an image analysis device (defect inspection image processing device) 6, a display unit 30, A knife edge 7 and an illumination diffuser 8 are included.
- a molded sheet 2 that is an object to be inspected is disposed on a conveyor 3.
- While the conveyor 3 conveys the rectangular molded sheet 2 in a fixed direction, the area camera 5 continuously photographs the molded sheet 2 irradiated with light by the linear light source 4, and the image analysis device 6 detects defects in the molded sheet 2 based on the two-dimensional image data captured by the area camera 5.
- The conveyor 3 conveys the rectangular molded sheet 2 in a direction orthogonal to its thickness direction, in particular in its longitudinal direction, so that the position on the molded sheet 2 irradiated by the linear light source 4 changes.
- the conveyor 3 includes, for example, a sending roller and a receiving roller that convey the molded sheet 2 in a fixed direction, and measures the conveying speed using a rotary encoder or the like.
- the conveyance speed is set to about 2 m to 12 m / min, for example.
- the conveyance speed in the conveyor 3 is set and controlled by an information processing device (not shown) or the like.
- The linear light source 4 is arranged so that its longitudinal direction intersects the conveyance direction of the molded sheet 2 (for example, is orthogonal to it), and is placed at a position facing the area camera 5 across the molded sheet 2 so that the light emitted from the linear light source 4 passes through the molded sheet 2 and enters the area camera 5.
- The linear light source 4 is not particularly limited as long as it emits light that does not affect the composition and properties of the molded sheet 2; for example, a fluorescent lamp (especially a high-frequency fluorescent lamp), a metal halide lamp, or a halogen transmission light is used.
- Alternatively, the linear light source 4 may be arranged on the same side of the molded sheet 2 as the area camera 5, so that the light from the linear light source 4 is reflected by the molded sheet 2 and then enters the area camera 5 (see the reflection/scattering arrangement shown in FIG. 15(d)).
- Such a configuration, in which light reflected by the molded sheet 2 enters the area camera 5, can be applied not only to the molded sheet 2 but also to defect inspection of objects of various shapes and materials.
- the area camera 5 receives light transmitted through the molding sheet 2 and continuously captures a two-dimensional image of the molding sheet 2 in terms of time.
- the area camera 5 outputs data of the photographed two-dimensional image of the molded sheet 2 to the image analysis device 6.
- the area camera 5 consists of an area sensor built from an image sensor, such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor, that captures a two-dimensional image.
- the area camera 5 is not particularly limited as long as it outputs multi-gradation image data, but in the present embodiment, it can output 8-bit gray scale (256 gradations) image data.
- the resolution of the area camera 5 may be selected according to the size of the defect to be detected.
- note that the three-dimensional shape (the ratio of width to height) of a defect detected by the defect inspection system 1 basically does not depend on the resolution of the area camera 5, so there is no need to select the camera resolution according to the type of defect to be detected.
- the area camera 5 is preferably arranged so that the entire region in the width direction of the molded sheet 2 (direction orthogonal to the conveying direction of the molded sheet 2 and orthogonal to the thickness direction of the molded sheet 2) can be photographed. By photographing the entire area in the width direction of the molded sheet 2 with the area camera 5, it is possible to inspect defects in the entire area of the molded sheet 2.
- the shooting interval (frame rate) of the area camera 5 may be fixed, may be changed by the user operating the area camera 5 itself, or may be changed by the user operating an information processing device (not shown; can be omitted) connected to the area camera 5.
- the shooting interval of the area camera 5 may be a fraction of a second, like the continuous-shooting interval of a digital still camera, or the interval usually used for an industrial CCD camera; the time interval can also be shortened by minimizing the number of lines read out per frame.
- for example, there are cameras whose number of effective pixels is 512 (horizontal) x 480 (vertical) pixels and which provide 30 frames per second (hereinafter FPS) when reading all the pixels, and cameras whose number of effective pixels is about 1600 x 1200 pixels and which provide about 150 FPS.
- the number of effective pixels of the camera and the driving method can be appropriately selected depending on the conveyance speed of the inspection object, the size of the defect to be detected, and the like.
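The interplay of conveyance speed and frame rate determines how far the inspection object advances between frames. The sketch below computes this movement width; pairing the 12 m/min speed with a 60 FPS camera is a hypothetical example combining values mentioned separately in the text, and the function name is our own.

```python
# Hypothetical sketch: movement width per frame from conveyance speed and
# frame rate. The specific pairing of values is an assumption for
# illustration only.

def movement_width_mm(speed_m_per_min: float, fps: float) -> float:
    """Distance (mm) the inspection object moves during one frame interval."""
    speed_mm_per_s = speed_m_per_min * 1000.0 / 60.0
    return speed_mm_per_s / fps

# At 12 m/min and 60 FPS the sheet advances (200 mm/s) / 60 per frame, so one
# extracted line should correspond to roughly that distance on the sheet.
width = movement_width_mm(12.0, 60.0)
```

When the movement width is known, the width of one extracted line can be matched to it, which is the assumption the RT-LCI description below relies on.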
- the image analysis device 6 receives the image data output from the area camera 5 and performs image processing on the image data, thereby generating defect inspection image data for inspecting the defect of the inspection object.
- the defect inspection image data is output to the display unit 30.
- the image analysis device 6 includes a storage unit 20 that stores image data, and an image processing unit 10 that performs image processing on the image data.
- the area camera 5 and the image analysis device 6 constitute a defect inspection imaging device.
- the image analysis device 6 is not particularly limited as long as it can perform image processing of two-dimensional image data.
- examples include a PC (personal computer), an image capture board equipped with an FPGA, and a camera with a built-in processor running an image processing program (a so-called intelligent camera). Details of the image processing performed by the image analysis device 6 will be described later.
- the display unit 30 displays image data for defect inspection.
- the display unit 30 only needs to display an image or a video.
- an LC (Liquid Crystal) display panel, a plasma display panel, an EL (Electro Luminescence) display panel, or the like can be applied as the display unit 30.
- the image analysis device 6 or the defect inspection imaging device may be configured to include the display unit 30. Further, the display unit 30 may be separated from the defect inspection system 1 as an external display device, and the display unit 30 may be replaced with another output device, for example, a printing device.
- the knife edge 7 is a knife-shaped light blocking body that blocks light emitted from the linear light source 4.
- the illumination diffusing plate 8 diffuses light in order to make the amount of light emitted from the linear light source 4 uniform.
- the algorithm used in the present embodiment has been developed in view of the problem of the defect detection method using simple line synthesis described below.
- a line synthesis method for generating an image equivalent to an image captured by a plurality of line sensors arranged in parallel from a plurality of images captured by the area camera 5 will be described with reference to FIG.
- as an example, an area camera 5 having a frame rate of 60 FPS (frames per second) captures images for 8 seconds, obtaining 480 images.
- each image is equally divided into a plurality of partial images by at least one dividing line along the width direction of the molded sheet 2 (the direction orthogonal to the conveying direction of the molded sheet 2 and orthogonal to its thickness direction). Each of the partial images is referred to as a line.
- the height of the entire image (size along the longitudinal direction of the molded sheet 2) is H pixels (H is a natural number), and the width of the entire image (size along the width direction of the molded sheet 2) is W pixels (W is a natural number).
- the size of the line is a height H / L pixel (L is an integer of 2 or more) and a width W pixel.
- the line is typically a partial image of 1 pixel ⁇ W pixels arranged on a straight line along the width direction of the molded sheet 2.
- FIG. 4 (a) is a conceptual diagram showing that the 480 images are arranged in time series.
- FIG. 4B is a diagram showing a state in which the 480 pieces of image data (# 1 to # 480) are arranged side by side sequentially from the left.
- the lower dark part is a part where light is blocked by the knife edge 7, and the bright part near the center is a part where light from the linear light source 4 is transmitted.
- the dark portion at the top is a location where the light from the linear light source 4 does not reach and is out of the inspection target. Further, the inspection object is conveyed from the bottom to the top in FIG.
- one line (the red line shown in FIG. 4B: the Nth line) is extracted from the same position on the image data.
- the width of one line extracted at this time is the distance that the inspection object moves per frame (1/60 second).
- the extracted lines are arranged in order from the top so as to be a line extracted from the image data of # 1, a line extracted from the image data of # 2,..., A line extracted from the image data of # 480.
- FIG. 4C shows a state in which the Nth line is extracted from 480 pieces of image data and arranged.
- an image equivalent to the image captured by the line sensor that captures the Nth line can be generated by arranging the extracted lines and combining them into one piece of image data.
- this operation of extracting a line at the same position from a plurality of image data captured by the area camera 5, and generating an image equivalent to one captured by a line sensor imaging that line, is called line synthesis. By performing line synthesis at a plurality of positions, image data captured at a plurality of imaging angles (imaging positions) can be generated at once from the image data captured by the area camera 5.
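The line synthesis described above can be sketched in a few lines of code. This is a minimal illustration, assuming the captured frames are held in a single array of shape (num_frames, height, width); the function names are hypothetical.

```python
import numpy as np

def line_synthesize(frames: np.ndarray, row: int) -> np.ndarray:
    """Extract the line at `row` from every frame and stack the lines in
    time order (frame #1 on top), producing an image equivalent to one
    captured by a line sensor fixed at that row (cf. FIG. 4(c))."""
    return frames[:, row, :]

def line_synthesize_multi(frames: np.ndarray, rows) -> list:
    """One line-composite image per chosen row, i.e. per imaging angle."""
    return [line_synthesize(frames, r) for r in rows]
```

Choosing several rows yields several composite images at once, which is the basis of the multi-angle processing described later.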
- FIG. 5 shows an image photographed by the area camera 5 and an image obtained by line synthesis from 480 pieces of image data photographed by the area camera 5.
- FIG. 5A is an image taken by the area camera 5.
- FIG. 5A is an image similar to FIG. 4B, the dark part at the bottom is a part where light is blocked by the knife edge 7, and the bright part near the center is from the linear light source 4. It is a portion through which light is transmitted, and a dark portion at the top is a portion where the light from the linear light source 4 does not reach and is excluded from the inspection target.
- the dark part protruding from the lower end to the upper part in FIG. 5A is the shadow of the object arranged for the mark.
- FIG. 5B shows an image obtained by extracting lines near the knife edge 7 from 480 pieces of image data taken by the area camera 5 and synthesizing the lines. Specifically, it is an image obtained by extracting and synthesizing a line at a position 210 ⁇ m away from the upper end of the knife edge 7 toward the illumination side.
- the dark portion protruding from the lower end to the upper portion in FIG. 5B is the shadow of the object arranged for the mark.
- when FIG. 5B is viewed, washboard-like stripes (bank marks) can be visually recognized.
- in other words, a defect that cannot be seen in the images captured by the area camera 5 as-is can be made visible by performing line synthesis on the image data captured by the area camera 5.
- FIG. 6A shows an original image obtained by line synthesis, which is the same as the image shown in FIG.
- FIG. 6B is an image obtained by performing 7 ⁇ 7 vertical differential filter processing on the image shown in FIG.
- FIG. 6C is an image obtained by binarizing the image shown in FIG. 6B according to a fixed threshold using the Laplacian histogram method. In this way, by performing image processing on the line-combined image, it is possible to identify defects more prominently.
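A sketch of this kind of post-processing is shown below. The 7x7 vertical differential coefficients and the fixed threshold value are assumptions for illustration; the patent does not list the exact kernel, and the threshold would normally be derived, for example, by the Laplacian histogram method it mentions.

```python
import numpy as np

def correlate2d(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2-D correlation using plain NumPy (no SciPy dependency)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * img[i:i + h - kh + 1, j:j + w - kw + 1]
    return out

# A 7x7 vertical differential kernel of the Prewitt type; the exact
# coefficients used in the patent are not given, so these are assumed.
kernel = np.zeros((7, 7))
kernel[:3, :] = -1.0   # rows above the centre
kernel[4:, :] = 1.0    # rows below the centre

def emphasize_and_binarize(img: np.ndarray, threshold: float) -> np.ndarray:
    """Vertical differential filtering followed by fixed-threshold
    binarization; pixels whose response magnitude reaches the threshold
    become 1, all others 0."""
    diff = correlate2d(img.astype(float), kernel)
    return (np.abs(diff) >= threshold).astype(np.uint8)
```

A uniform region produces zero response and binarizes to all zeros, while a horizontal luminance step (such as a bank mark edge) produces a strong response, which is why the stripes stand out in FIG. 6C.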
- an image in which the defect is best seen is selected from a plurality of images obtained by line synthesis of image data captured by the area camera 5. Therefore, the defect inspection can be easily performed and the efficiency of the defect inspection is improved. Further, image processing is performed on each line composite image corresponding to a plurality of generated shooting angles, and an image in which a defect appears remarkably is selected from each line composite image subjected to image processing and referenced. As a result, various defects of the inspection object can be clearly recognized.
- RT-LCI: Real-Time Line Composition and Integration
- each part of the image analysis apparatus 6 that performs RT-LCI will be described with reference to FIG.
- the number of types of photographing angles used when performing RT-LCI image processing is k (k is an integer of 2 or more).
- the number of lines of the differential operator described later is m (m is a natural number). Note that the number k of shooting angles and the number m of differential operator rows can be set arbitrarily and are determined in advance.
- further, the distance that the inspection object 2 moves between the time the area camera 5 captures one image and the time it captures the next is defined as the movement width; the actual distance (distance on the surface of the inspection object 2) represented by the width of the line data (one line of partial image data) extracted by the data extraction unit described later is assumed to be the same as the movement width.
- the number of differential operator columns may be two or more, but is assumed to be one here.
- FIG. 1 is a functional block diagram showing the configuration of the main part of the image analysis device 6.
- the image analysis device 6 includes the image processing unit 10 and the storage unit 20.
- the image processing unit 10 includes a data extraction unit (same line extraction unit) 11, a first section determination unit 12, a data storage unit (line synthesis unit) 13, an all-section determination unit 14, a change amount calculation unit (operator calculation unit) 15, a same location determination extraction unit 16, an integration unit 17, and an image generation unit 18.
- the storage unit 20 includes a first storage unit 21, a second storage unit 22, a third storage unit 23, and a fourth storage unit 24.
- the second storage unit 22 includes a first area 221, a second area 222, ..., and a kth area 22k.
- the first region 221 to the kth region 22k are each divided into m sections.
- the data extraction unit 11 includes a first extraction unit 111, a second extraction unit 112,..., A kth extraction unit 11k.
- the first extraction unit 111 extracts line data at a predetermined position on the image data (for example, line data at the bottom) from the image data stored in the first storage unit 21.
- the line data at a predetermined position extracted by the first extraction unit 111 is set as the first line data.
- the second extraction unit 112 extracts the line data (second line data) adjacent, on the moving-direction side of the inspection object 2, to the line data at the predetermined position on the image data.
- similarly, the kth extraction unit extracts the kth line data counted from the line data at the predetermined position on the image data toward the moving direction of the inspection object 2.
- in other words, for each of a plurality of positions, the data extraction unit 11 extracts one line of line data at the same position on the image data from a plurality of different image data.
- the first section determination unit 12 includes a first determination unit 121, a second determination unit 122, ..., and a kth determination unit 12k.
- the first determination unit 121 determines whether or not line data is already stored in the first section of the first area 221 in the second storage unit 22.
- the second determination unit 122 determines whether line data has already been stored in the first section of the second area 222 in the second storage unit 22.
- the kth determination unit 12k determines whether or not line data is already stored in the first section of the kth region 22k in the second storage unit 22.
- the data storage unit 13 includes a first storage unit 131, a second storage unit 132,..., A kth storage unit 13k.
- when the first determination unit 121 determines that no line data is stored, the first storage unit 131 stores the line data extracted by the first extraction unit 111 in the first section of the first area 221.
- when the first determination unit 121 determines that line data is already stored, the first storage unit 131 shifts the storage location of the data stored in each section of the first region 221 up by one section: the line data stored in the first section is moved to the second section, ..., and the line data stored in the (m-1)th section is moved to the mth section. The line data that was stored in the mth section is discarded or moved to a backup location (not shown).
- the first storage unit 131 moves the storage location of the line data stored in each section, and then stores the line data extracted by the first extraction unit 111 in the first section of the first region 221. To do.
- as the first extraction unit 111 extracts line data at the predetermined position from successive image data, the first storage unit 131 stores the extracted line data in successive sections of the first region 221, thereby combining the plurality of extracted line data into one piece of line composite image data.
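The shift-and-store behavior of each region can be sketched with a bounded double-ended queue. `LineRegion` is a hypothetical name, and using `deque(maxlen=m)` to discard the line leaving the mth section is an implementation choice for illustration, not the patent's implementation.

```python
from collections import deque

class LineRegion:
    """One region of the second storage unit: at most m sections, where
    storing a new line shifts every stored line up by one section and
    drops the line that leaves the mth section."""

    def __init__(self, m: int):
        self.sections = deque(maxlen=m)   # index 0 = first (newest) section

    def store(self, line):
        # appendleft shifts existing lines toward the mth section; when the
        # deque is full, the oldest line is discarded automatically.
        self.sections.appendleft(line)

    def full(self) -> bool:
        return len(self.sections) == self.sections.maxlen

    def composite(self) -> list:
        """Line composite image: mth (oldest) section first, newest last."""
        return list(reversed(self.sections))
```

Once `full()` returns True, the region holds a complete m-line composite image ready for the operator calculation described next.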
- similarly, based on the determination of the second determination unit 122 of the first section determination unit 12, the second storage unit 132 stores the line data extracted by the second extraction unit 112 in the first section of the second area 222. By storing successively extracted line data in successive sections of the second region 222, the second storage unit 132 combines the plurality of extracted line data into one piece of line composite image data.
- likewise, based on the determination of the kth determination unit 12k of the first section determination unit 12, the kth storage unit 13k stores the line data extracted by the kth extraction unit 11k in the first section of the kth region 22k. By storing successively extracted line data in successive sections of the kth region 22k, the kth storage unit 13k combines the plurality of extracted line data into one piece of line composite image data.
- the data storage unit 13 generates line composite image data of a plurality of lines by arranging the line data extracted by the data extraction unit 11 in time series, and the line data extracted by the data extraction unit 11 For each position on the image data, a plurality of different line combined image data are generated in time series.
- the all-section determination unit 14 includes a first determination unit 141, a second determination unit 142, ..., and a kth determination unit 14k.
- the first determination unit 141 determines whether or not line data is stored in all the sections (first to mth sections) of the first area 221.
- the second determination unit 142 determines whether or not line data is stored in all sections (first to mth sections) of the second area 222.
- the kth determination unit 14k determines whether or not line data is stored in all the sections (first to mth sections) of the kth area 22k.
- the change amount calculation unit 15 includes a first calculation unit 151, a second calculation unit 152,..., A kth calculation unit 15k.
- when the first determination unit 141 of the all-section determination unit 14 determines that line data is stored in all the sections, the first calculation unit 151 performs a differential operator operation on the line composite image data composed of the plurality of line data stored in the first region 221, and stores the resulting enhanced image data (image data of one line or a plurality of lines) in the third storage unit 23.
- similarly, when the second determination unit 142 of the all-section determination unit 14 determines that line data is stored in all the sections, the second calculation unit 152 performs a differential operator operation on the line composite image data composed of the plurality of line data stored in the second region 222, and stores the resulting enhanced image data (image data of one line or a plurality of lines) in the third storage unit 23.
- when the kth determination unit 14k of the all-section determination unit 14 determines that line data is stored in all the sections, the kth calculation unit 15k performs a differential operator operation on the line composite image data composed of the plurality of line data stored in the kth region 22k, and stores the resulting enhanced image data (image data of one line or a plurality of lines) in the third storage unit 23.
- the details of the calculation processing performed by the change amount calculation unit 15 will be described later.
- in other words, the change amount calculation unit 15 generates one line or a plurality of lines of enhanced image data by applying, to each of the plurality of line composite image data, an operator that emphasizes luminance changes.
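For an m-row, 1-column operator, the calculation each calculation unit performs can be sketched as a weighted sum down the time axis of the line composite image. The coefficients shown are an assumed simple first-difference operator; the patent does not fix particular values here, and the function name is hypothetical.

```python
import numpy as np

def apply_operator(composite: np.ndarray, operator: np.ndarray) -> np.ndarray:
    """Apply an m-row, 1-column differential operator to line composite
    image data of shape (m, width), producing one line of enhanced image
    data (one value per image column)."""
    m, _ = composite.shape
    assert operator.shape == (m,)
    return operator @ composite   # weighted sum down each column

# Example with an assumed first-difference operator (m = 2): the output
# emphasizes the luminance change between consecutive lines.
enhanced = apply_operator(np.array([[1.0, 2.0], [4.0, 6.0]]),
                          np.array([-1.0, 1.0]))
```

A uniform composite produces a zero output line, so only locations where the luminance changes between consecutive lines survive into the enhanced image data.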
- the same location determination extraction unit 16 determines whether or not the enhanced image data of all imaging angles (k types) indicating the same location of the inspection object 2 is stored in the third storage unit 23. When it is determined that the enhanced image data of all photographing angles indicating the same location is stored, the same location determination extraction unit 16 extracts the k types of enhanced image data.
- the accumulating unit 17 accumulates, for each pixel, the luminance values of the k types of enhanced image data indicating the same location of the inspection object 2 extracted by the same location determination extraction unit 16, and generates one line or a plurality of lines of defect inspection image data (RT-LCI data).
- the accumulating unit 17 stores the position of the inspection object 2 indicated by the accumulated k types of emphasized image data in the fourth storage unit 24 in association with the defect inspection image data obtained as a result of the accumulation.
- based on the position of the inspection object 2 associated with each defect inspection image data stored in the fourth storage unit 24, the image generation unit 18 arranges the defect inspection image data in accordance with the positional relationship on the inspection object 2, synthesizes them into new defect inspection image data (RT-LCI data), and displays the result on the display unit 30.
- FIG. 7 is a diagram illustrating an operation flow of each unit of the image analysis device 6 in the RT-LCI process.
- the first extraction unit 111 extracts line data at a predetermined position (for example, the first line data from the bottom) from the image data stored in the first storage unit 21 (S41).
- the first extraction unit 111 associates the position of the inspection object 2 indicated by the extracted line data with the line data. For example, when the movement width and the actual distance indicated by the line data are the same, the first extraction unit 111 appends "pi" to the extracted line data as a symbol indicating the position of the inspection object 2 (where i is the frame number).
- the line data at a predetermined position is arbitrarily set in advance, and it is determined from which line the data is extracted.
- similarly, the kth extraction unit 11k extracts the kth line data counted from the line data at the predetermined position extracted by the first extraction unit 111 toward the moving direction of the inspection object 2 (S4k).
- the kth extraction unit 11k associates the position of the inspection object 2 indicated by the extracted line data with the line data. For example, when the movement width and the actual distance indicated by the line data are the same, the kth extraction unit 11k appends "p(i-k+1)" to the extracted line data as a symbol indicating the position of the inspection object 2. If i < k (NO in S3k), the process proceeds to S140.
- first, the first determination unit 121 of the first section determination unit 12 determines whether or not line data is already stored in the first section of the first region 221 of the second storage unit 22 (S51).
- when the first determination unit 121 determines that line data is already stored (YES in S51), the first storage unit 131 shifts the storage location of the line data stored in each section of the first region 221 up by one section (S61).
- the first storage unit 131 moves the storage location of the line data stored in each section, and then stores the line data extracted by the first extraction unit 111 in the first section of the first region 221. (S71).
- when the first determination unit 121 determines that there is no line data in the first section of the first region 221 (NO in S51), the first storage unit 131 stores the line data extracted by the first extraction unit 111 in the first section of the first area 221 (S71).
- likewise, the second determination unit 122 of the first section determination unit 12 determines whether or not line data is already stored in the first section of the second region 222 of the second storage unit 22 (S52).
- when the second determination unit 122 determines that there is line data in the first section of the second region 222 (YES in S52), the second storage unit 132 shifts the storage location of the line data stored in each section of the second region 222 up by one section (S62).
- the second storage unit 132 moves the storage location of the line data stored in each section, and then stores the line data extracted by the second extraction unit 112 in the first section of the second region 222. (S72).
- when the second determination unit 122 determines that there is no line data in the first section of the second region 222 (NO in S52), the second storage unit 132 stores the line data extracted by the second extraction unit 112 in the first section of the second area 222 (S72).
- likewise, the kth determination unit 12k of the first section determination unit 12 determines whether or not line data is already stored in the first section of the kth region 22k of the second storage unit 22 (S5k).
- when the kth determination unit 12k determines that line data is already stored (YES in S5k), the kth storage unit 13k shifts the storage location of the line data stored in each section of the kth area 22k up by one section (S6k).
- the kth storage unit 13k moves the storage location of the line data stored in each section, and then stores the line data extracted by the kth extraction unit 11k in the first section of the kth region 22k. (S7k).
- when the kth determination unit 12k determines that there is no line data in the first section of the kth region 22k (NO in S5k), the kth storage unit 13k stores the line data extracted by the kth extraction unit 11k in the first section of the kth area 22k (S7k).
- the first determination unit 141 of the all section determination unit 14 determines whether or not line data is stored in all sections of the first region 221 (S81).
- when the first determination unit 141 of the all-section determination unit 14 determines that line data is stored in all sections (YES in S81), the first calculation unit 151 performs a differential operator operation on the line composite image data composed of the plurality of line data stored in the first area 221, and stores the resulting enhanced image data in the third storage unit 23 (S91).
- at this time, a symbol indicating the position of the inspection object 2 associated with the line data stored in the mth section of the first region 221 is appended to the enhanced image data resulting from the differential operator calculation.
- when the first determination unit 141 determines that line data is not stored in all the sections (NO in S81), the process proceeds to S140.
- next, the second determination unit 142 of the all-section determination unit 14 determines whether or not line data is stored in all sections of the second region 222 (S82).
- when the second determination unit 142 of the all-section determination unit 14 determines that line data is stored in all sections (YES in S82), the second calculation unit 152 performs a differential operator operation on the line composite image data composed of the plurality of line data stored in the second area 222, and stores the resulting enhanced image data in the third storage unit 23 (S92).
- at this time, a symbol indicating the position of the inspection object 2 associated with the line data stored in the mth section of the second region 222 is appended to the enhanced image data resulting from the differential operator calculation.
- when the second determination unit 142 determines that line data is not stored in all the sections (NO in S82), the process proceeds to S140.
- likewise, the kth determination unit 14k of the all-section determination unit 14 determines whether or not line data is stored in all the sections of the kth region 22k (S8k).
- when the kth determination unit 14k determines that line data is stored in all sections (YES in S8k), the kth calculation unit 15k performs a differential operator operation on the line composite image data composed of the plurality of line data stored in the kth area 22k, and stores the resulting enhanced image data in the third storage unit 23 (S9k).
- next, the same location determination extraction unit 16 refers to the symbols indicating the position of the inspection object 2 associated with the enhanced image data stored in the third storage unit 23, and determines whether or not enhanced image data of all imaging angles (k types) indicating the same location of the inspection object 2 is stored (S100). When the same location determination extraction unit 16 determines that enhanced image data of all imaging angles indicating the same location is not stored (NO in S100), the process proceeds to S140. On the other hand, when it determines that such enhanced image data is stored (YES in S100), it extracts the k types of enhanced image data covering all imaging angles.
- the integrating unit 17 integrates the luminance values of the k types of emphasized image data extracted by the same location determination extracting unit 16 for each pixel (S110).
- the accumulating unit 17 associates the symbol indicating the position of the inspection object 2, which was associated with the accumulated k types of enhanced image data, with the defect inspection image data (RT-LCI data) obtained as a result of accumulation, and stores them in the fourth storage unit 24.
- based on the symbol indicating the position of the inspection object 2 associated with each RT-LCI data stored in the fourth storage unit, the image generation unit 18 arranges each RT-LCI data in accordance with the positional relationship on the inspection object 2 and synthesizes new defect inspection image data (RT-LCI data) (S120).
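Putting steps S41 to S120 together, the flow above can be sketched as a single loop over incoming frames. This is a simplified, single-process sketch under stated assumptions (lines are one pixel high, the movement width equals one pixel, and the operator has m rows and 1 column); the function name and bookkeeping structures are hypothetical, not the patent's implementation.

```python
import numpy as np
from collections import deque, defaultdict

def rt_lci(frames: np.ndarray, k: int, operator: np.ndarray,
           base_row: int = 0) -> np.ndarray:
    """Sketch of the RT-LCI flow of FIG. 7.  `frames` has shape
    (num_frames, height, width); rows base_row .. base_row+k-1 of each
    frame play the role of the k imaging angles A, B, ..."""
    m = len(operator)
    regions = [deque(maxlen=m) for _ in range(k)]  # k regions of m sections
    enhanced = defaultdict(dict)                   # part position -> {angle: line}
    out_rows = {}
    for i, frame in enumerate(frames, start=1):    # frame number i
        for a in range(k):                         # angle a images part p(i - a)
            part = i - a
            if part < 1:                           # part not yet reached (cf. S3k)
                continue
            regions[a].appendleft(frame[base_row + a].astype(float))  # S6x/S7x
            if len(regions[a]) == m:               # all sections filled (S8x)
                comp = np.stack(list(reversed(regions[a])))
                line = operator @ comp             # differential operator (S9x)
                enhanced[part - m + 1][a] = line   # tag with mth-section position
        for part in [p for p, d in enhanced.items() if len(d) == k]:
            out_rows[part] = sum(enhanced.pop(part).values())  # S100/S110
    # S120: arrange the RT-LCI lines by position on the inspection object
    return np.stack([out_rows[p] for p in sorted(out_rows)])
```

Because each enhanced line is tagged with the part position it describes, the integration step can wait until all k angles for that part have arrived, which mirrors the role of the same location determination extraction unit 16.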
- FIG. 8A is a state transition diagram showing the image data stored in each storage unit and the state of the image displayed on the display unit 30 for each frame.
- the number of pixels of the area camera 5 is n pixels wide (the size in the direction orthogonal to the moving direction of the inspection object 2; n is an integer of 2 or more) x 9 pixels high (the size along the moving direction of the inspection object 2).
- the width of one line is one pixel. That is, the number of pixels in one line is n pixels ⁇ 1 pixel.
- the distance (movement width) by which the inspection object 2 moves by the conveyor 3 per frame and the actual distance indicated by the width of one line are the same. That is, it is assumed that the actual distance (resolution of one pixel) indicated by one pixel is the same as the movement width.
- Reference numeral 400 shown in FIG. 8b denotes the inspection object 2; the symbols (p1, p2, ...) written on 400 indicate the parts of the inspection object 2, and each part (p1, p2, ...) is divided for each movement width.
- the inspection object 2 is conveyed by the conveyor 3 from the lower side to the upper side of the imaging range 401 of the area camera 5.
- Image data 410 shown in FIG. 8 a indicates image data stored in the first storage unit 21. That is, the image data as it is taken by the area camera 5 is used.
- the image data 301 to 310 have the number of pixels of n pixels ⁇ 9 pixels and include nine line data.
- the image data 301 to 310 are divided into nine line data, which are referred to as a first line, a second line,..., A ninth line in order from the bottom of the image data 301 to 310.
- line data is extracted from the first line, the second line, and the third line.
- the line data at the predetermined position described above is set as the first line data (bottom line data).
- the first line is set as the shooting angle A
- the second line is set as the shooting angle B
- the third line is set as the shooting angle C.
- for convenience of explanation, each of the image data 301 to 310 is annotated with the part (p1, p2, ...) of the inspection object 2 corresponding to each line data. For example, "p1-A" indicates the line data of imaging angle A, that is, the first line, corresponding to the part p1 of the inspection object 2.
- reference numeral 420 shown in FIG. 8a indicates line data of 1 line or more and m lines or less (5 lines in this example) stored in the second storage unit 22; 421 indicates data stored in the first area 221, 422 indicates data stored in the second area 222, and 423 indicates data stored in the third area 223. That is, the line composite image data 311 to 320 consist of 1 to m lines (5 lines in this example) of line data stored in the first area 221, the line composite image data 321 to 330 consist of 1 to m lines of line data stored in the second area 222, and the line composite image data 331 to 340 consist of 1 to m lines of line data stored in the third area 223.
- Each of the line composite image data 311 to 340 is composed of one to five line data; in order from the bottom of each line composite image data 311 to 340, these are the line data stored in the first section, the second section, ..., and the fifth section of each region.
- each line composite image data 311 to 340 includes a part (p1, p2,...) And an imaging angle (A to C) of the inspection object 2 corresponding to each line data. It is appended.
- Reference numeral 430 illustrated in FIG. 8 a indicates the emphasized image data stored in the third storage unit 23.
- for convenience of explanation, the corresponding parts (p3, p4, ...) of the inspection object 2 and the imaging angles (A to C) are appended to each piece of the enhanced image data 430.
- reference numeral 440 in FIG. 8a indicates the defect inspection image data (RT-LCI data) stored in the fourth storage unit 24.
- reference numeral 450 in FIG. 8a indicates the image currently displayed on the display unit 30.
- for convenience of explanation, the RT-LCI data 361 to 370 and the images 381 to 384 are also labeled with the corresponding parts (p3, p4, ...) of the inspection object 2 for each line.
- FIG. 8c is a diagram showing the process of generating the first RT-LCI data.
- the vertical axis of FIG. 8c represents time (time in frame units).
- the data shown in FIG. 8c are the same as those shown in FIG. 8a; a1, a2, ..., am shown in FIG. 8c denote the elements of the first, second, ..., m-th rows of the differential operator.
- RT-LCI processing
- no data is stored in each storage unit at the start of RT-LCI processing.
- the RT-LCI processing is started when the upper end portion p1 of the inspection object 2 enters the imaging range 401 of the area camera 5.
- the first extraction unit 111 extracts the line data p1-A from the first line (the predetermined line) of the image data 301, and the first storage unit 131 stores the line data p1-A extracted by the first extraction unit 111 in the first section of the first area 221.
- the image data taken by the area camera 5 at this time is image data 302.
- the first extraction unit 111 extracts line data p2-A from the first line of the image data 302.
- the first storage unit 131 moves the line data p1-A (the line data of the first line associated with part p1) stored in the first section of the first area 221 to the second section of the first area 221, and stores the line data p2-A (the line data associated with part p2) extracted by the first extraction unit 111 in the first section of the first area 221.
- the second extraction unit 112 extracts the line data p1-B from the second line of the image data 302, and the second storage unit 132 stores the line data p1-B extracted by the second extraction unit 112 in the first section of the second area 222. Here too, since line data is not yet present in all sections of the first area 221 and the second area 222, processing waits for the next frame.
- Image data taken by the area camera 5 at this time is image data 303.
- the first extraction unit 111 and the second extraction unit 112 extract line data p3-A and p2-B from the first line and the second line of the image data 303, respectively.
- the first storage unit 131 and the second storage unit 132 move the line data stored in the first area 221 and the second area 222 upward by one section, respectively, and store the line data p3-A and p2-B in the first sections of the first area 221 and the second area 222.
- the third extraction unit 113 extracts the line data p1-C from the third line of the image data 303, and the third storage unit 133 stores the line data p1-C extracted by the third extraction unit 113 in the first section of the third area 223. Here too, since line data is not yet present in all sections of the first area 221, the second area 222, and the third area 223, processing waits for the next frame.
- the first to third extraction units 111 to 113 extract line data p4-A, p3-B, p2-C and line data p5-A, p4-B, p3-C from the first to third lines of the image data 304 and 305, respectively. The first to third storage units 131 to 133 move the line data already stored in the first to third areas 221 to 223 upward by one section and store line data p4-A, p3-B, p2-C and line data p5-A, p4-B, p3-C in the first sections of the first to third areas 221 to 223.
- the line composite image data 315 stored in the first area 221 is line composite image data obtained by combining the first lines (imaging angle A) of the image data 301 to 305 of the first to fifth frames. The first determination unit 141 of the all-section determination unit 14 determines that line data is present in all sections of the first area 221, and the first calculation unit 151 performs a differential operator calculation (arithmetic processing applying a differential filter) on the line composite image data 315, thereby generating enhanced image data 341 representing the absolute value of the luminance gradient in the center line data of the line composite image data 315, that is, the line data p3-A stored in the third section of the first area, and stores it in the third storage unit 23.
- the first calculation unit 151 appends, to the enhanced image data resulting from the differential operator calculation, the symbol p3 indicating the part of the inspection object 2 associated with the line data p3-A stored in the third section of the first area and the symbol A indicating the imaging angle corresponding to the line composite image data 315.
- the same location determination extraction unit 16 determines whether enhanced image data of all imaging angles (three types) indicating the same location (part) of the inspection object 2 is stored in the third storage unit 23. Since the third storage unit 23 contains only the enhanced image data 341, processing waits for the next frame.
- the first to third extraction units 111 to 113 extract line data p6-A, p5-B, and p4-C from the first to third lines of the image data 306, respectively.
- the first to third storage units 131 to 133 move the line data already stored in the first to third areas 221 to 223 upward by one section and store the line data p6-A, p5-B, and p4-C in the first sections of the first to third areas 221 to 223.
- the line composite image data 316 and 326 stored in the first area 221 and the second area 222 are line composite image data obtained by combining the first lines (imaging angle A) and the second lines (imaging angle B), respectively, of the image data 302 to 306 of the second to sixth frames.
- the first to third extraction units 111 to 113 extract line data p7-A, p6-B, and p5-C from the first to third lines of the image data 307, respectively.
- the first to third storage units 131 to 133 move the line data already stored in the first to third areas 221 to 223 upward by one section and store the line data p7-A, p6-B, and p5-C in the first sections of the first to third areas 221 to 223. Accordingly, the line composite image data 317, 327, and 337 stored in the first to third areas 221 to 223 are line composite image data obtained by combining the first lines (imaging angle A), the second lines (imaging angle B), and the third lines (imaging angle C), respectively, of the image data 303 to 307 of the third to seventh frames. Since line data is now stored in all sections of the first area 221, the second area 222, and the third area 223, the first calculation unit 151 performs a differential operator calculation on the line composite image data 317 to generate enhanced image data 350 representing the absolute value of the luminance gradient in the line data p5-A, the second calculation unit 152 performs a differential operator calculation on the line composite image data 327 to generate enhanced image data 349 representing the absolute value of the luminance gradient in the line data p4-B, and the third calculation unit 153 performs a differential operator calculation on the line composite image data 337 to generate enhanced image data 347 representing the absolute value of the luminance gradient in the line data p3-C. The generated enhanced image data 350, 349, and 347 are stored in the third storage unit 23.
- enhanced image data of all imaging angles (the three types A, B, and C) indicating the same part p3 of the inspection object 2 (enhanced image data 345 to 347) is now stored in the third storage unit 23.
- therefore, the integrating unit 17 integrates the luminance values of the enhanced image data 345, 346, and 347 to generate the RT-LCI data 361 of part p3 and stores it in the fourth storage unit 24. The image generation unit 18 then outputs the RT-LCI data 361 of part p3 to the display unit 30 as new defect inspection image data, and the display unit 30 displays the defect inspection image 381.
- the generation processing of the RT-LCI data 361 will be described again with reference to FIG.
- at frame number i = 5 (= m), a differential operator of 5 (m) rows by 1 column is applied to the line composite image data 315 obtained by combining the same lines of the image data 301 to 305 of the 5 (m) frames, generating enhanced image data 341 representing the absolute value of the luminance gradient in the line data p3-A (which becomes enhanced image data 342, 345, and so on in subsequent frames).
- similarly, a 5-row by 1-column differential operator calculation is performed on the line composite image data 337 obtained by combining the same lines of the image data 303 to 307 of 5 frames, generating enhanced image data 347 representing the absolute value of the luminance gradient in the line data p3-C.
- the RT-LCI data 361 of part p3 is generated by integrating the luminance values of the enhanced image data 345 to 347 of the same observed position generated over the 3 (= k) frames.
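The generation of one line of RT-LCI data just described — apply the m-row differential operator to each of the k line composite images, take the absolute value, then integrate the k enhanced lines observing the same part — can be sketched as follows. This is a minimal illustration, not the patented implementation: the operator (−2, −1, 0, 1, 2)^T, the array sizes, and the function names `enhance` and `rt_lci_for_part` are all assumptions.

```python
import numpy as np

M, K, W = 5, 3, 4   # m lines per composite, k imaging angles, pixels per line

# Hypothetical 5-row by 1-column differential operator (cf. matrix 462).
diff_op = np.array([-2, -1, 0, 1, 2])

def enhance(line_composite):
    """Apply the m-row differential operator down each pixel column of an
    m x W line composite image, returning one line of enhanced image data:
    the absolute value of the vertical luminance gradient."""
    return np.abs(diff_op @ line_composite)

def rt_lci_for_part(composites):
    """Integrate, pixel by pixel, the enhanced lines of the k imaging
    angles observing the same part into one line of RT-LCI data."""
    return np.sum([enhance(c) for c in composites], axis=0)

# Three 5x4 line composites (angles A, B, C) observing the same part,
# with a small luminance disturbance on one line of the angle-A composite.
composites = [np.full((M, W), 100.0) for _ in range(K)]
composites[0][3, 1] += 10.0
rt_lci_line = rt_lci_for_part(composites)   # nonzero only at the defect pixel
```

A nonzero value appears only at pixels where the luminance gradient is nonzero, i.e. at candidate defects.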
- the first to third extraction units 111 to 113 extract line data p8-A, p7-B, and p6-C from the first to third lines of the image data 308, respectively.
- the first to third storage units 131 to 133 move the line data already stored in the first to third areas 221 to 223 upward by one section and store the line data p8-A, p7-B, and p6-C in the first sections of the first to third areas 221 to 223.
- the line composite image data 318, 328, and 338 stored in the first to third areas 221 to 223 are line composite image data obtained by combining the first lines (imaging angle A), the second lines (imaging angle B), and the third lines (imaging angle C), respectively, of the image data 304 to 308 of the fourth to eighth frames. The first calculation unit 151 performs a differential operator calculation on the line composite image data 318 to generate enhanced image data 356 representing the absolute value of the luminance gradient in the line data p6-A, the second calculation unit 152 performs a differential operator calculation on the line composite image data 328 to generate enhanced image data 355 representing the absolute value of the luminance gradient in the line data p5-B, and the third calculation unit 153 performs a differential operator calculation on the line composite image data 338 to generate enhanced image data 353 representing the absolute value of the luminance gradient in the line data p4-C. The generated enhanced image data 356, 355, and 353 are stored in the third storage unit 23.
- enhanced image data of all imaging angles (the three types A, B, and C) indicating the same part p4 of the inspection object 2 (enhanced image data 351 to 353) is now stored in the third storage unit 23. Therefore, the integrating unit 17 integrates the luminance values of the enhanced image data 351 to 353 to generate the RT-LCI data 363 of part p4 and stores it in the fourth storage unit 24.
- the image generation unit 18 arranges the RT-LCI data 362 and 363 from the top so that they correspond to the positional relationship on the inspection object 2, combines them into new defect inspection image data, and outputs it to the display unit 30. The display unit 30 displays the defect inspection image 382 of parts p3 and p4.
- similarly, defect inspection image data obtained by arranging and combining the RT-LCI data of parts p3, p4, and p5 is output to the display unit 30, and the defect inspection image of parts p3, p4, and p5 is displayed.
- the defect inspection image data obtained by arranging and combining the RT-LCI data of parts p3, p4, p5, and p6 is then output to the display unit 30, and the defect inspection image of parts p3, p4, p5, and p6 is displayed.
- at frame number i = 480, the defect inspection image data obtained by arranging and combining the RT-LCI data of parts p3 to p476 is output to the display unit 30, and the defect inspection image of parts p3 to p476 is displayed on the display unit 30. Defect inspection image data can therefore be obtained for most of the inspection object 2 (474 of 478 parts). If only parts p3 to p476 of the inspection object 2 are the inspection target, defect inspection image data for the entire inspection target can be obtained.
- FIG. 9 shows an example of the line composite image data composed of a plurality of line data stored in the second storage unit 22, the differential operator used by the change amount calculation unit 15, and the values of the enhanced image data calculated by the change amount calculation unit 15.
- a matrix 461 shown in FIG. 9 is a 5-row by 4-column matrix whose elements are the luminance values of all pixels of line composite image data composed of five line data stored in the second storage unit 22 (such as the line composite image data 315, 326, and 337 shown in FIGS. 8a and 8c).
- a matrix 462 illustrated in FIG. 9 is a 5 ⁇ 1 differential operator (differential filter) used by the change amount calculation unit 15.
- a matrix 463 shown in FIG. 9 is a 5-row by 4-column matrix whose elements are the vertical luminance gradients calculated by the change amount calculation unit 15.
- a matrix 464 shown in FIG. 9 is the absolute value of the matrix 463, and indicates the luminance values of all the pixels of the enhanced image data 341 to 356 shown in FIG. 8a.
- the change amount calculation unit 15 calculates the vertical luminance gradient of the center line data of the line composite image data by applying the differential operator to the line composite image data composed of the five line data, and generates the enhanced image data. By calculating the gradient of the luminance values of the line composite image data, minute defects, thin defects, and light defects on the inspection object 2 become easier to detect.
- when the luminance gradient calculated by the change amount calculation unit 15 is not 0, it indicates that a defect may exist at that part of the inspection object 2 (however, when the luminance gradient is close to 0, it may be noise rather than a defect).
- when the inspection object 2 has a defect, the defect is photographed as dark in the region where light from the linear light source 4 is incident without being blocked by the knife edge 7, and as bright in the region where light from the linear light source 4 is blocked by the knife edge 7 and is not directly incident. Therefore, the sign of the luminance gradient does not affect the presence or absence of a defect; the magnitude of the luminance gradient is what matters when determining the presence or absence of a defect.
- the change amount calculation unit 15 therefore generates the matrix 464 by taking the absolute value of the luminance gradient matrix 463, as shown in FIG. 9. In other words, the change amount calculation unit 15 replaces the luminance value of each pixel in the center line of the line composite image data with the absolute value of the vertical luminance gradient at that pixel, generating one new line of enhanced image data. By using the absolute value of the luminance gradient in this way, a defect appearing on the bright side and a defect appearing on the dark side contribute positively in the same way, making the presence or absence of a defect easier to identify.
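The point about signs can be checked numerically. In this sketch (illustrative only; the 5-row by 1-column operator matches matrix 462 of FIG. 9, and `center_line_gradient` is a hypothetical name), the same defect imaged on the bright side and on the dark side produces gradients of opposite sign but equal absolute value:

```python
import numpy as np

# 5-row by 1-column differential operator, as in matrix 462 of FIG. 9.
diff_op = np.array([-2, -1, 0, 1, 2])

def center_line_gradient(composite):
    # Vertical luminance gradient at the center line of a 5-row composite.
    return diff_op @ composite

flat = np.full((5, 1), 100.0)               # defect-free: gradient is 0
bright = flat.copy()
bright[3, 0] += 20.0                        # defect imaged on the bright side
dark = flat.copy()
dark[3, 0] -= 20.0                          # same defect imaged on the dark side

g_bright = center_line_gradient(bright)     # positive gradient (+20)
g_dark = center_line_gradient(dark)         # negative gradient (-20)

# The signs differ, but the absolute values are equal, so both defects
# contribute equally to the enhanced image data (matrix 464).
```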
- in positioning the optical system, it is sufficient to set the imaging range of the area camera 5 so as to straddle the knife edge 7. Furthermore, when positioning the knife edge 7, inspection can be performed with high accuracy without strictly aligning it with the horizontal direction of the area camera 5. That is, the optical system of the dark field method need not be arranged with high accuracy compared with the conventional case, which simplifies setup. For this reason, the inspection capability of the defect inspection system and the maintainability of its apparatus, particularly the optical system, are improved.
- in the conventional transmission method, when the inspection object 2 is relatively thick or warped, light irradiated from the linear light source 4 may be refracted when passing through the inspection object 2 and be incident off the optical axis of the area camera 5. Likewise, in the conventional reflection/scattering method, as shown in FIG. 10C, when the inspection object 2 is relatively thick or warped, light irradiated from the linear light source 4 may, on reflection from the inspection object 2, be reflected at an angle different from the reflection angle intended in the design of the optical system and be incident off the optical axis of the area camera 5. When light from the linear light source 4 unintentionally enters the area camera 5 with such a shift under the influence of the thickness or warpage of the inspection object 2, the shift cannot be distinguished from an optical-axis shift caused by a defect, so there is a risk of misidentifying a defect on the inspection object 2.
- when the inspection object 2 is relatively thin, the curvature of the inspection object 2 is relatively small, or the incident angle (the angle formed between the Z axis shown in FIG. 2 and the light incident on the optical axis of the area camera 5) is relatively small, the optical-axis shift is slight, so the influence on the image captured by the area camera 5 is relatively small.
- as shown in FIG. 10B, when the inspection object 2 is relatively thick, the curvature of the inspection object 2 is relatively large, or the incident angle is relatively large, the optical-axis shift becomes large, and its influence on the image captured by the area camera 5 cannot be ignored.
- in the reflection/scattering method, even when the reflection angles differ only slightly, the optical-axis shift increases in proportion to the distance between the inspection object 2 and the area camera 5.
- in the present invention as well, the optical axis may shift due to the thickness or warpage of the inspection object 2, as in the conventional case; however, the influence of an optical-axis shift caused by the thickness or warpage of the inspection object 2 can be suppressed.
- in the present embodiment, the distance the inspection object 2 moves during one frame period (from the time the area camera 5 captures one image until it captures the next) is the same as the actual distance indicated by the width of the line data extracted by the data extraction unit 11, but the configuration is not limited to this.
- for example, when the inspection object 2 moves a distance corresponding to 5 pixels per frame period, the width of the line data extracted by the data extraction unit 11 may be set to 5 pixels, and the same RT-LCI processing can be performed.
- as another example, the data extraction unit 11 may extract line data (n × 1 pixels) every 5 frames, and the same RT-LCI processing can be performed.
- when the actual distance indicated by the width of the line data extracted by the data extraction unit 11 and the movement width cannot be strictly matched (for example, when the ratio of the per-pixel resolution of the area camera 5 to the movement width is not 1:1), the same RT-LCI processing can be performed by correcting the line data position using a pattern matching technique. Since such pattern matching techniques are easy to implement in hardware and various methods are known, a technique suitable for RT-LCI processing may be used.
- the image analysis device 6 includes the first to k-th areas 221 to 22k, the first to k-th determination units 121 to 12k, and the first to k-th storage units 131 to 13k as the means for storing the line data of the first to k-th lines extracted by the data extraction unit 11; instead of these, first to k-th FIFO memories that store the line data of the first to k-th lines extracted by the data extraction unit 11 may be provided.
- each FIFO memory includes first to fifth sections each storing one line of line data; each time new line data is received, the received line data is stored in the first section, the data stored in the first to fourth sections is moved to the second to fifth sections, and the data that was stored in the fifth section is discarded.
- in the present embodiment, the area camera 5 is fixed and the inspection object 2 is moved using the conveyor 3; however, the present invention is not limited to this, and it suffices that the area camera 5 and the inspection object 2 move relative to each other.
- data not related to or used in the RT-LCI processing (image data, line data, etc.) may be discarded each time, or may be saved in the same storage unit or another storage unit as a backup. For example, the image data 301 to 310 shown in FIG. 8a are not used in the RT-LCI processing after the line data have been extracted, but may be kept in the first storage unit 21 or the like.
- smoothing processing may be performed before or after the change amount calculation unit 15 calculates the gradient of the luminance values of the line composite image data.
- for example, a smoothing processing unit may be provided that generates smoothed image data by applying a smoothing process using an m-row by 1-column smoothing operator to the line composite image data composed of a plurality of line data stored in each of the first to k-th areas 221 to 22k. Performing the smoothing process makes it possible to detect small (light, thin) defects that would otherwise be buried in noise.
- as the smoothing operator, for example, a 7-row by 1-column matrix (1, 1, 1, 1, 1, 1, 1)^T can be used.
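As an illustration of such a smoothing operator (a sketch only; whether the column sum is normalized by 7 is an implementation choice the text does not fix, and the function name is hypothetical):

```python
import numpy as np

# 7-row by 1-column smoothing operator (1, 1, 1, 1, 1, 1, 1)^T
smooth_op = np.ones(7)

def smooth_center_line(composite):
    """Apply the smoothing operator down each pixel column of a 7-row
    line composite image, producing one smoothed line (normalized here)."""
    return (smooth_op @ composite) / smooth_op.size

composite = np.full((7, 3), 100.0)
composite[3, 1] += 7.0             # a one-line noise spike
smoothed = smooth_center_line(composite)
# The spike is averaged across the 7 lines: 100 + 7/7 = 101 at that pixel.
```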
- in addition, an operator calculation processing unit that performs a calculation using another operator, such as a sharpening operator, may be provided.
- the defect inspection image data combined by the image generation unit 18 may be output after binarizing the luminance values with an appropriate threshold value. This removes noise, so that defect inspection image data showing defects more clearly and accurately can be output and displayed. Furthermore, defect inspection may be performed based on whether the luminance value of each pixel of the binarized image data is the luminance representing a defect (a bright luminance in the present embodiment).
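The binarization step can be sketched as follows (illustrative; the threshold value of 50 is an arbitrary assumption, and `binarize` is a hypothetical name):

```python
import numpy as np

def binarize(defect_image, threshold=50.0):
    """Binarize defect inspection image data: pixels at or above the
    threshold become the defect luminance (255), all others become 0."""
    return np.where(defect_image >= threshold, 255, 0)

rt_lci = np.array([[3.0, 80.0, 12.0],
                   [49.0, 50.0, 200.0]])
binary = binarize(rt_lci)
# Inspection can then simply test each pixel for the defect luminance 255.
```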
- in the present embodiment, a two-dimensional image of the inspection object is captured using the knife edge 7, and the defect of the inspection object is inspected by a dark field method that detects the defect as a bright part (caused by light scattered by a defect of the inspection object 2) within the dark field of the defect inspection image data (the part where light from the linear light source 4 is blocked by the knife edge 7 and is not directly incident); however, the present invention is not limited to this.
- for example, a two-dimensional image of the inspection object may be captured without the knife edge 7, and the defect of the inspection object may be inspected by a bright field method that detects the defect as a dark part (caused by light scattering at a defect of the inspection object 2) within the bright field of the defect inspection image data (the part illuminated by the linear light source 4).
- the RT-LCI process is performed every frame period.
- the present invention is not limited to this.
- for example, the RT-LCI processing may be performed every predetermined number of frame periods, or the captured image data may be collected after imaging and subjected to line-composite-integration image processing similar to the RT-LCI processing. That is, the captured image data may be stored in a storage device or the like, and the stored image data may then be read out in time series and subjected to image processing similar to the RT-LCI processing. Such line-composite-integration processing that is not performed in real time is referred to as software LCI processing.
- in the present embodiment, the image analysis device 6 is a PC on which image processing software is installed; however, the area camera 5 may incorporate the image analysis device 6, or a capture board (a PC expansion card that captures image data from the area camera 5) may incorporate the image analysis device 6.
- in the present embodiment, the change amount calculation unit 15 performs the differential operator calculation on the partial image data stored in each area of the second storage unit 22 using the same differential operator; however, for example, the first calculation unit of the change amount calculation unit 15 may perform a differential operator calculation using a differential operator A on the line composite image data (315, etc.) stored in the first area, the second calculation unit may perform a differential operator calculation using a differential operator B different from A on the line composite image data (326, etc.) stored in the second area, and the third calculation unit may perform a differential operator calculation using a differential operator C different from A and B on the line composite image data (337, etc.) stored in the third area.
- furthermore, a calculation using another operator, such as a smoothing operator, may be performed instead of or in addition to an operator that emphasizes a luminance change.
- since the line composite image data extracted from the respective captured image data are substantially the same, enhanced image data may be generated by applying different differential operator calculations to the plurality of substantially identical line composite image data. Various kinds of defects can then be detected by integrating the plurality of enhanced image data generated using the different differential operator calculations.
- in the present embodiment, the same location determination extraction unit 16 extracts enhanced image data indicating the same location of the inspection object 2; alternatively, the same location determination extraction unit 16 may extract line composite image data, and the change amount calculation unit 15 may perform the operator calculation on the line composite image data extracted by the same location determination extraction unit 16 to generate the enhanced image data.
- that is, the change amount calculation unit 15 may perform a calculation using an operator that emphasizes luminance change on each of the plurality of line composite image data combined by the data storage unit 13, thereby generating a plurality of enhanced image data of one or more lines; the same location determination extraction unit 16 may extract, from the plurality of enhanced image data generated by the change amount calculation unit 15, the plurality of enhanced image data indicating the same location of the inspection object 2; and the integrating unit 17 may generate the defect inspection image data by integrating, pixel by pixel, the luminance values of the plurality of enhanced image data extracted by the same location determination extraction unit 16.
- alternatively, the same location determination extraction unit 16 may extract, from the plurality of line composite image data combined by the data storage unit 13, a plurality of line composite image data indicating the same location of the inspection object 2; the change amount calculation unit 15 may perform a calculation using an operator that emphasizes luminance change on each of the plurality of line composite image data extracted by the same location determination extraction unit 16, thereby generating a plurality of enhanced image data of one or more lines; and the integrating unit 17 may generate the defect inspection image data by integrating, pixel by pixel, the luminance values of the plurality of enhanced image data generated by the change amount calculation unit 15.
- in the present embodiment, a symbol (identifier) specifying (identifying) the position on the inspection object 2 is appended to the extracted line data; however, the present invention is not limited to this.
- it suffices that a symbol specifying the position on the inspection object 2 is appended to the image data (captured image data, line data, line composite image data, enhanced image data, etc.) at any point before the same location determination extraction unit 16 performs the processing of extracting the line composite image data (or enhanced image data) indicating the same location of the inspection object 2; the same location determination extraction unit 16 can then identify and extract the line composite image data (or enhanced image data) indicating the same location of the inspection object 2.
- the area camera 5 used was a normal TV-format camera with a frame rate of 60 FPS. Imaging was performed for 8 seconds, and RT-LCI processing was performed on 480 images.
- the conveyance speed of the conveyor 3 was set to 4.2 mm/second; that is, the inspection object 2 was set to move 70 μm per frame.
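The 70 μm figure follows directly from the conveyor speed and the frame rate; a quick check:

```python
# Movement of the inspection object per frame, from the figures above.
speed_mm_per_s = 4.2    # conveyor speed
fps = 60                # frame rate of the area camera
um_per_frame = speed_mm_per_s / fps * 1000  # mm -> micrometres
# 4.2 / 60 = 0.07 mm, i.e. 70 micrometres per frame
```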
- a 22 kHz high frequency fluorescent lamp was used as the linear light source 4.
- as the illumination diffuser plate 8, a milky white PMMA (polymethyl methacrylate) sheet having a thickness of 3 mm was used.
- An NT cutter was used as the knife edge 7.
- as the inspection object 2, a transparent PMMA sheet having defects such as bank marks and pits was used.
- the number of pixels of the line data extracted by the data extraction unit 11 was set to 512 × 1 pixels, and the movement width and the actual distance indicated by the width of the line data were set to coincide.
- the image shown in FIG. 11A is an original image, and is the image of the last frame (480th frame) of the moving image captured by the area camera 5.
- This original image is the same type as the image shown in FIG.
- in the original image, the dark part at the bottom is where light is blocked by the knife edge 7, the bright part near the center is where light from the linear light source 4 is transmitted, and the dark part at the top is where light from the linear light source 4 does not reach and which is outside the inspection target.
- the dark spot in FIG. 11A is the shadow of an object placed as a mark.
- the image shown in FIG. 11B is an RT-LCI image obtained by performing RT-LCI processing on the image data near the knife edge 7 of the original image and arranging the RT-LCI data in accordance with the original image. Bright parts of the RT-LCI image indicate defects. From the RT-LCI image, it can be seen that there are defects such as streak-like bank marks and dot-like pits on the inspection object 2. Furthermore, since the positions on the inspection object 2 correspond to the positions in the image, it is easy to identify, by looking at the RT-LCI image, what kind of defect exists at which position on the inspection object 2.
- FIG. 11B is an example of an RT-LCI image obtained by performing RT-LCI processing on the original image shown in FIG. 11A (the 480th frame captured by the area camera 5), with m = 7 and a 7-row by 1-column matrix (−3, −2, −1, 0, 1, 2, 3)^T as the differential operator. The RT-LCI image shown in FIG. 11B is processed so as to enhance the luminance gradient in the vertical direction of the original image data. This differential operator is a calculation process suited to detecting defects that are very light irregularities.
- an RT-LCI image can also be obtained by performing RT-LCI processing on the same original image using a different differential operator; such a differential operator can provide an arithmetic process suitable for highlighting only moderate point defects.
- FIG. 12 shows an example using another differential operator.
- FIG. 12A shows an original image taken by the area camera 5.
- FIG. 12B is a line composite image at a certain shooting angle near the edge of the original image.
- FIG. 13 shows an example in which the image is taken with the knife edge inclined with respect to the horizontal direction of the area camera.
- FIG. 13A is an original image taken by the area camera 5.
- FIG. 13B is a line composite image at a certain shooting angle near the edge of the original image, which is equivalent to an image obtained by a defect inspection system using a conventional line sensor.
- Because the knife edge 7 is inclined, the left side of the image is bright and the right side is dark.
- In a defect inspection system using a conventional line sensor, the captured image is thus strongly affected by the arrangement relationship between the line sensor and the knife edge. That is, conventionally, an accurate inspection result cannot be obtained unless the optical system, including the line sensor and the knife edge, is positioned with high accuracy.
- FIG. 13C is an RT-LCI image obtained by performing RT-LCI processing on the original image using the 5-row, 1-column matrix (−2, −1, 0, 1, 2)^T as the differential operator. As FIG. 13C shows, unlike FIG. 13B, the RT-LCI image is not affected by the inclination of the knife edge 7. That is, the defect inspection system 1 of the present invention can detect defects with high accuracy even when the knife edge 7 is inclined with respect to the horizontal direction of the area camera 5.
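The tilt insensitivity has a simple algebraic core: the coefficients of the differential operator (−2, −1, 0, 1, 2)^T sum to zero, so any per-column brightness offset, such as the left-bright/right-dark shading produced by the inclined knife edge, cancels out. A toy NumPy check of this property (not the full RT-LCI pipeline; the data here are invented):

```python
import numpy as np

# The 5x1 differential operator used for FIG. 13C.
kernel = np.array([-2, -1, 0, 1, 2], dtype=float)

ramp = np.arange(5, dtype=float).reshape(5, 1) * np.ones((1, 6))  # 5 x 6 composite
tilt = np.linspace(10.0, 0.0, 6)  # left-bright / right-dark column offsets

resp_flat = np.abs(kernel @ ramp)
resp_tilted = np.abs(kernel @ (ramp + tilt))  # tilt adds a constant per column
print(np.allclose(resp_flat, resp_tilted))  # True: the offsets cancel
```

Because the kernel's weighted sum annihilates any constant added to a column, only genuine luminance *changes* along the movement direction survive, regardless of how the knife edge shades the field.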
- FIG. 14A shows an original image taken by the area camera. FIG. 14B is a line composite image at a certain shooting angle near the edge of the original image. FIG. 14C is an RT-LCI image obtained by performing RT-LCI processing with m = 7, using the 7-row, 1-column matrix (1, 1, 1, 1, 1, 1, 1)^T as a smoothing operator in place of the differential operator.
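For contrast with the differential operators, a minimal sketch of what the all-ones smoothing operator does: it averages (up to scale) the seven buffered lines instead of differentiating them. The sample luminance values below are invented:

```python
import numpy as np

smooth = np.ones(7)  # the 7x1 smoothing operator (1,1,1,1,1,1,1)^T
lines = np.array([
    [10.4,  9.6],
    [ 9.8, 10.2],
    [10.1,  9.9],
    [ 9.9, 10.1],
    [10.2,  9.8],
    [ 9.7, 10.3],
    [ 9.9, 10.1],
])  # seven buffered lines of slightly noisy luminance around 10
result = smooth @ lines / 7.0  # per-column average of the seven lines
print(result)  # both columns average back to about 10.0
```

Where the differential operators suppress constant luminance and keep changes, the smoothing operator does the opposite: frame-to-frame noise averages out while stable structure remains.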
- Each block of the image analysis device 6, in particular the data extraction unit 11, the first partition determination unit 12, the data storage unit 13, the all-partition determination unit 14, the change amount calculation unit 15, the same-location determination/extraction unit 16, the integration unit 17, and the image generation unit 18, may be configured by hardware logic such as an FPGA (Field Programmable Gate Array) circuit, or may be realized by software using a CPU as follows.
- The image analysis device 6 can be realized, for example, by adding an arithmetic unit for line extraction, an area FIFO memory, a comparator for binarization, and the like to an FPGA circuit that realizes the m-row, 1-column differential operator computation.
- An FPGA circuit that realizes the m-row, 1-column differential operator computation comprises m line FIFO memories for storing the image data of each row, m flip-flops with enable terminals (DFFEs) for storing the differential operator coefficients (filter coefficients), m multipliers that multiply the image data of each row by the corresponding differential operator coefficient, an adder circuit that sums the multiplication results, and the like.
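A software model may help clarify the data flow of this circuit. The following sketch emulates the m line FIFOs and the multiply-accumulate stage; it is an illustrative model, not a hardware description, and the class and method names are invented:

```python
from collections import deque
import numpy as np

class DiffOperatorPipeline:
    """Software model of the FPGA datapath sketched above: m line FIFOs feed
    m multipliers (one per differential-operator coefficient) and an adder tree."""
    def __init__(self, coeffs):
        self.coeffs = list(coeffs)              # held in the DFFEs in hardware
        self.lines = deque(maxlen=len(coeffs))  # the m line FIFO memories

    def push_line(self, line):
        """Feed one line of image data; returns one line of emphasized data
        once m lines have accumulated, else None."""
        self.lines.append(np.asarray(line, dtype=float))
        if len(self.lines) < len(self.coeffs):
            return None
        # m multipliers + adder circuit: weighted sum across the stored lines.
        acc = sum(c * ln for c, ln in zip(self.coeffs, self.lines))
        return np.abs(acc)

pipe = DiffOperatorPipeline([-1, 0, 1])  # small m = 3 example
for row in ([1, 1], [2, 2], [4, 4]):
    out = pipe.push_line(row)
print(out)  # weighted vertical difference of the three buffered lines
```

Each new line displaces the oldest buffered line, so the circuit emits one emphasized line per input line once the pipeline is full, which matches the streaming, real-time character of the RT-LCI processing.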
- The change amount calculation unit 15 may set the numbers of rows and columns of the differential operator to be used as appropriate, in accordance with the number of lines of the line composite image data stored in the second storage unit 22. Furthermore, the emphasized image data generated as a result of the calculation by the change amount calculation unit 15 is not limited to data composed of one line as described above; it may consist of a plurality of lines.
- In the above description, the change amount calculation unit 15 calculates the luminance gradient in the vertical direction; however, the present invention is not limited to this, and the luminance gradient in the horizontal direction may be calculated instead.
- In the above description, the defect inspection image data is generated in accordance with the size (number of pixels) of the image data captured by the area camera 5, but the present invention is not limited to this.
- The above RT-LCI data itself may be generated as the defect inspection image data, and defects on the inspection object may be detected on the basis of the generated data.
- Finally, the image analysis device 6 includes a CPU (central processing unit) that executes the instructions of a control program realizing each function, a ROM (read-only memory) that stores the program, a RAM (random access memory) into which the program is loaded, and a storage device (recording medium) such as a memory that stores the program and various data.
- The object of the present invention can also be achieved by supplying the image analysis device 6 with a recording medium on which the program code (an executable program, an intermediate code program, or a source program) of the control program of the image analysis device 6, which is software realizing the functions described above, is recorded in a computer-readable manner, and by having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
- Examples of the recording medium include tape systems such as magnetic tapes and cassette tapes; disk systems including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical discs such as CD-ROM/MO/MD/DVD/CD-R;
- card systems such as IC cards (including memory cards) and optical cards; and semiconductor memory systems such as mask ROM/EPROM/EEPROM/flash ROM.
- the image analysis device 6 may be configured to be connectable to a communication network, and the program code may be supplied via the communication network.
- the communication network is not particularly limited.
- For example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, a satellite communication network, and the like are available.
- the transmission medium constituting the communication network is not particularly limited.
- For example, wired media such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL lines can be used,
- as can wireless media such as IrDA or remote-control infrared, Bluetooth (registered trademark), IEEE 802.11 wireless, HDR, mobile phone networks, satellite links, and terrestrial digital networks.
- The present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.
- As described above, the present invention relates to a defect inspection system for inspecting an inspection object such as a sheet for defects, and can be used for the defect inspection imaging device, defect inspection image processing device, defect inspection image processing program, computer-readable recording medium on which the program is recorded, and defect inspection image processing method used therein.
Description
However, the dark-field method has the problem that, because the arrangement of the optical system is difficult in practice as described above, it is difficult to inspect the inspection object for defects with high accuracy.
θ = arctan(70 [μm/pixel] × 10^-3 [mm/μm] × 7 [pixel] / 300 [mm]) ≈ 0.09 [degree].
Thus, since the difference in shooting distance is on the order of 10^-6, it is negligible. Accordingly, in this case, shooting can be performed under optical conditions equivalent to fifteen line sensors, each with a shooting range of 70 μm (one line sensor on the X axis and seven line sensors arranged on each side of it), shooting in parallel.
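The angle above follows from the stated geometry (70 μm/pixel resolution, an offset of 7 pixels, and a 300 mm shooting distance). A quick check, with variable names invented for illustration:

```python
import math

resolution_mm_per_px = 70e-3   # 70 micrometers per pixel, in mm
offset_px = 7                  # 7 line positions away from the optical axis
distance_mm = 300.0            # shooting distance

theta_deg = math.degrees(math.atan(resolution_mm_per_px * offset_px / distance_mm))
print(f"{theta_deg:.2f} degrees")  # ≈ 0.09, matching the value in the text

# The resulting difference in shooting distance is negligible:
extra = distance_mm / math.cos(math.radians(theta_deg)) - distance_mm
print(f"relative difference: {extra / distance_mm:.1e}")  # on the order of 1e-6
```

This confirms both stated figures: the 0.09-degree viewing-angle spread and the 10^-6-order difference in shooting distance across the parallel lines.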
A line is typically a partial image of 1 pixel × W pixels aligned on a single straight line along the width direction of the formed sheet 2.
Furthermore, image processing is performed on each of the generated line composite images corresponding to the plurality of shooting angles, and by selecting and referring to an image in which defects appear prominently from among the processed line composite images, various defects of the inspection object can be clearly recognized.
The number k of shooting-angle types and the number m of rows of the differential operator can be set arbitrarily and are predetermined. For simplicity of explanation, the distance the inspection object 2 moves between the capture of one image and the next (one frame period) is defined as the movement width, and the actual distance (distance on the surface of the inspection object 2) represented by the width of the line data (one line of partial image data) extracted by the data extraction unit described later is assumed to be equal to this movement width. The number of columns of the differential operator may be two or more, but here it is assumed to be one.
When the first determination unit 121 determines that line data is present in the first partition of the first area 221 (YES in S51), the first storage unit 131 moves the storage location of the line data stored in each partition of the first area 221 up by one partition (S61). After moving the storage locations of the line data stored in each partition, the first storage unit 131 stores the line data extracted by the first extraction unit 111 in the first partition of the first area 221 (S71). On the other hand, when the first determination unit 121 determines that no line data is present in the first partition of the first area 221 (NO in S51), the first storage unit 131 stores the line data extracted by the first extraction unit 111 in the first partition of the first area 221 (S71).
When the second determination unit 122 determines that line data is present in the first partition of the second area 222 (YES in S52), the second storage unit 132 moves the storage location of the line data stored in each partition of the second area 222 up by one partition (S62). After moving the storage locations of the line data stored in each partition, the second storage unit 132 stores the line data extracted by the second extraction unit 112 in the first partition of the second area 222 (S72). On the other hand, when the second determination unit 122 determines that no line data is present in the first partition of the second area 222 (NO in S52), the second storage unit 132 stores the line data extracted by the second extraction unit 112 in the first partition of the second area 222 (S72).
In addition, 440 shown in FIG. 8a indicates the defect inspection image data (RT-LCI data) stored in the fourth storage unit 24, and 450 shown in FIG. 8a indicates the image displayed on the display unit 30. For convenience of explanation, the corresponding portions of the inspection object 2 (p3, p4, ...) are also appended, line by line, to the RT-LCI data 361 to 370 and the images 381 to 384.
Otherwise, the conditions used in the description of FIGS. 8a to 8c apply as they are. FIG. 9 is a diagram showing an example of the line composite image data consisting of a plurality of line data stored in the second storage unit 22, the differential operator used by the change amount calculation unit 15, and the values of the emphasized image data calculated by the change amount calculation unit 15.
In this case, each FIFO memory has first to fifth partitions that each store one line of line data; every time it receives new line data, it stores the received line data in the first partition, moves the data stored in the second to fourth partitions into the third to fifth partitions, and discards the data that was stored in the fifth partition.
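The partitioned FIFO behavior can be modeled in a few lines, assuming standard shift-register semantics (each stored line moves down one partition per push, and the oldest line is discarded); the class and the sample line labels are invented for illustration:

```python
from collections import deque

class LineFIFO:
    """Illustrative model of one 5-partition line FIFO: the newest line sits in
    partition 1, lines shift down one partition per push, and the line in
    partition 5 is discarded."""
    def __init__(self, depth=5):
        self.partitions = deque(maxlen=depth)

    def push(self, line):
        self.partitions.appendleft(line)  # partition 1 = index 0; oldest falls off

    def snapshot(self):
        return list(self.partitions)

fifo = LineFIFO()
for n in range(1, 7):       # push lines L1 .. L6
    fifo.push(f"L{n}")
print(fifo.snapshot())      # newest first; L1 has already been discarded
```

At any moment the five partitions hold the five most recent lines, which is exactly the window the 5-row differential operator needs.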
An example in which RT-LCI processing was applied to image data captured by the area camera 5 in the defect inspection system 1 shown in FIG. 3 is described below. A double-speed progressive-scan monochrome camera module (XC-HR50, Sony Corporation) was used as the body of the area camera 5. The lens of the area camera 5 was a Tamron Co., Ltd. lens (focal length f = 35 mm) fitted with a 5 mm extension ring. The area camera 5 has 512 × 480 pixels, and the resolution per pixel is 70 μm/pixel. The area camera 5 was focused on the surface of the inspection object. The frame rate of the area camera 5 was 60 FPS, and images were captured in a standard TV format. Images were captured for 8 seconds with the area camera, and RT-LCI processing was performed on 480 images. The conveying speed of the conveyor 3 was set to 4.2 mm/s, that is, so that the inspection object 2 moves 70 μm per frame. A 22 kHz high-frequency fluorescent lamp was used as the linear light source 4. A milky-white PMMA (polymethyl methacrylate) sheet 3 mm thick was used as the illumination diffuser 8. An NT cutter was used as the knife edge 7. A transparent PMMA sheet having defects such as bank marks and pockmarks was used as the inspection object 2.
The number of pixels of the line data extracted by the data extraction unit 11 was set to 512 × 1 pixels, and the movement width was set to match the actual distance represented by the width of the line data.
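The setup's numbers are mutually consistent: at 60 FPS, a conveying speed of 4.2 mm/s moves the object exactly one 70 μm pixel per frame, and 8 seconds yields 480 frames. A quick arithmetic check (variable names are illustrative):

```python
fps = 60                 # area camera frame rate [frames/s]
speed_mm_s = 4.2         # conveyor speed [mm/s]
resolution_um = 70       # resolution of the area camera [um/pixel]
duration_s = 8           # capture time [s]

movement_per_frame_um = speed_mm_s / fps * 1000  # object travel per frame [um]
frames = fps * duration_s
print(round(movement_per_frame_um, 6), frames)   # one pixel of travel per frame, 480 frames
```

This one-pixel-per-frame relationship is what lets the movement width equal the actual distance represented by one line of line data, as required above.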
2 Inspection object
3 Conveyor (moving means)
4 Linear light source
5 Area camera (imaging unit; part of the defect inspection imaging device)
6 Image analysis device (defect inspection image processing device)
7 Knife edge
8 Illumination diffuser
10 Image processing unit (part of the defect inspection image processing device; part of the defect inspection imaging device)
11 Data extraction unit (same-line extraction means)
12 First partition determination unit
13 Data storage unit (line composition means)
14 All-partition determination unit
15 Change amount calculation unit (operator computation means)
16 Same-location determination/extraction unit
17 Integration unit (integration means)
18 Image generation unit (image generation means)
20 Storage unit (part of the defect inspection image processing device; part of the defect inspection imaging device)
21 First storage unit
22 Second storage unit
23 Third storage unit
24 Fourth storage unit
30 Display unit
Claims (10)
- An image processing device for defect inspection which processes image data of two-dimensional images of an inspection object captured temporally continuously by an imaging unit while the inspection object and the imaging unit are moved relative to each other, and thereby generates defect inspection image data for inspecting the inspection object for defects, the device comprising:
same-line extraction means for extracting, from a plurality of different image data, one line of line data located at the same position on the image data; and
line composition means for arranging the line data extracted by the same-line extraction means in time series to generate line composite image data of a plurality of lines, wherein
the same-line extraction means extracts the line data at each of a plurality of different positions on the image data, and
the line composition means arranges the line data extracted by the same-line extraction means in time series for each position on the image data to generate a plurality of different line composite image data, the device further comprising:
operator computation means for performing, on each of the plurality of line composite image data, a computation using an operator that emphasizes luminance changes, to generate a plurality of emphasized image data each consisting of one line or a plurality of lines; and
integration means for integrating, pixel by pixel, the luminance values of the plurality of emphasized image data representing the same location on the inspection object to generate the defect inspection image data. - The device according to claim 1, wherein the operator computation means performs a computation using a differential operator on the plurality of line composite image data to calculate, for each pixel in the center line of each of the plurality of line composite image data, the gradient of the luminance value along the direction orthogonal to the center line, and generates new one-line emphasized image data by replacing the luminance value of each pixel in the center line with the absolute value of the gradient of the luminance value at that pixel.
- The integration means integrates the plurality of emphasized image data respectively representing a plurality of locations on the inspection object, for each location on the inspection object, by integrating the luminance values of the emphasized image data pixel by pixel, to generate a plurality of defect inspection image data respectively representing the plurality of locations on the inspection object,
and the device according to claim 1 or 2 further comprises image generation means for composing new defect inspection image data by arranging the plurality of defect inspection image data respectively representing the plurality of locations on the inspection object in correspondence with those locations. - The device according to any one of claims 1 to 3, wherein, each time the imaging unit captures an image, the integration means integrates pixel by pixel, for each location on the inspection object in order from the leading location, the luminance values of the plurality of emphasized image data representing the same location on the inspection object, to generate a plurality of defect inspection image data respectively representing a plurality of locations on the inspection object.
- A defect inspection imaging device comprising: the device according to any one of claims 1 to 4; and
an imaging unit that temporally continuously captures two-dimensional images of an inspection object while the inspection object and the imaging unit are moved relative to each other. - A defect inspection system for inspecting an inspection object for defects, the system comprising:
the device according to claim 5; and
moving means for moving the inspection object and the imaging unit relative to each other. - The system according to claim 6, further comprising: a light source that irradiates the inspection object with light; and
a light blocking body that partially blocks the light from the light source that passes through or is reflected by the inspection object and enters the imaging unit,
wherein the system inspects the inspection object for defects using a dark-field method. - An image processing program for defect inspection for operating the device according to any one of claims 1 to 4, the program causing a computer to function as all of the above means.
- A computer-readable recording medium on which the program according to claim 8 is recorded.
- An image processing method for defect inspection which processes image data of two-dimensional images of an inspection object captured temporally continuously by an imaging unit while the inspection object and the imaging unit are moved relative to each other, and thereby generates defect inspection image data for inspecting the inspection object for defects, the method comprising:
a same-line extraction step of extracting, from a plurality of different image data, one line of line data located at the same position on the image data; and
a line composition step of arranging the line data extracted in the same-line extraction step in time series to generate composite image data of a plurality of lines, wherein
the same-line extraction step extracts the line data at each of a plurality of different positions on the image data, and
the line composition step arranges the line data extracted in the same-line extraction step in time series for each position on the image data to generate a plurality of different line composite image data, the method further comprising:
an operator computation step of performing, on each of the plurality of line composite image data, a computation using an operator that emphasizes luminance changes, to generate a plurality of emphasized image data each consisting of one line or a plurality of lines; and
an integration step of integrating, pixel by pixel, the luminance values of the plurality of emphasized image data representing the same location on the inspection object to generate the defect inspection image data.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020127013652A KR101682744B1 (ko) | 2009-10-30 | 2010-09-29 | 결함 검사용 화상 처리 장치 및 결함 검사용 화상 처리 방법 |
CA2778128A CA2778128A1 (en) | 2009-10-30 | 2010-09-29 | Image processing device for defect inspection and image processing method for defect inspection |
CN201080048759.7A CN102630299B (zh) | 2009-10-30 | 2010-09-29 | 缺陷检查用图像处理装置和缺陷检查用图像处理方法 |
US13/504,791 US20130128026A1 (en) | 2009-10-30 | 2010-09-29 | Image processing device for defect inspection and image processing method for defect inspection |
EP10826466A EP2495552A1 (en) | 2009-10-30 | 2010-09-29 | Image processing device for defect inspection and image processing method for defect inspection |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009251107A JP4726983B2 (ja) | 2009-10-30 | 2009-10-30 | 欠陥検査システム、並びに、それに用いる、欠陥検査用撮影装置、欠陥検査用画像処理装置、欠陥検査用画像処理プログラム、記録媒体、および欠陥検査用画像処理方法 |
JP2009-251107 | 2009-10-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011052332A1 true WO2011052332A1 (ja) | 2011-05-05 |
Family
ID=43921760
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/066934 WO2011052332A1 (ja) | 2009-10-30 | 2010-09-29 | 欠陥検査用画像処理装置および欠陥検査用画像処理方法 |
Country Status (8)
Country | Link |
---|---|
US (1) | US20130128026A1 (ja) |
EP (1) | EP2495552A1 (ja) |
JP (1) | JP4726983B2 (ja) |
KR (1) | KR101682744B1 (ja) |
CN (1) | CN102630299B (ja) |
CA (1) | CA2778128A1 (ja) |
TW (1) | TWI484161B (ja) |
WO (1) | WO2011052332A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104583761A (zh) * | 2012-08-28 | 2015-04-29 | 住友化学株式会社 | 缺陷检查装置以及缺陷检查方法 |
JPWO2016121878A1 (ja) * | 2015-01-29 | 2018-01-11 | 株式会社 デクシス | 光学式外観検査装置、及びこれを用いた光学式外観検査システム |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103210296B (zh) | 2010-06-01 | 2016-08-10 | 阿克莱机械公司 | 检查系统 |
JP5808015B2 (ja) * | 2012-03-23 | 2015-11-10 | 富士フイルム株式会社 | 欠陥検査方法 |
JP2014035326A (ja) * | 2012-08-10 | 2014-02-24 | Toshiba Corp | 欠陥検査装置 |
JP5946751B2 (ja) * | 2012-11-08 | 2016-07-06 | 株式会社日立ハイテクノロジーズ | 欠陥検出方法及びその装置並びに欠陥観察方法及びその装置 |
JP5957378B2 (ja) * | 2012-12-28 | 2016-07-27 | 株式会社日立ハイテクノロジーズ | 欠陥観察方法および欠陥観察装置 |
CN104919305B (zh) * | 2013-01-16 | 2017-05-10 | 住友化学株式会社 | 图像生成装置、缺陷检查装置以及缺陷检查方法 |
WO2014119772A1 (ja) * | 2013-01-30 | 2014-08-07 | 住友化学株式会社 | 画像生成装置、欠陥検査装置および欠陥検査方法 |
US9575008B2 (en) * | 2014-02-12 | 2017-02-21 | ASA Corporation | Apparatus and method for photographing glass in multiple layers |
JP6213662B2 (ja) | 2014-03-07 | 2017-10-18 | 新日鐵住金株式会社 | 表面性状指標化装置、表面性状指標化方法及びプログラム |
JP2015225041A (ja) * | 2014-05-29 | 2015-12-14 | 住友化学株式会社 | 積層偏光フィルムの欠陥検査方法 |
US9816939B2 (en) | 2014-07-22 | 2017-11-14 | Kla-Tencor Corp. | Virtual inspection systems with multiple modes |
US9443300B2 (en) * | 2014-09-15 | 2016-09-13 | The Boeing Company | Systems and methods for analyzing a bondline |
WO2016055866A1 (en) * | 2014-10-07 | 2016-04-14 | Shamir Optical Industry Ltd. | Methods and apparatus for cleaning blocked ophthalmic lenses |
JP2017049974A (ja) * | 2015-09-04 | 2017-03-09 | キヤノン株式会社 | 識別器生成装置、良否判定方法、およびプログラム |
JP2017215277A (ja) * | 2016-06-02 | 2017-12-07 | 住友化学株式会社 | 欠陥検査システム、フィルム製造装置及び欠陥検査方法 |
JP2017219343A (ja) * | 2016-06-03 | 2017-12-14 | 住友化学株式会社 | 欠陥検査装置、欠陥検査方法、フィルム製造装置及びフィルム製造方法 |
JP6859627B2 (ja) * | 2016-08-09 | 2021-04-14 | 株式会社ジェイテクト | 外観検査装置 |
CN108885180B (zh) * | 2017-04-12 | 2021-03-23 | 意力(广州)电子科技有限公司 | 一种显示屏的检测方法及装置 |
JP2018205099A (ja) * | 2017-06-02 | 2018-12-27 | コニカミノルタ株式会社 | 筒状物の検査装置 |
JP6568664B2 (ja) | 2017-06-29 | 2019-08-28 | 住友化学株式会社 | 特異部検知システム及び特異部検知方法 |
JP6970550B2 (ja) * | 2017-07-24 | 2021-11-24 | 住友化学株式会社 | 欠陥検査システム及び欠陥検査方法 |
JP6970549B2 (ja) * | 2017-07-24 | 2021-11-24 | 住友化学株式会社 | 欠陥検査システム及び欠陥検査方法 |
JP2019023587A (ja) * | 2017-07-24 | 2019-02-14 | 住友化学株式会社 | 欠陥検査システム及び欠陥検査方法 |
KR102045818B1 (ko) * | 2017-11-06 | 2019-12-02 | 동우 화인켐 주식회사 | 투과 광학계 검사 장치 및 이를 이용한 결함 검사 방법 |
JP2019135460A (ja) * | 2018-02-05 | 2019-08-15 | 株式会社Screenホールディングス | 画像取得装置、画像取得方法および検査装置 |
CN109030512B (zh) * | 2018-08-23 | 2021-09-07 | 红塔烟草(集团)有限责任公司 | 烟条单相机重复视觉检测装置及方法 |
US10481097B1 (en) * | 2018-10-01 | 2019-11-19 | Guardian Glass, LLC | Method and system for detecting inclusions in float glass based on spectral reflectance analysis |
KR102372714B1 (ko) * | 2020-01-31 | 2022-03-10 | 한국생산기술연구원 | 딥러닝 기반 자동 결함 검사 장치 및 방법 |
WO2021166058A1 (ja) * | 2020-02-18 | 2021-08-26 | 日本電気株式会社 | 画像認識装置、画像認識方法、及び、記録媒体 |
JP2021196242A (ja) * | 2020-06-12 | 2021-12-27 | パナソニックIpマネジメント株式会社 | 検出方法および検出装置 |
CN112069974B (zh) * | 2020-09-02 | 2023-04-18 | 安徽铜峰电子股份有限公司 | 一种识别元器件缺损的图像识别方法及其系统 |
CN116507907A (zh) * | 2020-11-30 | 2023-07-28 | 柯尼卡美能达株式会社 | 分析装置、检查系统及学习装置 |
JP2022131075A (ja) * | 2021-02-26 | 2022-09-07 | 株式会社日立パワーソリューションズ | 超音波検査装置 |
US11394851B1 (en) * | 2021-03-05 | 2022-07-19 | Toshiba Tec Kabushiki Kaisha | Information processing apparatus and display method |
CN115032148B (zh) * | 2022-06-06 | 2023-04-25 | 苏州天准科技股份有限公司 | 一种片材棱面检测方法及规整检测暂存站 |
CN115115625B (zh) * | 2022-08-26 | 2022-11-04 | 聊城市正晟电缆有限公司 | 基于图像处理的电缆生产异常检测方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003250078A (ja) * | 2002-02-22 | 2003-09-05 | Nippon Telegr & Teleph Corp <Ntt> | 平行投影画像生成装置 |
JP2005181014A (ja) * | 2003-12-17 | 2005-07-07 | Hitachi Software Eng Co Ltd | 画像読取装置及び画像読取方法 |
JP2008256402A (ja) * | 2007-04-02 | 2008-10-23 | Purex:Kk | シート状物品の検査方法および装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1156828A (ja) * | 1997-08-27 | 1999-03-02 | Fuji Photo Film Co Ltd | 異常陰影候補検出方法および装置 |
US7796801B2 (en) * | 1999-08-26 | 2010-09-14 | Nanogeometry Research Inc. | Pattern inspection apparatus and method |
TW500920B (en) * | 2000-03-24 | 2002-09-01 | Olympus Optical Co | Defect detecting apparatus |
JP4201478B2 (ja) * | 2000-11-08 | 2008-12-24 | 住友化学株式会社 | シート状製品の欠陥マーキング装置 |
JP4230880B2 (ja) * | 2003-10-17 | 2009-02-25 | 株式会社東芝 | 欠陥検査方法 |
JP2005201783A (ja) * | 2004-01-16 | 2005-07-28 | Sumitomo Chemical Co Ltd | 枚葉フィルムの欠陥検査装置 |
CN100489508C (zh) * | 2004-07-21 | 2009-05-20 | 欧姆龙株式会社 | 基板检查方法及装置 |
JP4755888B2 (ja) * | 2005-11-21 | 2011-08-24 | 住友化学株式会社 | 枚葉フィルム検査装置及び枚葉フィルム検査方法 |
JP5006551B2 (ja) * | 2006-02-14 | 2012-08-22 | 住友化学株式会社 | 欠陥検査装置及び欠陥検査方法 |
JP5260320B2 (ja) * | 2006-02-15 | 2013-08-14 | ドウジン セミケム カンパニー リミテッド | 平板表示装置の検査システム及び検査方法 |
JP4796860B2 (ja) * | 2006-02-16 | 2011-10-19 | 住友化学株式会社 | オブジェクト検出装置及びオブジェクト検出方法 |
US7567344B2 (en) * | 2006-05-12 | 2009-07-28 | Corning Incorporated | Apparatus and method for characterizing defects in a transparent substrate |
JP2007333563A (ja) | 2006-06-15 | 2007-12-27 | Toray Ind Inc | 光透過性シートの検査装置および検査方法 |
JP2008216590A (ja) * | 2007-03-02 | 2008-09-18 | Hoya Corp | グレートーンマスクの欠陥検査方法及び欠陥検査装置、フォトマスクの欠陥検査方法、グレートーンマスクの製造方法、並びにパターン転写方法 |
JP2008292171A (ja) | 2007-05-22 | 2008-12-04 | Toray Ind Inc | 表面検査装置、表面検査方法および高分子フィルム表面検査方法 |
JP5367292B2 (ja) * | 2008-03-31 | 2013-12-11 | 古河電気工業株式会社 | 表面検査装置および表面検査方法 |
-
2009
- 2009-10-30 JP JP2009251107A patent/JP4726983B2/ja active Active
-
2010
- 2010-09-29 CN CN201080048759.7A patent/CN102630299B/zh active Active
- 2010-09-29 WO PCT/JP2010/066934 patent/WO2011052332A1/ja active Application Filing
- 2010-09-29 KR KR1020127013652A patent/KR101682744B1/ko active IP Right Grant
- 2010-09-29 EP EP10826466A patent/EP2495552A1/en not_active Withdrawn
- 2010-09-29 CA CA2778128A patent/CA2778128A1/en not_active Abandoned
- 2010-09-29 US US13/504,791 patent/US20130128026A1/en not_active Abandoned
- 2010-10-25 TW TW099136292A patent/TWI484161B/zh active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003250078A (ja) * | 2002-02-22 | 2003-09-05 | Nippon Telegr & Teleph Corp <Ntt> | 平行投影画像生成装置 |
JP2005181014A (ja) * | 2003-12-17 | 2005-07-07 | Hitachi Software Eng Co Ltd | 画像読取装置及び画像読取方法 |
JP2008256402A (ja) * | 2007-04-02 | 2008-10-23 | Purex:Kk | シート状物品の検査方法および装置 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104583761A (zh) * | 2012-08-28 | 2015-04-29 | 住友化学株式会社 | 缺陷检查装置以及缺陷检查方法 |
CN104583761B (zh) * | 2012-08-28 | 2016-12-21 | 住友化学株式会社 | 缺陷检查装置以及缺陷检查方法 |
JPWO2016121878A1 (ja) * | 2015-01-29 | 2018-01-11 | 株式会社 デクシス | 光学式外観検査装置、及びこれを用いた光学式外観検査システム |
JP7026309B2 (ja) | 2015-01-29 | 2022-02-28 | 株式会社 デクシス | 光学式外観検査装置、及びこれを用いた光学式外観検査システム |
Also Published As
Publication number | Publication date |
---|---|
KR20120091249A (ko) | 2012-08-17 |
CN102630299B (zh) | 2014-07-30 |
TWI484161B (zh) | 2015-05-11 |
EP2495552A1 (en) | 2012-09-05 |
KR101682744B1 (ko) | 2016-12-05 |
US20130128026A1 (en) | 2013-05-23 |
CA2778128A1 (en) | 2011-05-05 |
JP2011095171A (ja) | 2011-05-12 |
CN102630299A (zh) | 2012-08-08 |
JP4726983B2 (ja) | 2011-07-20 |
TW201140040A (en) | 2011-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4726983B2 (ja) | 欠陥検査システム、並びに、それに用いる、欠陥検査用撮影装置、欠陥検査用画像処理装置、欠陥検査用画像処理プログラム、記録媒体、および欠陥検査用画像処理方法 | |
CA2615117C (en) | Apparatus and methods for inspecting a composite structure for inconsistencies | |
JP2017215277A (ja) | 欠陥検査システム、フィルム製造装置及び欠陥検査方法 | |
KR20190011198A (ko) | 결함 검사 시스템 및 결함 검사 방법 | |
KR20120109548A (ko) | 외관 검사 장치 | |
JP2016156647A (ja) | 電磁波を使用した検査装置 | |
KR101828536B1 (ko) | 패널 검사 방법 및 장치 | |
JP2015075483A (ja) | 光透過性フィルムの欠陥検出方法 | |
TWI607212B (zh) | Image generation device, defect inspection device, and defect inspection method | |
CN111103309A (zh) | 用于检测透明材质物体瑕疵的方法 | |
JP2011145305A (ja) | 欠陥検査システム、並びに、それに用いる、欠陥検査用撮影装置、欠陥検査用画像処理装置、欠陥検査用画像処理プログラム、記録媒体、および欠陥検査用画像処理方法 | |
JP2008249413A (ja) | 欠陥検出方法および装置 | |
KR20120109547A (ko) | 외관 검사 장치 | |
JP2019023589A (ja) | 欠陥検査システム及び欠陥検査方法 | |
JP2021092439A (ja) | 照明最適化方法、制御装置、及びプログラム | |
KR20190011199A (ko) | 결함 검사 시스템 및 결함 검사 방법 | |
JP7293907B2 (ja) | 外観検査管理システム、外観検査管理装置、外観検査管理方法及びプログラム | |
JP2005083906A (ja) | 欠陥検出装置 | |
JP5231779B2 (ja) | 外観検査装置 | |
JP7298333B2 (ja) | 外観検査管理システム、外観検査管理装置、外観検査管理方法及びプログラム | |
JP2014106141A (ja) | 欠陥検査装置、及び欠陥検査方法 | |
JP2010175283A (ja) | 面画像生成装置 | |
JP2007101239A (ja) | 外観検査装置及び外観検査方法 | |
Perng et al. | A novel vision system for CRT panel auto-inspection | |
JP6996363B2 (ja) | シート状物の欠陥検査装置及び製造方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080048759.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10826466 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2778128 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010826466 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20127013652 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13504791 Country of ref document: US |