CN115684189A - Thin film inspection device, thin film inspection method, and recording medium - Google Patents
- Publication number
- CN115684189A (application number CN202210870412.5A)
- Authority
- CN
- China
- Prior art keywords
- thin film
- defect
- film
- processing
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/89—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
- G01N21/8901—Optical details; Scanning details
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/89—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
- G01N21/892—Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles characterised by the flaw, defect or object feature examined
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8854—Grading and classifying of flaws
- G01N2021/8858—Flaw counting
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8883—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges involving the calculation of gauges, generating models
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8887—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
Abstract
The invention relates to a thin film inspection apparatus, a thin film inspection method, and a recording medium, whose object is to detect defects generated in a thin film easily and without manual monitoring. The film inspection apparatus optically inspects defects generated in a film in a stacked state: a light source irradiates light onto an inspection area of the film (step S1), a camera (optical sensor) photographs the inspection area and detects the diffused component of the light reflected there (step S2), and a control section performs data processing on the camera's output signal to detect defects (step S5).
Description
Technical Field
The invention relates to a thin film inspection apparatus, a thin film inspection method and a recording medium.
Background
When a long resin film is wound into a roll, various defects occur depending on the state of the film (film thickness variation, residual stress, temperature, humidity) and on the winding conditions (tension, speed, entrained air). Typical defects include a dark streak in the circumferential direction of the roll caused by adjacent film layers adhering to each other (a dark band), a rib-like deformation in the circumferential direction (a longitudinal wrinkle), and a deformation running in an oblique direction (an oblique wrinkle).
Conventionally, the winding of a film roll has been monitored visually, because a roll with a defect must be detected as a defective product. Specifically, an operator watches the film roll for several minutes during winding and records the state of the roll at intervals of several tens of millimeters. Defect detection on the film thus depends on human observation, and the judgment varies from person to person.
In contrast, attempts have been made to automatically detect specific defects, such as wrinkles or flaws occurring while a roll film is wound, by applying image processing to images obtained by reading the film. For example, a technique has been proposed in which light is irradiated onto a roll film, the roll is photographed by a camera, and wrinkles are detected from the light reflected by the roll (see Patent Document 1).
Prior art documents
Patent literature
Patent Document 1: Japanese Patent Laid-Open Publication No. 2002-365225
Disclosure of Invention
However, the technique described in Patent Document 1 uses regular (specular) reflection light. It can therefore detect a streak-like abnormal reflection region caused by a wrinkle (unevenness of the film surface) from the image data obtained by the camera, but it cannot detect defects such as dark bands, where the film surface remains substantially flat.
Further, although the severity (level) of a wrinkle depends not only on its shape but also on its degree of unevenness, in the technique of Patent Document 1 the intensity of light received by the camera varies with factors other than the wrinkle's unevenness. The degree of unevenness therefore cannot be read accurately from the image data, which makes it difficult to quantify the defect.
As a result, an operator must still monitor the film whenever a defect other than a wrinkle occurs or a quantitative evaluation of a defect is desired. In particular, when searching for winding conditions under which no defects occur, the defects arising under the current conditions must be evaluated quantitatively.
The present invention has been made in view of the above-described problems of the prior art, and an object of the present invention is to easily detect a defect occurring in a thin film without manual monitoring.
In order to solve the above problem, the invention described in item 1 is a thin film inspection apparatus for optically inspecting a defect generated in a thin film in a stacked state, the thin film inspection apparatus including: a light source that irradiates light onto an inspection area of the film; an optical sensor that detects diffused light among light reflected in an inspection area of the film; and a control section that detects the defect by performing data processing on an output signal of the optical sensor.
The invention described in item 2 is the thin film inspection device described in item 1, wherein the data processing includes image processing on image data obtained from an output signal of the optical sensor and defect determination processing for determining the defect based on the image processed data.
The invention described in item 3 is the thin film inspection apparatus described in item 2, wherein the control unit uses a machine learning model constructed using the data after the image processing as an explanatory variable and a determination result of the defect by a human as a target variable in the defect determination processing.
The invention described in item 4 is the thin film inspection apparatus described in item 2 or 3, wherein the data processing further includes quantitative evaluation processing for quantitatively evaluating the defect based on the data after the image processing.
The invention described in item 5 is the thin film inspection device described in item 4, wherein the control unit uses a machine learning model that is constructed using the data after the image processing as an explanatory variable and using a result of quantitative evaluation of the defect by a human as a target variable in the quantitative evaluation processing.
The invention described in item 6 is the film inspection device described in any one of items 1 to 5, wherein the film is wound in a roll shape, and the film inspection device inspects the defect generated in the process of winding the film.
The invention described in item 7 is the film inspection apparatus described in item 6, wherein the light source uniformly irradiates light in a width direction of the roll film.
The invention described in item 8 is the thin film inspection device described in any one of items 1 to 7, wherein, in the output signal of the optical sensor, a contrast between a signal value corresponding to an irradiated portion of the thin film irradiated with light by the light source and a signal value corresponding to a non-irradiated portion of the thin film not irradiated with light by the light source is a predetermined value or more.
The invention described in item 9 is the thin film inspection device described in any one of items 1 to 8, wherein a reflection prevention plate is provided so that regular reflection light of the light reflected in the inspection region of the thin film does not enter the optical sensor.
The invention described in item 10 is the thin film inspection device described in any one of items 1 to 9, wherein the optical sensor is a black-and-white camera.
The invention described in item 11 is a thin film inspection method for optically inspecting a defect generated in a thin film in a stacked state, the thin film inspection method including: irradiating the inspection area of the thin film with light from a light source; detecting diffused light among light reflected in an inspection area of the film by an optical sensor; and a step of detecting the defect by a control unit by performing data processing on an output signal of the optical sensor.
The invention described in item 12 provides the thin film inspection method described in item 11, wherein the data processing includes image processing on image data obtained from an output signal of the optical sensor and defect determination processing for determining the defect based on the image processed data.
The invention described in item 13 is the thin film inspection method described in item 12, wherein the control unit uses a machine learning model constructed using the data after the image processing as an explanatory variable and a determination result of the defect by a human as a target variable in the defect determination processing.
The invention described in item 14 is the thin film inspection method described in item 12 or 13, wherein the data processing further includes quantitative evaluation processing for quantitatively evaluating the defect based on the data after the image processing.
The invention described in item 15 provides the thin film inspection method described in item 14, wherein the control unit uses, in the quantitative evaluation processing, a machine learning model constructed using the data after the image processing as an explanatory variable and a result of quantitative evaluation of the defect by a human as a target variable.
The invention described in item 16 is the film inspection method described in any one of items 11 to 15, wherein the film is wound in a roll shape, and the defect generated during the winding of the film is an inspection target.
The invention described in item 17 is the thin film inspection method described in item 16, wherein the light source uniformly irradiates light in a width direction of the roll-shaped thin film.
The invention described in item 18 is the thin film inspection method described in any one of items 11 to 17, wherein a contrast between a signal value corresponding to an irradiated portion of the thin film irradiated with light from the light source and a signal value corresponding to a non-irradiated portion of the thin film not irradiated with light from the light source is a predetermined value or more in the output signal of the optical sensor.
The invention described in item 19 is the thin film inspection method described in any one of items 11 to 18, wherein a reflection preventing plate is used so that, of the light reflected in the inspection region of the thin film, regular reflection light does not enter the optical sensor.
The invention described in item 20 is the thin film inspection method described in any one of items 11 to 19, wherein the optical sensor is a black-and-white camera.
The invention described in item 21 is a computer-readable recording medium storing a program for causing a computer of a thin film inspection apparatus for optically inspecting defects generated in thin films stacked on each other to function as a control unit, the thin film inspection apparatus including: a light source that irradiates light onto an inspection area of the film; and an optical sensor that detects diffused light among light reflected in an inspection area of the film, the control section detecting the defect by performing data processing on an output signal of the optical sensor.
The invention described in item 22 is the recording medium described in item 21, wherein the data processing includes image processing on image data obtained from an output signal of the optical sensor and defect determination processing for determining the defect based on the image-processed data.
The invention described in item 23 is the recording medium described in item 22, wherein the control unit uses, in the defect determination processing, a machine learning model constructed using the image-processed data as an explanatory variable and a determination result of the defect by a human as a target variable.
The invention described in item 24 is the recording medium described in item 22 or 23, wherein the data processing further includes quantitative evaluation processing for quantitatively evaluating the defect based on the data after the image processing.
The invention described in item 25 is the recording medium described in item 24, wherein the control unit uses, in the quantitative evaluation processing, a machine learning model constructed using the image-processed data as an explanatory variable and a result of quantitative evaluation of the defect by a human as a target variable.
The invention according to item 26 is the recording medium according to any one of items 21 to 25, wherein the film is wound in a roll shape, and the film inspection apparatus inspects the defect generated in the process of winding the film.
The invention described in item 27 is the recording medium described in item 26, wherein the light source irradiates light uniformly in a width direction of the roll film.
The invention described in item 28 is the recording medium described in any one of items 21 to 27, wherein a contrast between a signal value corresponding to an irradiated portion of the thin film irradiated with light by the light source and a signal value corresponding to a non-irradiated portion of the thin film not irradiated with light by the light source is a predetermined value or more in the output signal of the optical sensor.
The invention described in item 29 is the recording medium described in any one of items 21 to 28, wherein a reflection prevention plate is provided in the thin film inspection device so that regular reflection light of the light reflected in the inspection region of the thin film does not enter the optical sensor.
The invention described in item 30 is the recording medium described in any one of items 21 to 29, wherein the optical sensor is a black-and-white camera.
According to the present invention, it is possible to easily detect a defect generated in a thin film without manual monitoring.
Drawings
Fig. 1 is a diagram showing a functional configuration of a thin film inspection apparatus according to embodiment 1 of the present invention.
Fig. 2 is a diagram for explaining the configuration of the light source and the camera.
Fig. 3 shows an example of arrangement of a light source and a camera in an actual film inspection apparatus.
Fig. 4 is a perspective view showing a positional relationship of the light source and the camera with respect to the rolled film.
Fig. 5 is a diagram for explaining the arrangement of the reflection preventing plate.
Fig. 6 is a flowchart showing a film defect detection process performed by the film inspection apparatus.
Fig. 7 is a flowchart showing data processing.
Fig. 8 is a flowchart showing a thin film defect learning process performed by the thin film inspection apparatus of embodiment 2.
Fig. 9 is a flowchart showing the feature quantity creation process of explanatory variables.
Fig. 10 is a flowchart showing the labeling process of the target variable.
Fig. 11 is an example of a learning data set.
Fig. 12 is an example of a visual evaluation screen.
Fig. 13 is a flowchart showing the 2nd data processing.
Fig. 14 shows an example of the evaluation results obtained by the thin film inspection apparatus according to embodiment 2.
Fig. 15A is an example of feature amounts calculated from image data obtained by imaging a film in a film inspection apparatus.
Fig. 15B shows the result of human visual evaluation (sensory evaluation) of the same image data as in fig. 15A.
Fig. 16 shows an example of the contrast between an irradiated portion and a non-irradiated portion on a film.
Description of the symbols
10: a film inspection device; 11: a light source; 12: a camera; 20: a data processing device; 21: a control unit; 25: a display unit; 26: an operation section; 27: a storage unit; 28: a timer section; 48: a reflection prevention plate; 251: a visual evaluation screen; F: film
Detailed Description
Hereinafter, an embodiment of the thin film inspection apparatus according to the present invention will be described with reference to the drawings. However, the scope of the present invention is not limited to the illustrated examples.
[Embodiment 1]
Fig. 1 shows a functional configuration of a thin film inspection apparatus 10 according to embodiment 1. The film inspection apparatus 10 includes a light source 11, a camera 12 as an optical sensor, and a data processing device 20. The film inspection apparatus 10 optically inspects defects generated in the film F wound in a roll and stacked. The film inspection apparatus 10 inspects a defect generated in the process of winding the long film F.
The material of the film F is not particularly limited, but generally includes polycarbonate resin, polysulfone resin, acrylic resin, polyolefin resin, cyclic olefin resin, polyether resin, polyester resin, polyamide resin, polysulfide resin, unsaturated polyester resin, epoxy resin, melamine resin, phenol resin, diallyl phthalate resin, polyimide resin, polyurethane resin, polyvinyl acetate resin, polyvinyl alcohol resin, styrene resin, cellulose acetate resin, vinyl chloride resin, and the like.
For example, the film F has a thickness of 1 to 1000 μm and a width of 0.1 to 5 m.
Examples of defects in the film F include dark stripes, longitudinal wrinkles, and diagonal wrinkles.
A dark band is a defect in which layers of the film F adhere to each other and appear darker than the surroundings, forming a dark streak in the circumferential direction of the roll. Dark bands are also called black bands or raised bands.
A longitudinal wrinkle is a rib-like deformation running in the circumferential direction of the roll. Longitudinal wrinkles are also referred to as lantern buckling.
An oblique wrinkle is a deformation running obliquely to both the circumferential direction and the width direction of the roll, with pyramid-shaped or chain-like irregularities. Oblique wrinkles are also known as diamond buckling.
The light source 11 irradiates light onto the inspection area of the film F, uniformly in the width direction of the rolled film F (the direction perpendicular to the longitudinal direction of the film F and parallel to the film surface). Here, "uniform" means that the illuminance on the film F is substantially the same across the width direction (the difference between the maximum and minimum values is at or below a predetermined value).
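The uniformity criterion above can be checked directly from a row of illuminance readings taken across the film width. A minimal sketch follows; the numeric threshold is a hypothetical stand-in for the "predetermined value" in the text:

```python
import numpy as np

def illumination_is_uniform(profile_lx, max_deviation_lx=5000.0):
    """Check width-direction uniformity of the light source.

    `profile_lx` is a 1-D sequence of illuminance readings [lx] taken
    across the film width. Illumination counts as uniform when the
    difference between the maximum and minimum readings is at or below
    a predetermined value (the default here is an illustrative choice).
    """
    profile = np.asarray(profile_lx, dtype=float)
    return bool(profile.max() - profile.min() <= max_deviation_lx)
```

For example, a profile of [52000, 51000, 50500] lx passes with a 5000 lx allowance, while one dipping to 40000 lx does not.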
The camera 12 is an optical sensor that optically reads the inspection area of the film F. The camera 12 includes an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and a lens, and is an area sensor that generates 2-dimensional image data from the image sensor's output signals. The camera 12 detects the diffused component of the light irradiated by the light source 11 and reflected in the inspection area of the film F. Here, a monochrome camera is used as the camera 12.
The camera 12 has an imaging range covering the entire width direction of the film F and reads the full width of the film F in a single shot. The camera 12 may detect light in the visible region or in the infrared region.
In addition, in the output signal of the camera 12, it is desirable that the contrast between the signal value corresponding to the irradiated portion of the film F (the portion irradiated with light by the light source 11) and the signal value corresponding to the non-irradiated portion be a predetermined value or more. In other words, it is desirable that only the irradiated portion of the film F appear bright.
Contrast is expressed, for example, as a ratio involving the difference between the two values being compared (here, the signal values of the irradiated and non-irradiated portions); the larger the difference, the higher the contrast. To increase the contrast between the irradiated and non-irradiated portions, it is desirable to use a high-power light source 11 with high straightness.
Here, "high power" means that when the illuminance at an irradiation distance of 50mm is defined as E50, the illuminance E50 is 50000lx or more.
The phrase "high straightness" means that (E50-E100)/E50 <0.5 is satisfied where E50 is the illuminance at an irradiation distance of 50mm and E100 is the illuminance at an irradiation distance of 100 mm.
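The two light-source criteria, together with the contrast measure, reduce to one-line checks. The contrast form below (normalized difference) is one common convention and an assumption here; the text only specifies "a ratio of a difference between the two values, or the like":

```python
def is_high_power(e50_lx):
    """'High power': illuminance E50 at an irradiation distance of
    50 mm is 50,000 lx or more."""
    return e50_lx >= 50000.0

def is_high_straightness(e50_lx, e100_lx):
    """'High straightness': (E50 - E100) / E50 < 0.5, i.e. illuminance
    falls off by less than half between 50 mm and 100 mm."""
    return (e50_lx - e100_lx) / e50_lx < 0.5

def contrast(irradiated_value, non_irradiated_value):
    """Contrast as a normalized difference of the two signal values
    (one convention; the source allows other ratio forms)."""
    return (irradiated_value - non_irradiated_value) / (
        irradiated_value + non_irradiated_value)
```

A source with E50 = 60,000 lx and E100 = 40,000 lx satisfies both criteria, since (60000 - 40000)/60000 ≈ 0.33 < 0.5.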
The data processing device 20 includes a control unit 21, I/Fs (interfaces) 22 and 23, a communication unit 24, a display unit 25, an operation unit 26, a storage unit 27, a timer unit 28, and the like, with the respective units connected by a bus. The data processing device 20 is constituted by a PC (Personal Computer) or the like.
The control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like, and centrally controls the processing operations of each part of the data processing device 20 (thin film inspection apparatus 10). Specifically, the CPU reads out various processing programs stored in the storage unit 27, expands them into the RAM, and executes various processes in cooperation with those programs.
The I/F22 is an interface for connecting to the camera 12, and transmits a control signal to the camera 12 and receives image data obtained by imaging the film F from the camera 12.
The I/F23 is an interface for connecting with the light source 11, and transmits a control signal for controlling the irradiation of light of the light source 11 to the light source 11.
The communication unit 24 includes a network interface or the like, and transmits and receives data to and from external devices connected via a communication network such as a LAN (Local Area Network).
The display unit 25 includes a monitor such as an LCD (Liquid Crystal Display), and displays various screens in accordance with display signals input from the control unit 21.
The operation unit 26 includes a keyboard with cursor keys, character/number input keys, and various function keys, as well as a pointing device such as a mouse, and outputs operation signals generated by key or mouse operations to the control unit 21.
The storage unit 27 is configured by an HDD (Hard Disk Drive), an SSD (Solid State Drive), or the like, and stores various processing programs, data necessary for executing the programs, and the like.
The timer unit 28 has a clock circuit (RTC: Real Time Clock), counts the current date and time with the clock circuit, and outputs them to the control unit 21.
The control unit 21 detects a defect (type, position, and intensity) of the film F by performing data processing on an output signal of the camera 12 (optical sensor).
The data processing includes image processing of the image data obtained from the output signal of the camera 12, defect determination processing that determines a defect from the image-processed data, and quantitative evaluation processing that quantitatively evaluates the defect from the image-processed data.
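As a rough illustration of this three-stage flow, a sketch in Python follows. The stage names follow the text, but the concrete operations (median background subtraction, fixed thresholding, area-based scoring) are illustrative assumptions, not the patented processing:

```python
import numpy as np

def detect_defects(raw_frame, threshold=30.0):
    """Minimal sketch of the three-stage data processing.

    `raw_frame` is a 2-D array from the camera's output signal;
    `threshold` is a hypothetical deviation level, not a value
    from the source.
    """
    # 1. Image processing: convert to float and subtract the median
    #    as a crude background estimate.
    img = np.asarray(raw_frame, dtype=float)
    img -= np.median(img)

    # 2. Defect determination: flag pixels deviating strongly from background.
    defect_mask = np.abs(img) > threshold

    # 3. Quantitative evaluation: score the defect by its affected-area fraction.
    score = float(defect_mask.mean())
    return defect_mask, score
```

On a 10x10 frame of background value 100 with one bright stripe row of 200, this flags the stripe and scores it as 10% of the frame area.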
Next, the arrangement of the light source 11 and the camera 12 will be described with reference to fig. 2.
When observing the regular (specular) reflection of light irradiated from the light source 11 and reflected by the inspection object (here, the film F), what is observed is the luminance of the light source 11 itself. The applicant's study found that, under human observation, defects of the film F are not easily detected in specular reflection.
The same applies to machine vision such as the camera 12. When the camera 12 is placed at position P1, which receives the specular reflection (the luminance component) from the light source 11, the camera essentially images the inspection object with the light source mirrored in it, and this position was found to be unsuitable for defect detection.
The camera 12 is therefore positioned to avoid position P1, which receives the specular reflection of the light emitted from the light source 11. Position P2, near P1, is likewise unsuitable for defect detection; here, "near P1" means the range that may be affected by the specular reflection of the light from the light source 11.
The camera 12 may be placed at any position other than P1 and its vicinity (P2, etc.), that is, at positions P3 to P6 and the like, where it receives the diffused component of the light reflected by the inspection object.
Further, because aberration problems tend to arise when the inspection surface is imaged from a low angle (positions P5, P6, etc.), it is desirable to place the camera 12 where the angle between its imaging direction and the inspection surface is at or above a predetermined value.
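The two placement constraints, staying away from the specular direction and keeping a minimum viewing angle, can be sketched as a simple check. The exclusion band and minimum elevation values below are hypothetical stand-ins for the "predetermined" values in the text:

```python
def camera_placement_ok(camera_angle_deg, specular_angle_deg,
                        exclusion_deg=10.0, min_elevation_deg=20.0):
    """Check a candidate camera angle against the two constraints.

    Angles are measured from the inspected surface. The camera must
    lie outside an exclusion band around the specular direction and
    view the surface from at least a minimum elevation (both limits
    are illustrative assumptions).
    """
    away_from_specular = abs(camera_angle_deg - specular_angle_deg) > exclusion_deg
    high_enough = camera_angle_deg >= min_elevation_deg
    return away_from_specular and high_enough
```

For example, with the specular ray at 30 degrees, a camera at 60 degrees passes, a camera at 35 degrees fails (inside the exclusion band), and a camera at 15 degrees fails (too shallow).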
Next, an example of the arrangement of the light source 11 and the camera 12 in the actual thin film inspection apparatus 10 will be described with reference to fig. 3.
In the film inspection apparatus 10, the camera 12 images the film F while it is being wound into a roll, so the winding radius of the roll differs between the start and the end of winding. Specifically, as shown in fig. 3, the winding radius gradually increases from the roll 31 immediately after the start of winding to the roll 32 at the end of winding. The camera 12 should receive the diffused light stably from the start of winding of the film F to the end.
Therefore, the light source 11 and the camera 12 are arranged so that the inspection area of the roll 31 immediately after the start of winding lies on the extension of both the irradiation direction of the light source 11 and the imaging direction of the camera 12, while the specular reflection of the light from the light source 11 does not enter the camera 12.
Specifically, the camera 12 is not disposed in the NG area 37, which extends at least from the position that receives the regular reflection light 34 (the irradiation light 33 from the light source 11 reflected by the reel 31 immediately after the start of winding) to the position that receives the regular reflection light 36 (the irradiation light 35 from the light source 11 reflected by the reel 32 at the end of winding).
On the other hand, the camera 12 can be disposed in the OK areas 38 and 39 other than the NG area 37.
The optimum arrangement is not limited to that shown in fig. 3; however, by arranging the light source 11 and the camera 12 closer together, the change in the observation position (the position imaged by the camera 12) between the start and end of winding becomes smaller. Arranging the light source 11 and the camera 12 closer together also has the advantage of making the thin film inspection apparatus 10 itself more compact.
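The NG/OK distinction above can be checked computationally: the regular (specular) reflection direction is the mirror image of the incident direction about the surface normal, and a candidate camera viewing direction is unsuitable if it falls within some margin of that direction. The following is a minimal 2-D sketch; the 15-degree margin and the example vectors are illustrative assumptions, not values from the source.

```python
import math

def reflect(d, n):
    """Specular reflection of direction d about unit normal n: r = d - 2(d.n)n."""
    dot = d[0]*n[0] + d[1]*n[1]
    return (d[0] - 2*dot*n[0], d[1] - 2*dot*n[1])

def camera_in_ng_area(cam_dir, incident_dir, normal, margin_deg=15.0):
    """True if the camera's viewing direction lies within margin_deg of
    the regular-reflection direction (i.e. inside the NG area)."""
    r = reflect(incident_dir, normal)
    dot = r[0]*cam_dir[0] + r[1]*cam_dir[1]
    norm = math.hypot(*r) * math.hypot(*cam_dir)
    ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return ang < margin_deg

# Light hits a horizontal surface (normal pointing up) at 45 degrees.
incident = (1.0, -1.0)            # travelling right and downward
normal = (0.0, 1.0)
assert camera_in_ng_area((1.0, 1.0), incident, normal)      # mirror direction: NG
assert not camera_in_ng_area((-1.0, 1.0), incident, normal) # diffuse side: OK
```

In practice the check would be repeated for winding radii from the start roll to the end roll, sweeping out the whole NG area 37.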
Fig. 4 is a perspective view showing a positional relationship between the light source 11 and the camera 12 with respect to the roll film F. The light source 11 and the camera 12 are disposed close to each other, and the roll surface of the film F during winding (the film F in a superposed state) is set as an inspection target.
In addition, a reflection preventing plate may be provided in the film inspection apparatus 10 so that the regular reflection light of the light reflected in the inspection area of the film F does not enter the camera 12.
When the background is reflected on the film surface in the observation range of the camera 12, the accuracy of defect determination is lowered.
As shown in fig. 5, the winding radius is smallest at the roll 41 immediately after the start of winding and grows until the roll 42 at the end of winding, so the region affected by regular reflection (the region where light regularly reflected on the film surface enters the camera 12) is largest immediately after the start of winding.
The color of the reflection preventing plate 48 is preferably black. As its material, general plastics (polypropylene, acrylic resin, polyethylene terephthalate, polycarbonate, and the like), rubber, felt, synthetic fiber, paper, and the like may be used.
Next, the operation of the thin film inspection apparatus 10 will be described.
Fig. 6 is a flowchart illustrating a thin film defect detection process performed by the thin film inspection apparatus 10. The film defect detection processing is processing for detecting a defect from the film F in a state of being overlapped during winding, and is realized by software processing based on cooperation of the control unit 21 and a program stored in the storage unit 27. The film F is wound up so that the transport speed (linear speed) of the film F in the longitudinal direction thereof is constant.
First, the control unit 21 controls the light source 11 via the I/F23 to irradiate the inspection area of the film F with light (step S1).
Next, the control unit 21 controls the camera 12 via the I/F22 to photograph the inspection area of the film F (step S2). The camera 12 detects diffused light among light reflected in the inspection area of the film F.
Next, the control unit 21 acquires the 2-dimensional image data generated by the camera 12 via the I/F22 (step S3), and stores the acquired image data in the storage unit 27 (step S4). Specifically, the control unit 21 acquires the current date and time from the time counting unit 28 as the inspection date and time, and stores the image data in the storage unit 27 in association with the inspection date and time. Since the image data obtained in one imaging operation covers only a portion of the circumference of the wound film F, the inspection date and time associated with the image data corresponds to a position in the longitudinal direction of the film F.
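Because the transport (linear) speed of the film F is held constant, the correspondence between inspection date/time and longitudinal position is a simple linear mapping. A minimal sketch, in which the 0.5 m/s transport speed is an illustrative assumption:

```python
from datetime import datetime, timedelta

def longitudinal_position(inspection_time, start_time, linear_speed_m_per_s):
    """Map an inspection date/time to a position (m) along the film;
    valid because the transport (linear) speed is held constant."""
    elapsed = (inspection_time - start_time).total_seconds()
    return linear_speed_m_per_s * elapsed

t0 = datetime(2022, 7, 22, 9, 0, 0)
# 0.5 m/s transport speed (illustrative value, not from the source)
pos = longitudinal_position(t0 + timedelta(minutes=10), t0, 0.5)
assert pos == 300.0   # 600 s at 0.5 m/s
```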
Next, the control unit 21 performs data processing on the image data acquired from the camera 12 (step S5).
Here, data processing will be described with reference to fig. 7.
The control section 21 divides the image data into a plurality of regions in the width direction of the film F (step S11). Specifically, the control unit 21 divides the image data into n regions a1 to an.
Next, the control unit 21 acquires the image data of the area a1 (step S12), and performs mathematical processing on the image data of the area a1 (step S13). Appropriate mathematical processing is prepared according to the type of defect (dark band, longitudinal wrinkle, oblique wrinkle, etc.) to be detected.
The mathematical processing includes preprocessing, emphasis processing, signal processing, image feature extraction, and the like.
Examples of the preprocessing include image cropping, low-pass filtering, high-pass filtering, Gaussian filtering, median filtering, bilateral filtering, morphological transformation, color transformation (L*a*b*, sRGB, HSV, and HSL), contrast adjustment, noise removal, blur/shake image restoration, mask processing, Hough transformation, and projective transformation.
Examples of the emphasis processing include Sobel filtering, Scharr filtering, Laplacian filtering, Gabor filtering, the Canny algorithm, and the like.
Examples of the signal processing include computation of basic statistics (maximum, minimum, mean, median, standard deviation, variance, quartiles), squares and square roots, differences, sums, products, ratios, distance matrices, differentiation and integration, threshold processing (binarization, adaptive binarization, etc.), Fourier transform, wavelet transform, peak detection (peak value, peak count, half-value width, etc.), and the like.
The image feature extraction includes template matching, SIFT feature extraction, and the like.
The mathematical processing of step S13 corresponds to "image processing" for the image data.
Specifically, for detection of dark bands, processing such as low-pass filtering, Gaussian filtering, median filtering, morphological transformation, and mask processing is used as preprocessing, and basic statistics and threshold processing are used as signal processing.
For detection of longitudinal wrinkles and oblique wrinkles, high-pass filtering is used as preprocessing, Sobel filtering and Gabor filtering as emphasis processing, and square roots, Fourier transform, basic statistics, and the like as signal processing.
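As a rough illustration of the dark-band route described above, the sketch below low-pass filters a column-wise brightness profile with a moving average (standing in for the Gaussian/median filtering named in the text) and uses a basic statistic, the depth of the darkest dip below the mean, as the feature value. The kernel size, threshold implied by the test, and synthetic image are illustrative assumptions.

```python
import numpy as np

def dark_band_feature(region, kernel_size=5):
    """Feature for dark-band detection: depth of the darkest dip in the
    low-pass-filtered, column-wise mean brightness profile."""
    profile = region.astype(float).mean(axis=0)      # brightness across the width
    pad = kernel_size // 2
    padded = np.pad(profile, pad, mode="edge")       # avoid edge artifacts
    kernel = np.ones(kernel_size) / kernel_size
    smooth = np.convolve(padded, kernel, mode="valid")  # moving-average low-pass
    return float(smooth.mean() - smooth.min())

# Synthetic region: uniform brightness 200 with a darker vertical band.
region = np.full((20, 60), 200.0)
region[:, 25:30] = 150.0
assert dark_band_feature(region) > 20.0                     # band present
assert dark_band_feature(np.full((20, 60), 200.0)) < 1e-6   # no band
```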
Next, the control unit 21 performs threshold processing on the value (feature quantity) obtained by the mathematical processing of the image data of the region a1 (step S14). The threshold processing determines, based on one or more threshold values predetermined for each type of defect to be detected, whether or not a defect is a detection target, and also determines the level (intensity) of the defect.
In the threshold processing of step S14, the determination of the presence of a defect and the type of defect correspond to "defect determination processing". In the threshold processing of step S14, classifying defects into a plurality of levels according to the threshold corresponds to "quantitative evaluation processing".
For example, for a parameter (feature quantity) taking values from 1 to 100, defects are classified into a plurality of levels: level 1 for parameter values of 1 to 10, level 2 for 11 to 30, level 3 for 31 to 60, and level 4 for 61 to 100.
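The level classification in this example can be sketched directly:

```python
def classify_level(value):
    """Level classification from the example above: a feature value of
    1-10 is level 1, 11-30 level 2, 31-60 level 3, 61-100 level 4."""
    for level, upper in enumerate((10, 30, 60, 100), start=1):
        if value <= upper:
            return level
    raise ValueError("feature value out of the 1-100 range")

assert classify_level(5) == 1
assert classify_level(11) == 2
assert classify_level(60) == 3
assert classify_level(61) == 4
```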
The same processing is performed for the regions other than the region a1. For example, the processing of steps S15 to S17 performed on the image data of the area an is the same as the processing of steps S12 to S14.
After the processing for each of the areas a1 to an, the control unit 21 summarizes the results for each of the areas a1 to an (step S18), and the data processing ends. Specifically, the control unit 21 generates data in which the type and the grade of the detected defect are associated for each region (each position in the width direction of the film F).
In the data processing shown in fig. 7, in addition to the mathematical processing (steps S13 and S16) performed on the image data of each of the regions a1 to an divided in the width direction of the film F, mathematical processing may also be performed on the entire image data prior to the region division of step S11.
After the data processing, returning to fig. 6, the control unit 21 stores the processing result of the data processing in the storage unit 27 and displays the processing result on the display unit 25 (step S6). The control unit 21 stores the processing results (the types and levels of defects detected in the respective regions divided in the width direction) obtained by the data processing for the image data stored in step S4 in the storage unit 27 in association with the inspection date and time corresponding to the image data.
Thus, the film defect detection processing is ended.
The thin film defect detection processing shown in fig. 6 covers the inspection range corresponding to one imaging operation. As the wound film F moves, the inspection area in the longitudinal direction of the film F changes, and the film defect detection processing is repeated, thereby obtaining inspection results at each position in the longitudinal direction of the film F.
In fig. 6, the data processing (see fig. 7) is performed after every single imaging operation, but the data processing may instead be performed collectively after storing image data obtained over a plurality of imaging operations.
As described above, according to embodiment 1, since the output signal obtained by detecting the diffused light reflected in the inspection area of the film F is subjected to data processing, a defect generated in the film F can be detected easily without manual monitoring. By using the diffused light, it is possible to detect differences in brightness between film surfaces having no substantial unevenness, such as dark bands, and thus to detect types of defects that have conventionally been difficult to detect.
Specifically, the defect can be detected by image processing for image data obtained from an output signal of the camera 12 (optical sensor) and defect determination processing for determining a defect from the data after the image processing.
In addition, by quantitatively evaluating the defects based on the data after the image processing, the defects can be classified into a plurality of levels.
For example, by using mathematical processing (image processing) according to the type of the defect to be detected, the defect can be detected for each type of dark band, vertical wrinkle, oblique wrinkle, and the like. Among defects such as wrinkles, longitudinal wrinkles and diagonal wrinkles can be detected separately.
Further, by performing threshold processing on the value (feature value) obtained by the mathematical processing, it is possible to determine whether or not the defect is a detection target, and detect the level (intensity) of the defect.
In the conventional defect-shape detection using regular reflection light, the shape of an abnormal reflection region can be captured, but because highly directional light is detected, the intensity of the detected light changes due to shapes other than the irregularities of wrinkles, movement during winding, and the like.
On the other hand, in the film inspection apparatus 10 using the diffused light, since light having low directivity is detected, it is less likely to be affected by shapes other than irregularities of wrinkles or movement during winding, and an image close to a state when a person evaluates it can be obtained. By combining image processing (mathematical processing) on image data obtained by detecting diffused light, it is possible to extract a numerical value (feature quantity) having a high correlation with an evaluation index of a person, and as a result, it is possible to quantitatively evaluate a defect.
In addition, under actual production conditions, the types and intensities of defects generated while winding the film F can be ascertained, making it possible to search for conditions under which defects do not occur.
Further, since the light source 11 uniformly irradiates light in the width direction of the rolled film F, it is possible to suppress variations in evaluation in the width direction and improve the defect detection accuracy.
In addition, in the output signal of the camera 12 (optical sensor), when the contrast between the signal value corresponding to the irradiation portion on the film F and the signal value corresponding to the non-irradiation portion becomes large, the difference between the data after image processing obtained in the normal portion and the abnormal portion (defect) in the irradiation portion becomes large, and the accuracy of defect determination improves. In addition, when the contrast between the irradiated portion and the non-irradiated portion is increased, the effect of reducing the reflection in the surroundings is also obtained, which contributes to an improvement in the accuracy of defect determination.
By using a strong, highly linear light source 11 to make the contrast between the irradiated and non-irradiated portions of the film F equal to or greater than a predetermined value, and by using image data obtained by detecting the diffused light of the irradiated portion, defects that cannot be observed well under ambient light or the like can be detected.
In addition, the thin film inspection apparatus 10 is provided with the reflection preventing plate 48 so that the regular reflection light of the light reflected in the inspection region of the thin film F does not enter the camera 12, thereby preventing the reduction of the detection accuracy of the defect due to the reflection to the inspection region.
Further, by using a monochrome camera as the camera 12, the data amount of the image data can be reduced while ensuring the defect detection accuracy.
[Embodiment 2]
Next, embodiment 2 to which the present invention is applied will be explained.
The thin film inspection apparatus according to embodiment 2 has the same configuration as the thin film inspection apparatus 10 according to embodiment 1; fig. 1 is therefore referred to, and illustration and description of the configuration are omitted. Hereinafter, the characteristic configuration and processing of embodiment 2 will be described.
In embodiment 2, the control unit 21 of the thin-film inspection apparatus 10 uses the learning result by machine learning when detecting a defect from the image data of the inspection area captured by the camera 12.
The control section 21 detects a defect of the film F by performing data processing on the output signal of the camera 12.
The data processing includes image processing for image data obtained from the output signal of the camera 12, defect determination processing for determining a defect from the image-processed data, and quantitative evaluation processing for quantitatively evaluating a defect from the image-processed data.
The control unit 21 uses, in the defect determination processing, a machine learning model constructed with the image-processed data as explanatory variables and the result of defect determination by a human as the target variable.
The control unit 21 uses, in the quantitative evaluation processing, a machine learning model constructed with the image-processed data as explanatory variables and the result of quantitative evaluation of defects by a human as the target variable. The control unit 21 classifies defects into a plurality of levels by quantitatively evaluating them from the image-processed data using the machine learning result.
In practice, the defect determination processing and the quantitative evaluation processing need not be executed separately; they can be performed together using the result of machine learning that takes the image-processed data as input data and a combination of the type and level of a defect evaluated by a human as output data.
As the algorithm of the machine learning model, a classification-type model is preferably used, because the result of defect determination by a human is a nominal scale. Since the result of quantitative evaluation of defects by a human is an ordinal scale, a classification-type model is likewise preferable. Specifically, machine learning models such as generalized linear models (logistic, probit, Poisson), discriminant analysis, canonical discriminant analysis, the k-nearest neighbor method, Gaussian processes, naive Bayes, decision trees, ensemble learning (AdaBoost, GBM, random forest), SVM, and neural networks can be used.
Further, by converting the nominal-scale and ordinal-scale evaluation results into One-hot vectors, a regression-type machine learning algorithm can also be used. Specifically, machine learning models such as linear models (multiple regression, lasso, ridge, Elastic-Net, and the like), generalized linear models (logistic, probit, Poisson), PLS, the k-nearest neighbor method, Gaussian processes, decision trees, ensemble learning (AdaBoost, GBM, random forest), SVR, and neural networks can be used.
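The One-hot conversion mentioned above can be sketched as follows; the defect-type list mirrors the labels used elsewhere in this description.

```python
def one_hot(label, categories):
    """Convert a nominal/ordinal label into a One-hot vector so that a
    regression-type model can be trained on it."""
    if label not in categories:
        raise ValueError(f"unknown label: {label}")
    return [1.0 if c == label else 0.0 for c in categories]

defect_types = ["no defect", "dark band", "longitudinal wrinkle", "oblique wrinkle"]
assert one_hot("dark band", defect_types) == [0.0, 1.0, 0.0, 0.0]
```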
Before training the machine learning model, preprocessing such as dimensionality reduction (principal component analysis, factor analysis, multidimensional scaling, canonical correlation analysis), variable transformation (kernel methods, interaction terms, and the like), and feature selection (regularization, AIC, BIC, random forest, and the like) can be applied as appropriate.
Next, the operation of the thin film inspection apparatus 10 according to embodiment 2 will be described.
Fig. 8 is a flowchart showing the thin film defect learning process performed by the thin film inspection apparatus 10. The film defect learning process is a process of machine learning a task of detecting a defect from image data of the film F, and is realized by software processing based on cooperation of the control unit 21 and a program stored in the storage unit 27.
The processing of steps S21 to S24 is the same as the processing of steps S1 to S4 of the thin film defect detection processing (see fig. 6), and therefore, the description thereof is omitted. In step S24, the control unit 21 stores the image data in the storage unit 27 in association with the date and time of the examination.
After the process of step S24, the control unit 21 performs a feature quantity creation process of an explanatory variable (step S25) and a labeling process of a target variable (step S26) on the image data acquired from the camera 12.
Here, the feature quantity creation process of the explanatory variable will be described with reference to fig. 9. This processing is processing for creating a feature amount used as an explanatory variable (input data) in machine learning.
The control section 21 divides the image data into a plurality of regions in the width direction of the film F (step S31). Specifically, the control unit 21 divides the image data into n regions a1 to an.
The processing in steps S32 and S33 is the same as the processing in steps S12 and S13 of the data processing (see fig. 7), and therefore, the description thereof is omitted. In the mathematical processing of step S33, the value obtained from the image data is a feature amount. In step S33, appropriate mathematical processing is performed for each type of defect to be detected.
The same processing is performed for the regions other than the region a1. For example, the processing of step S34 and step S35 performed on the image data of the area an is the same as the processing of step S32 and step S33.
After the processing for each of the areas a1 to an, the control unit 21 summarizes the results for each of the areas a1 to an (step S36), and the feature quantity creation processing for the explanatory variables is ended. Specifically, the control unit 21 stores the respective feature amounts calculated for the respective regions (the respective positions in the width direction of the film F) in the storage unit 27 in association with the inspection dates and times corresponding to the image data of the processing target.
In the feature quantity creation process of explanatory variables shown in fig. 9, in addition to the mathematical processing (steps S33 and S35) performed on the image data of each of the regions a1 to an divided in the width direction of the film F, mathematical processing may also be performed on the entire image data prior to the region division of step S31.
Next, the labeling process of the objective variable is explained with reference to fig. 10. This process is a process of giving a label that is used as a target variable (output data) in machine learning.
The control section 21 divides the image data into a plurality of regions in the width direction of the film F (step S41). Specifically, the control unit 21 divides the image data into n regions a1 to an.
Next, the control unit 21 acquires image data of the area a1 (step S42).
Next, the control unit 21 causes the display unit 25 to display the image data of the area a1, and accepts tagging of the image data of the area a1 by an operation from the operation unit 26 (step S43). The user inputs the type and level of the defect in the area a1 by operating the operation unit 26.
The same process is performed for the regions other than the region a1. For example, the processing of step S44 and step S45 performed on the image data of the area an is the same as the processing of step S42 and step S43.
After the processing for each of the areas a1 to an, the control unit 21 summarizes the results for each of the areas a1 to an (step S46), and the labeling processing of the target variable ends. Specifically, the control unit 21 stores the type and level of the defect input by the user for each region (each position in the width direction of the film F) in the storage unit 27 in association with the inspection date and time corresponding to the image data to be processed.
After the feature quantity creation processing of the explanatory variables (step S25) and the labeling processing of the target variables (step S26), the process returns to fig. 8, and the control unit 21 determines whether or not the collection of learning data is complete (step S27). For example, the control unit 21 may prompt the user to input whether the collection of learning data is complete, or may automatically determine whether a predetermined amount or more of learning data has accumulated.
If the collection of the data for learning is not completed (step S27; no), the process returns to step S21, and the process is repeated with a new inspection area and a new reel as targets.
When the collection of the learning data is completed in step S27 (step S27; yes), the control unit 21 creates a machine learning model using the stored data (step S28). Specifically, the control unit 21 performs machine learning using the feature amounts and labeling results associated with the respective positions (inspection date and time, regions in the width direction) of the film F. That is, the control unit 21 performs machine learning by using the feature amount (data after image processing) obtained by mathematical processing as input data and using the determination result (type of defect) based on human defects and the quantitative evaluation result (level of defect) as output data.
Next, the control unit 21 stores the created machine learning model in the storage unit 27 (step S29).
The film defect learning process is ended as described above.
Fig. 11 shows an example of the learning data set used in step S28. For each inspection date and time t1, t2, … (corresponding to positions in the longitudinal direction of the film F), the feature quantities 1 to k obtained by the mathematical processing of the regions a1 to an are associated with a label (the defect determination result). As defect determination results, labels such as the defect types "dark band", "longitudinal wrinkle", "diagonal wrinkle", and "no defect" are recorded.
In fig. 11, only the defect determination result (type of defect) is used as a label, but a quantitative evaluation result (grade of defect) may be included.
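As a minimal sketch of learning from a data set shaped like fig. 11, the following implements the k-nearest-neighbor method, one of the classification models listed above, in plain Python. The rows use only two feature quantities, and all feature values are invented for illustration.

```python
import math

# Learning data set shaped like fig. 11: one row per inspection date/time
# and region, pairing feature quantities with a human-assigned label.
rows = [
    ([0.1, 0.2], "no defect"),
    ([0.2, 0.1], "no defect"),
    ([5.1, 4.8], "dark band"),
    ([4.9, 5.3], "dark band"),
    ([9.8, 0.3], "oblique wrinkle"),
    ([10.1, 0.2], "oblique wrinkle"),
]

def knn_predict(features, training_rows, k=3):
    """Classify by majority vote among the k training rows whose
    feature vectors are closest (Euclidean distance)."""
    nearest = sorted(training_rows, key=lambda r: math.dist(r[0], features))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

assert knn_predict([5.0, 5.0], rows) == "dark band"
assert knn_predict([0.15, 0.15], rows) == "no defect"
```

In production, a label for the defect level could be predicted the same way, or the type and level could be combined into one output label as the text describes.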
In the film defect learning process, for simplicity of explanation, labeling is performed for each width-direction region every time one imaging operation is performed (steps S43 and S45); however, labeling need not be performed every time and only needs to be completed before the machine learning model is created in step S28. In practice, it is more realistic and efficient for a person to label image data from a plurality of shots or a plurality of regions at a time.
Fig. 12 shows an example of the visual evaluation screen 251 displayed on the display unit 25 when labeling is performed collectively. The visual evaluation screen 251 is a screen for recording visual evaluation of the film F for the number of revolutions of the roll 1. The visual evaluation screen 251 includes a widthwise position field 51, a label input field 52, and a composite image field 53.
In the width direction position column 51, the position in the width direction of the film F is displayed. Here, the position in the width direction of the film F is divided every 5 cm.
In the label input field 52, labels (defect determination result and quantitative evaluation result) for the composite images displayed in the composite image field 53 are input. For example, the label input field 52 inputs "dark band", "longitudinal wrinkle", "diagonal wrinkle", "no defect", and the like, and also inputs the level of each defect.
In the synthesized image column 53, a composite image corresponding to each position in the width direction (each divided width-direction region) is displayed. The composite image is obtained as follows: the image data obtained by a plurality of imaging operations (here, 16) are each divided into a plurality of regions in the width direction to generate divided image data, and the divided image data at the same width-direction position are combined over one circumference of the roll (16 pieces) in time order (inspection date and time, i.e. the winding direction of the film F). In the example shown in fig. 12, the composite image obtained from 16 shots is labeled for each width-direction position.
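The construction of the composite image can be sketched as follows, assuming each shot is a 2-D brightness array; the image sizes and pixel values are illustrative.

```python
import numpy as np

def composite_for_region(shots, region_index, n_regions):
    """Build a composite image as in fig. 12: slice each shot into
    n_regions strips across the width, then stack the strips for one
    width-direction position over successive shots (one roll revolution)."""
    strips = []
    for img in shots:
        step = img.shape[1] // n_regions
        strips.append(img[:, region_index*step:(region_index+1)*step])
    return np.vstack(strips)   # stacked along the winding direction

# 16 shots of a 4x12 image, divided into 3 width-direction regions.
shots = [np.full((4, 12), i) for i in range(16)]
comp = composite_for_region(shots, region_index=1, n_regions=3)
assert comp.shape == (64, 4)   # 16 shots x 4 rows each, strip width 12//3 = 4
```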
In the film inspection apparatus 10 according to embodiment 2, a learned machine learning model (learned model) created by a film defect learning process (see fig. 8) is used when detecting a defect from the film F during winding.
The thin film defect detection processing of embodiment 2 is substantially the same as that of embodiment 1 shown in fig. 6, except that the data processing of step S5 is replaced with the 2nd data processing of embodiment 2 shown in fig. 13.
The processing of steps S51 to S53 is the same as the processing of steps S11 to S13 in the data processing (see fig. 7), and therefore, the description thereof is omitted.
Next, the control unit 21 detects a defect using the learned model based on the value (feature value) obtained by the mathematical processing on the image data of the region a1 (step S54). Specifically, the control unit 21 inputs the feature value obtained from the image data of the region a1 to the machine learning model, and acquires the type and level of the defect output from the machine learning model.
The processing in step S54 corresponds to "defect determination processing" and "quantitative evaluation processing".
The same processing is performed for the regions other than the region a1. For example, the processing of steps S55 to S57 performed on the image data of the area an is the same as the processing of steps S52 to S54.
After the processing for each of the areas a1 to an, the control unit 21 summarizes the results for each of the areas a1 to an (step S58), and the 2nd data processing ends. Specifically, the control unit 21 generates data in which the type and the level of the detected defect are associated for each region (each position in the width direction of the film F).
In the 2nd data processing shown in fig. 13, in addition to the mathematical processing (steps S53 and S56) performed on the image data of each of the regions a1 to an divided in the width direction of the film F, mathematical processing may also be performed on the entire image data prior to the region division of step S51.
In the film defect detection processing of embodiment 2, the inspection area in the longitudinal direction of the film F is changed in accordance with the movement of the wound film F, and the film defect detection processing is repeated to acquire the inspection results at each position in the longitudinal direction of the film F.
Further, the 2nd data processing may be performed collectively after storing the image data obtained by a plurality of imaging operations.
Fig. 14 shows an example of evaluation results obtained by the thin film inspection apparatus 10 according to embodiment 2. For example, the control unit 21 causes the display unit 25 to display the evaluation result.
In the evaluation results of fig. 14, the horizontal axis represents the position of the film F in the width direction, and the vertical axis represents the date and time of inspection. Since the film inspection apparatus 10 inspects the wound film F for defects, the inspection date and time corresponds to the position of the film F in the longitudinal direction.
In the evaluation results, characters corresponding to the type and level of each detected defect (b1 to b4, c1 to c4, and the like) are placed at positions corresponding to the width-direction position of the film F where the defect was detected and to the inspection date and time. Here, b1 to b4 represent dark bands, with a smaller number indicating a stronger defect; c1 to c4 represent oblique wrinkles, likewise with a smaller number indicating a stronger defect.
As described above, according to embodiment 2, as in embodiment 1, since data processing is performed on an output signal obtained by detecting diffused light reflected in the inspection area of the film F, it is possible to easily detect a defect occurring in the film F without manual monitoring.
In addition, by constructing a machine learning model using the data after the image processing as explanatory variables and the determination result based on the human defect as a target variable, the defect can be determined using the machine learning result (learned model) in the defect determination processing.
In addition, by constructing a machine learning model using the data after the image processing as explanatory variables and using the result of quantitative evaluation based on human defects as a target variable, it is possible to quantitatively evaluate defects using the result of machine learning (learned model) in the quantitative evaluation process.
In addition, even when an unknown defect occurs in the film F, a defect that has not previously been recognized can be detected automatically by performing image processing on image data containing the unknown defect, using the processed data as explanatory variables, and retraining the model with the result of a human defect determination of the unknown defect as the target variable.
In steps S43 and S45, the image data displayed on the display unit 25 is visually evaluated (labeled), but the film F itself may be observed during winding and labeled.
< comparison of characteristic amount calculation result with sensory evaluation >
Fig. 15A is an example of feature amounts calculated from image data obtained by imaging the film F in the film inspection apparatus 10 according to embodiment 1 or 2. Specifically, in fig. 15A, while the film F is wound, image data obtained by imaging a certain region in the width direction of the film F with the camera 12 is subjected to a Fast Fourier Transform (FFT), and the average amplitude at a specific frequency is plotted against the position of the film F in the longitudinal direction (roll length).
Fig. 15B shows the result of human visual evaluation (sensory evaluation) of the same image data as in fig. 15A. In fig. 15B, the sensory-evaluation level of the oblique wrinkles is plotted against the position (roll length) of the film F in the longitudinal direction. According to fig. 15B, level-4 oblique wrinkles (c4) occur in the range of 100 m to 275 m of the roll length of the film F, and level-3 oblique wrinkles (c3) occur in the range of 300 m to 400 m.
Correspondingly, in fig. 15A the intensity at the specific frequency increases gradually over the roll-length range of 50 m to 400 m, matching the sensory evaluation result (the change from c4 to c3) in fig. 15B. That is, the intensity at a specific frequency obtained by FFT is suitable both as a feature amount for detecting oblique wrinkles and as a feature amount for estimating the intensity (level) of the defect.
As can be seen from fig. 15A and 15B, the tendency of the evaluation using the feature amount calculated from the image data matches the tendency of the evaluation by the person.
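The FFT-based feature amount of fig. 15A can be illustrated as follows: the amplitude of one discrete Fourier component of a one-dimensional brightness profile is taken as the intensity at a specific frequency. The profile, the frequency index k, and the use of a plain DFT sum (rather than a library FFT) are assumptions made to keep the sketch self-contained.

```python
import cmath
import math

def amplitude_at(signal, k):
    """Amplitude of the k-th DFT component of a 1-D brightness profile."""
    n = len(signal)
    s = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
            for i, x in enumerate(signal))
    return abs(s) / n

# Hypothetical brightness profile: a constant background of 100 plus a
# 20-unit oscillation with period n/k pixels, mimicking the periodic
# shading that oblique wrinkles produce in a scan line.
n, k = 64, 8
profile = [100 + 20 * math.cos(2 * math.pi * k * i / n) for i in range(n)]

feature = amplitude_at(profile, k)  # large only when the wrinkle period matches
```

For this profile the feature comes out as half the oscillation amplitude (10), while off-frequency bins stay near zero; plotting such a feature against roll length yields a curve like fig. 15A.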
< contrast between irradiated part and non-irradiated part >
Next, an example of a method of calculating the contrast between the irradiated portion and the non-irradiated portion on the film F will be described.
In the image data obtained by imaging the film F with the camera 12, the signal value (brightness) of the irradiated portion irradiated with light from the light source 11 is L1, and the signal value of the non-irradiated portion not irradiated with light from the light source 11 is L2.
Fig. 16 shows, measured while changing the conditions of the light source 11: the average value L1_mean, minimum value L1_min, and maximum value L1_max of the signal value L1 in the irradiated portion; the average value L2_mean, minimum value L2_min, and maximum value L2_max of the signal value L2 in the non-irradiated portion; and the contrast L1_mean/L2_mean.
Linear light sources 1 and 2 are light sources with high linearity and are suited to irradiating only the portion to be observed with light; linear light source 1 has higher linearity than linear light source 2.
In data numbers 1 to 3, the same linear light source 1 was used, but the number and intensity of light sources were changed to adjust the contrast.
In data numbers 4 and 5, the same linear light source 2 was used, but the number and intensity of light sources were changed to adjust the contrast.
A diffuse light source also irradiates parts other than the portion to be observed, but even with a diffuse light source, the manner of irradiation can be made to differ between the irradiated and non-irradiated portions as long as the light source is reasonably close to the film F.
With ambient light alone (the conventional case), the average value L2_mean of the signal value L2 in the non-irradiated portion turned out to be larger than the average value L1_mean of the signal value L1 in the irradiated portion, so the light could not be confined to the portion to be observed.
As shown in fig. 16, when the ratio L1_mean/L2_mean of the average signal value in the irradiated portion to the average signal value in the non-irradiated portion is used as the contrast, the contrast is preferably 1.1 or more, more preferably 1.8 or more, and still more preferably 2.5 or more.
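A minimal sketch of this contrast calculation: the ratio L1_mean/L2_mean of the mean signal in the irradiated portion to the mean signal in the non-irradiated portion, checked against the 1.1 threshold given in the text. The pixel values are made up for illustration.

```python
# Sketch of the contrast measure in the text: L1_mean / L2_mean, the
# ratio of the mean signal in the irradiated portion to the mean signal
# in the non-irradiated portion. Pixel values below are illustrative.

def mean(values):
    return sum(values) / len(values)

def contrast(l1_pixels, l2_pixels):
    """Return L1_mean / L2_mean for one captured image."""
    return mean(l1_pixels) / mean(l2_pixels)

irradiated = [180, 200, 190, 210]    # signal L1 inside the lit stripe
non_irradiated = [60, 70, 65, 75]    # signal L2 outside the stripe

c = contrast(irradiated, non_irradiated)
sufficient = c >= 1.1  # minimum preferred contrast stated in the text
```

In practice the two pixel sets would come from masks over the irradiated and non-irradiated regions of the camera image.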
The above description of the respective embodiments is an example of the thin film inspection apparatus according to the present invention, and is not limited thereto. The detailed configuration and detailed operation of each part constituting the apparatus can be appropriately changed without departing from the scope of the present invention.
For example, in each of the above embodiments, defects generated when the film F is wound into a roll are detected, but defects generated when films F cut to a predetermined size are stacked may be detected instead.
In the above embodiments, the case where the monochrome camera is used as the camera 12 has been described, but the camera 12 may be a color camera.
In the above embodiments, the case where the area sensor (camera 12) is used as the optical sensor has been described, but a line sensor may be used.
In the data processing shown in fig. 7 or the second data processing shown in fig. 13, the quantitative evaluation processing may be omitted. That is, when image processing is performed on the image data obtained from the output signal of the camera 12 and a defect is determined from the image-processed data, the intensity of the defect need not be obtained.
In the above embodiments, the entire region of the film F in the longitudinal direction may be used sequentially as the inspection region, or inspection regions may be set at predetermined intervals in the longitudinal direction of the film F to detect defects.
In addition to defects that occur in the film F during winding, there are defects that do not occur during winding but arise during the transition to the next step (while the film F is moved or stored) and are first recognized when the film F is unwound from the roll.
By performing machine learning with the image-processed data obtained at winding time as the explanatory variable and the result of defect determination by a person at unwinding time in the next step as the target variable, a sign of a defect that will occur during the transition to the next step can be detected. Further, by using the result of quantitative evaluation of defects by a person at unwinding time as the target variable, the level of the defect generated during the transition to the next step can also be predicted.
The computer-readable medium storing the program for executing each process is not limited to the above example, and a portable recording medium can be applied. In addition, as a medium for supplying data of the program via a communication line, a carrier wave (carrier frequency) may also be applied.
Claims (30)
1. A thin film inspection apparatus for optically inspecting a defect generated in a thin film in a stacked state, the thin film inspection apparatus comprising:
a light source that irradiates light onto an inspection area of the film;
an optical sensor that detects diffused light among light reflected in an inspection area of the film; and
a control unit that detects the defect by performing data processing on the output signal of the optical sensor.
2. The thin film inspection apparatus according to claim 1,
the data processing includes image processing for image data obtained from an output signal of the optical sensor and defect determination processing for determining the defect from the image-processed data.
3. The thin film inspection apparatus according to claim 2,
the control unit uses a machine learning model constructed using the image-processed data as an explanatory variable and using a result of determination of the defect by a person as a target variable in the defect determination process.
4. The thin film inspection apparatus according to claim 2 or 3,
the data processing further comprises quantitative evaluation processing for quantitatively evaluating the defects according to the data after the image processing.
5. The thin film inspection apparatus according to claim 4,
the control unit uses a machine learning model constructed using the data after the image processing as an explanatory variable and a result of quantitative evaluation based on the defect of the person as a target variable in the quantitative evaluation processing.
6. The thin film inspection apparatus according to any one of claims 1 to 5,
the film is wound into a roll shape,
the film inspection apparatus inspects the defect generated in the winding process of the film.
7. The thin film inspection apparatus according to claim 6,
the light source uniformly irradiates light in the width direction of the roll-shaped film.
8. The thin film inspection apparatus according to any one of claims 1 to 7,
in the output signal of the optical sensor, a contrast between a signal value corresponding to an irradiated portion of the thin film irradiated with light by the light source and a signal value corresponding to a non-irradiated portion of the thin film not irradiated with light by the light source is a predetermined value or more.
9. The thin film inspection apparatus according to any one of claims 1 to 8,
a reflection preventing plate is provided so that regular reflection light among light reflected in the inspection area of the film does not enter the optical sensor.
10. The thin film inspection apparatus according to any one of claims 1 to 9,
the optical sensor is a black and white camera.
11. A thin film inspection method of optically inspecting a defect generated in a thin film in a stacked state, the thin film inspection method comprising:
irradiating the inspection area of the thin film with light from a light source;
detecting diffused light among light reflected in an inspection area of the film by an optical sensor; and
a step of detecting, by a control unit, the defect by performing data processing on the output signal of the optical sensor.
12. The thin film inspection method according to claim 11,
the data processing includes image processing for image data obtained from an output signal of the optical sensor and defect determination processing for determining the defect from the image-processed data.
13. The thin film inspection method according to claim 12,
the control unit uses, in the defect determination processing, a machine learning model that is constructed using the image-processed data as an explanatory variable and using a determination result based on the defect of a person as a target variable.
14. The thin film inspection method according to claim 12 or 13,
the data processing further comprises quantitative evaluation processing for quantitatively evaluating the defects according to the data after the image processing.
15. The thin film inspection method according to claim 14,
the control unit uses a machine learning model constructed using the data after the image processing as an explanatory variable and a result of quantitative evaluation based on the defect of the person as a target variable in the quantitative evaluation processing.
16. The thin film inspection method according to any one of claims 11 to 15,
the film is wound into a roll form,
the film inspection method takes the defect generated in the winding process of the film as an inspection object.
17. The thin film inspection method according to claim 16,
the light source uniformly irradiates light in the width direction of the roll-shaped film.
18. The thin film inspection method according to any one of claims 11 to 17,
in the output signal of the optical sensor, a contrast between a signal value corresponding to an irradiated portion of the thin film irradiated with light by the light source and a signal value corresponding to a non-irradiated portion of the thin film not irradiated with light by the light source is a predetermined value or more.
19. The thin film inspection method according to any one of claims 11 to 18,
a reflection preventing plate is used so that regular reflection light among light reflected in the inspection area of the film does not enter the optical sensor.
20. The thin film inspection method according to any one of claims 11 to 19,
the optical sensor is a black and white camera.
21. A computer-readable recording medium storing a program for causing a computer of a thin film inspection apparatus for optically inspecting a defect generated in a thin film in a stacked state to function as a control unit, the thin film inspection apparatus comprising: a light source that irradiates light onto an inspection area of the film; and an optical sensor that detects diffused light among light reflected in an inspection area of the film,
the control unit detects the defect by performing data processing on an output signal of the optical sensor.
22. The recording medium according to claim 21, wherein,
the data processing includes image processing for image data obtained from an output signal of the optical sensor and defect determination processing for determining the defect from the image-processed data.
23. The recording medium according to claim 22,
the control unit uses, in the defect determination processing, a machine learning model that is constructed using the image-processed data as an explanatory variable and using a determination result based on the defect of a person as a target variable.
24. The recording medium according to claim 22 or 23,
the data processing further comprises quantitative evaluation processing for quantitatively evaluating the defects according to the data after the image processing.
25. The recording medium according to claim 24,
the control unit uses a machine learning model constructed using the data after the image processing as an explanatory variable and a result of quantitative evaluation based on the defect of the person as a target variable in the quantitative evaluation processing.
26. The recording medium according to any one of claims 21 to 25,
the film is wound into a roll shape,
the film inspection apparatus inspects the defect generated in the winding process of the film.
27. The recording medium according to claim 26,
the light source uniformly irradiates light in the width direction of the roll-shaped film.
28. The recording medium according to any one of claims 21 to 27,
in the output signal of the optical sensor, a contrast between a signal value corresponding to an irradiated portion of the thin film irradiated with light by the light source and a signal value corresponding to a non-irradiated portion of the thin film not irradiated with light by the light source is a predetermined value or more.
29. The recording medium according to any one of claims 21 to 28,
a reflection prevention plate is provided in the thin film inspection apparatus so that regular reflection light among light reflected in an inspection area of the thin film does not enter the optical sensor.
30. The recording medium according to any one of claims 21 to 29,
the optical sensor is a black and white camera.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-121221 | 2021-07-26 | ||
JP2021121221A JP2023017169A (en) | 2021-07-26 | 2021-07-26 | Film inspection apparatus, film inspection method and program |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115684189A true CN115684189A (en) | 2023-02-03 |
Family
ID=85060926
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210870412.5A Pending CN115684189A (en) | 2021-07-26 | 2022-07-22 | Thin film inspection device, thin film inspection method, and recording medium |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP2023017169A (en) |
KR (1) | KR20230016571A (en) |
CN (1) | CN115684189A (en) |
TW (1) | TWI829143B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117333492A (en) * | 2023-12-01 | 2024-01-02 | 深圳菲尔泰光电有限公司 | Optical film quality detection method and related device based on image processing |
CN117799203A (en) * | 2023-12-29 | 2024-04-02 | 扬州博恒新能源材料科技有限公司 | Preparation process and preparation equipment of ultrathin composite current collector base film |
WO2024216526A1 (en) * | 2023-04-19 | 2024-10-24 | 南京原觉信息科技有限公司 | Detection system and detection method for surface defect of metal plate |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002365225A (en) | 2001-03-19 | 2002-12-18 | Toray Ind Inc | Method and apparatus for inspecting sheet material |
JP4213616B2 (en) * | 2004-03-31 | 2009-01-21 | 大日本印刷株式会社 | Base film for liquid crystal panel, functional film for liquid crystal panel, method for producing functional film, and apparatus for producing functional film |
JP6394825B1 (en) * | 2018-02-08 | 2018-09-26 | 横河電機株式会社 | Measuring apparatus and measuring method |
- 2021-07-26 JP JP2021121221A patent/JP2023017169A/en active Pending
- 2022-04-18 TW TW111114662A patent/TWI829143B/en active
- 2022-05-13 KR KR1020220058999A patent/KR20230016571A/en not_active Application Discontinuation
- 2022-07-22 CN CN202210870412.5A patent/CN115684189A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
TWI829143B (en) | 2024-01-11 |
JP2023017169A (en) | 2023-02-07 |
KR20230016571A (en) | 2023-02-02 |
TW202305320A (en) | 2023-02-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||