CN112326685B - Online detection device and detection method for laser-induced damage of optical element - Google Patents
- Publication number: CN112326685B
- Application number: CN202011156466.2A
- Authority
- CN
- China
- Prior art keywords
- image
- optical element
- damage
- light source
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS; G01—MEASURING; TESTING; G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES; G01N21/00—Investigating or analysing materials by the use of optical means
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for detecting different kinds of defects
- G01N2021/8411—Application to online plant, process monitoring
- G01N2021/8822—Dark field detection
- G01N2021/8887—Scan or image signal processing based on image processing techniques
- G01N2021/9511—Optical elements other than lenses, e.g. mirrors
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING
- G06F18/22—Pattern recognition; matching criteria, e.g. proximity measures
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/70—Denoising; Smoothing
- G06T7/001—Industrial image inspection using an image reference approach
- G06T7/11—Region-based segmentation
- G06T7/136—Segmentation; Edge detection involving thresholding
- G06T7/194—Segmentation involving foreground-background segmentation
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
- G06T2207/10004—Still image; Photographic image
- G06T2207/20212—Image combination; G06T2207/20224—Image subtraction
- G06T2207/30108—Industrial image inspection; G06T2207/30164—Workpiece; Machine component
Abstract
The invention discloses an online detection device and detection method for laser-induced damage of an optical element. The detection device comprises a pulsed laser light source, a detection laser light source, an attenuation system, a lens, an optical microscope, a CCD camera, a computer, and a two-dimensional translation stage for holding the optical element to be detected. Pulsed laser light emitted by the pulsed laser source passes through the attenuation system and is then focused by the lens onto the surface of the optical element under test; the light transmitted through the element is absorbed by a first light trap. Continuous laser light emitted by the detection laser source is obliquely incident on the surface of the element under test, and its transmitted light is absorbed by a second light trap. The position of the imaging plane of the CCD camera is adjustable; scattered light from the rear surface of the optical element is collected by the optical microscope, and the CCD camera obtains a two-dimensional image of the surface damage and sends it to the computer for processing. The invention enables effective, accurate and rapid online detection of surface damage of optical elements.
Description
Technical Field
The invention belongs to the technical field of optics, and particularly relates to an online detection device and detection method for laser-induced damage of optical elements.
Background
With the progress of laser technology and improvements in optical manufacturing processes, a gradual increase in laser output power has become the development trend, and the application fields of high-power laser facilities, typified by inertial confinement fusion (ICF) laser devices, keep expanding. These laser applications require optical components with sufficiently high laser-induced damage thresholds (LIDTs), so in optical systems for high-power lasers the quality of every optical element becomes particularly important. However, owing to limitations of the manufacturing process and transport conditions, the surfaces of precision optical elements may carry initial defects such as scratches, pits and bubbles that are difficult to avoid during fabrication.
Research shows that impurity defects on the surface and sub-surface of an element can greatly reduce its damage threshold under intense laser irradiation, producing laser damage. The generation and growth of laser-induced damage on an element's surface not only degrades beam quality and reduces luminous flux; through strong absorption of the laser and modulation of the light field in the damaged region, it can also aggravate system damage and harm adjacent elements, seriously affecting the stability and reliability of system operation. It is therefore important, in optical systems carrying high-power lasers, to detect the generation and extent of damage promptly and accurately and to evaluate the quality of the optical elements.
In a complex laser facility, frequently unloading elements for off-line detection not only makes optical-path restoration difficult but also introduces installation and alignment errors, reducing the operating efficiency of the system. Compared with off-line detection, online detection of optical-element damage requires no unloading: the online detection system automatically detects damage in real time during the intervals between laser shots, improving detection efficiency without disturbing the normal operation of the laser facility, and is therefore increasingly used in high-power laser facilities typified by ICF. Based on the phenomena that accompany damage formation, such as changes in scattered-light energy, thermoelastic effects and the generation of white plasma flashes, early work replaced human-eye inspection with sensors such as energy meters, photoacoustic probes and spectrometers; the corresponding online damage detection methods include the scattered-light-intensity method, the photoacoustic method and the plasma-flash method. These improved detection efficiency and real-time capability to some extent, but still have shortcomings.
Dark-field imaging is commonly used in the imaging system of a damage online detection optical path to enhance damage contrast, and the total-internal-reflection dark-field imaging mode is often adopted to avoid interference from the images of multiple elements. However, existing damage detection schemes are aimed mainly at surface-damage detection of large-aperture elements, and suffer from complex optical-path structures, insufficient detection speed and an inability to monitor damage in real time.
In recent years, with the development of machine vision, high-precision industrial cameras have been applied to real-time online acquisition of damage images of optical-element surfaces during laser-system operation. An efficient damage-image algorithm can automatically compare and process the images taken before and after damage in real time, and extract and analyse the features of surface-damage target points to realise online detection. Damage detection is mainly concerned with the damage points newly produced after each pulsed-laser shot: all the tiny damage points visible to the human eye in the image taken after each pulse must be identified relative to the image taken before irradiation, with background noise removed. The background noise of an actual damage image is complex and affects both the efficiency of damage detection and the accuracy of damage identification.
Therefore, there is a need for improved apparatus and algorithms for image processing and damage identification that enable efficient detection of laser-induced damage to optical elements.
Disclosure of Invention
The invention provides an optical element laser-induced damage online detection device and a detection method, which can realize effective, accurate and rapid online detection of optical element surface damage.
An online detection device for laser-induced damage of an optical element comprises an illumination light source module, a microscopic imaging module, a computer, and a two-dimensional translation stage for holding the optical element to be detected; the illumination light source module comprises a pulsed laser light source, a detection laser light source, an attenuation system and a lens, and the microscopic imaging module comprises an optical microscope and a CCD camera;
the pulsed laser light emitted by the pulsed laser source passes through the attenuation system and is then focused by the lens onto the surface of the optical element under test, and the light transmitted through the optical element is absorbed by a first light trap;
continuous laser light emitted by the detection laser source is obliquely incident on the surface of the optical element under test, and the light transmitted through the optical element is absorbed by a second light trap;
the position of the imaging plane of the CCD camera is adjustable; scattered light from the rear surface of the optical element is collected by the optical microscope, and the CCD camera obtains a two-dimensional image of the surface damage and sends it to the computer for processing.
Preferably, the pulsed laser light source is an Nd:YAG single-longitudinal-mode pulsed laser with an output wavelength of 1064 nm, a pulse width of 10 ns and a maximum repetition frequency of 10 Hz.
The invention also provides an optical element laser-induced damage online detection method, which uses the optical element laser-induced damage online detection device and comprises the following steps:
(1) install the detection device, place the optical element to be detected on the two-dimensional translation stage, and select the standard detection mode or the rapid detection mode according to the repetition frequency of the pulsed laser source;
(2) control the CCD camera to shoot in the interval between two shots of the pulsed laser source and acquire images of the optical element; proceed to step (3) in the standard detection mode, or directly to step (6) in the rapid detection mode;
(3) perform background processing on the images acquired before and after laser irradiation, removing background noise by filtering while preserving image edge information;
(4) perform image segmentation on the background-processed image, dividing all pixels into a damaged region and a background region by the Otsu adaptive threshold segmentation method;
(5) analyse the similarity of the surface images acquired before and after laser irradiation, mark the detected damage patterns on the image using an image-similarity-based damage extraction algorithm, extract the damage points from the image, and complete the standard detection;
(6) difference the collected surface images of the optical element taken before and after laser irradiation: subtract the image taken before irradiation from the image taken after irradiation to quickly obtain a difference image containing the damaged region, and further determine from the gray-level distribution whether damage has occurred.
In step (1), because the standard detection mode takes longer than the fast mode, a high pulsed-laser repetition frequency (e.g. close to 10 Hz) leaves the standard mode unable to judge within the 0.1 s inter-pulse interval whether damage has been produced, so the fast mode is adopted in that case.
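The mode choice above reduces to comparing the inter-pulse interval with the processing time of the standard pipeline. A minimal sketch; the 0.15 s standard-mode processing time is an assumed, illustrative figure, not taken from the patent:

```python
def choose_mode(rep_rate_hz, standard_time_s=0.15):
    """Pick the fast mode when the inter-pulse interval (1 / repetition rate)
    is shorter than the standard pipeline's processing time.
    standard_time_s is a hypothetical figure, not from the patent."""
    return "fast" if 1.0 / rep_rate_hz < standard_time_s else "standard"

print(choose_mode(10))  # at 10 Hz the 0.1 s interval is too short -> fast
print(choose_mode(1))   # at 1 Hz there is time for the standard mode -> standard
```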
The specific steps of step (3) are as follows: perform median filtering on the acquired picture, traversing each pixel in the image and replacing it with the median of its neighbourhood, thereby eliminating isolated noise points and retaining edge information while suppressing noise. The median filtering uses a two-dimensional filter window, with the formula

Y_{i,j} = med_{(r,s)∈A} X_{i+r, j+s}

where A is the filter window, X_{i,j} is a point on the image plane being processed, and Y_{i,j} is the median of the pixel gray values within the filter window centred on X_{i,j}, i.e. the output of the median filtering.
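The median filtering step can be sketched in pure Python; the handling of image borders (clipping the window to valid pixels) is an implementation choice, not specified in the patent:

```python
def median_filter(img, k=3):
    """Apply a k x k median filter; at the borders the window is clipped
    to the valid part of the image."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[a][b]
                    for a in range(max(0, i - r), min(h, i + r + 1))
                    for b in range(max(0, j - r), min(w, j + r + 1))]
            vals.sort()
            out[i][j] = vals[len(vals) // 2]
    return out

# An isolated bright noise pixel is removed while the flat region is kept.
noisy = [[10, 10, 10],
         [10, 255, 10],
         [10, 10, 10]]
print(median_filter(noisy)[1][1])  # 10
```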
The specific process of the step (4) is as follows:
Suppose the total number of pixels in the damage image is N and the gray values lie in the range [0, L-1], with background-region pixels in [0, T-1] and damage-target pixels in [T, L-1]. Let n_i be the number of pixels with gray value i; the probability P_i of any gray value i in [0, L-1] is then

P_i = n_i / N

which satisfies

sum_{i=0}^{L-1} P_i = 1

The probabilities that a pixel lies in the background region and in the damage region are, respectively,

p_0 = sum_{i=0}^{T-1} P_i,  p_1 = sum_{i=T}^{L-1} P_i

and p_0 and p_1 satisfy p_0 + p_1 = 1.

Let μ be the mean gray level of the whole image; the mean gray levels of the background region and the damage region are, respectively,

μ_0 = (1/p_0) sum_{i=0}^{T-1} i·P_i,  μ_1 = (1/p_1) sum_{i=T}^{L-1} i·P_i

With σ² denoting the between-class variance of the background region and the damage region,

μ = p_0·μ_0 + p_1·μ_1
σ² = p_0(μ_0 - μ)² + p_1(μ_1 - μ)² = p_0·p_1·(μ_0 - μ_1)²

When the segmentation threshold T is traversed over 0 to L-1, the value that maximises the between-class variance σ² gives the optimal image segmentation effect.
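The Otsu threshold search described above, which maximises the between-class variance σ² = p₀p₁(μ₀ - μ₁)² over all candidate thresholds T, can be sketched as follows:

```python
def otsu_threshold(pixels, L=256):
    """Return the threshold T maximising the between-class variance
    sigma^2 = p0 * p1 * (mu0 - mu1)^2 over a flat list of gray values."""
    N = len(pixels)
    hist = [0] * L
    for v in pixels:
        hist[v] += 1
    P = [h / N for h in hist]                  # P_i = n_i / N
    mu_total = sum(i * P[i] for i in range(L))  # global mean gray level
    best_T, best_var = 0, -1.0
    p0 = 0.0   # cumulative background probability
    m0 = 0.0   # cumulative sum of i * P_i for the background class
    for T in range(1, L):
        p0 += P[T - 1]
        m0 += (T - 1) * P[T - 1]
        p1 = 1.0 - p0
        if p0 == 0.0 or p1 == 0.0:
            continue
        mu0, mu1 = m0 / p0, (mu_total - m0) / p1
        var = p0 * p1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_T = var, T
    return best_T

# Two well-separated gray populations: the threshold falls between them.
sample = [20] * 50 + [200] * 10
T = otsu_threshold(sample)
print(20 < T <= 200)  # True
```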
The specific process of the step (5) is as follows:
Binarization is performed according to the Otsu adaptive threshold segmentation method to obtain a label map of the connected domains of the input image, together with the coordinates, width and height of the upper-left corner of each connected domain's bounding rectangle, the total number of pixels it contains, and its centroid. The connected domains of the next image are then analysed and a labelled image created; traversing the connected domains of the next image, the similarity of the connected domains of the two images is computed, and a similarity threshold is set to generate the damage result image.
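A simplified sketch of this connected-domain comparison: components are labelled in the before-shot and after-shot binary images, and an after-image component with no before-image component of similar centroid and pixel count is reported as new damage. The 4-connectivity and the distance and size tolerances below are illustrative assumptions, not values from the patent:

```python
from collections import deque

def label_components(binary):
    """4-connected component labelling; returns (pixel_count, centroid) per component."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    comps = []
    for i in range(h):
        for j in range(w):
            if binary[i][j] and not seen[i][j]:
                q, pix = deque([(i, j)]), []
                seen[i][j] = True
                while q:
                    a, b = q.popleft()
                    pix.append((a, b))
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if 0 <= na < h and 0 <= nb < w and binary[na][nb] and not seen[na][nb]:
                            seen[na][nb] = True
                            q.append((na, nb))
                n = len(pix)
                cy = sum(p[0] for p in pix) / n
                cx = sum(p[1] for p in pix) / n
                comps.append((n, (cy, cx)))
    return comps

def new_damage(before, after, dist_tol=1.5, size_tol=0.5):
    """Components of `after` with no similar match (close centroid,
    comparable pixel count) among the components of `before`."""
    old = label_components(before)
    fresh = []
    for n, (cy, cx) in label_components(after):
        matched = any(abs(cy - oy) <= dist_tol and abs(cx - ox) <= dist_tol
                      and abs(n - on) <= size_tol * max(n, on)
                      for on, (oy, ox) in old)
        if not matched:
            fresh.append((n, (cy, cx)))
    return fresh

before = [[0, 0, 0, 0],
          [0, 1, 0, 0],
          [0, 0, 0, 0]]
after  = [[0, 0, 0, 0],
          [0, 1, 0, 1],
          [0, 0, 0, 1]]
print(len(new_damage(before, after)))  # 1 new damage point detected
```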
In step (6), the difference image is calculated as follows:
D(i,j)=|F(i,j)-G(i,j)|
in the formula, F (i, j) is an acquired element surface image after laser irradiation, G (i, j) is an acquired element surface image before the same laser irradiation, and D (i, j) is a difference image of the two;
First, sliding mean filtering is applied to the gray-level distribution of the difference image to eliminate background noise. Peaks in the gray-level distribution are then used for damage judgment: if the height or width of a detected peak exceeds the set threshold, damage is considered present. The peaks reflect the range and number of the gray values involved.
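A minimal sketch of the rapid mode: the difference image D(i,j) = |F(i,j) - G(i,j)| is formed and damage is flagged from its gray-level distribution. Here the peak test is simplified to counting pixels above a gray threshold; both threshold values are illustrative assumptions:

```python
def damage_from_difference(F, G, gray_thresh=30, count_thresh=3):
    """Form D(i,j) = |F(i,j) - G(i,j)|, build its gray-level histogram and
    flag damage when enough pixels exceed gray_thresh (thresholds illustrative)."""
    D = [[abs(f - g) for f, g in zip(rf, rg)] for rf, rg in zip(F, G)]
    hist = {}
    for row in D:
        for v in row:
            hist[v] = hist.get(v, 0) + 1
    bright = sum(c for v, c in hist.items() if v >= gray_thresh)
    return bright >= count_thresh, D

# G: surface before the shot; F: same surface after the shot with a damage spot.
G = [[10, 10, 10],
     [10, 10, 10],
     [10, 10, 10]]
F = [[10, 10, 10],
     [10, 200, 190],
     [10, 180, 195]]
damaged, D = damage_from_difference(F, G)
print(damaged)  # True
```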
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a novel device and a novel method for detecting the laser-induced damage of an optical element on line under strong laser irradiation. Generally, an off-line detection method is adopted, in which after laser irradiation is finished, an optical element is detached from a system, damage judgment and measurement at the laser irradiation position are performed by observing surface micro-morphology change of the optical element, and on-line detection is performed by detecting the change condition of the surface at the irradiation position of the optical element in the operation process under the condition that the normal operation of a laser system is ensured, and possible generated damage points on the surface of the optical element are identified and analyzed on line. The invention fully considers the characteristics of conventional detection methods, such as a phase contrast method, a photoacoustic method, a photothermal method and the like, that the detection speed is not fast enough, and the damage size cannot be monitored in real time, and the like, researches an online damage detection method based on microscopic dark field scattering imaging. The algorithm is simple and rapid, has few steps, high efficiency and wide application range, has great engineering application value, can promote the development and application of an optical element surface damage online detection system, and provides powerful means for improving advanced optical manufacturing ultra-precision machining technology and researching various ultra-precision machining processes.
Drawings
FIG. 1 is a schematic view of an on-line detection device for laser-induced damage of an optical element according to the present invention;
FIG. 2 is a schematic diagram of a different mode of damage detection;
FIG. 3 is a flow chart of the damage extraction algorithm based on image similarity;
fig. 4 is a gray scale distribution diagram of a difference image with no damage and with damage.
Detailed Description
The invention will be described in further detail below with reference to the drawings and examples, which are intended to facilitate the understanding of the invention without limiting it in any way.
As shown in fig. 1, an on-line detection device for laser-induced damage of an optical element comprises an illumination light source module, a microscopic imaging module, a computer 9 and a two-dimensional translation stage 4 for placing an optical element 5 to be detected. The illumination light source module comprises a pulse laser light source 1, an attenuation system 2, a lens 3 and a detection laser light source 6, and the microscopic imaging module comprises an optical microscope 7 and a CCD camera 8. The movement of the two-dimensional translation stage 4 and the opening and closing of the CCD camera 8 are controlled by the computer 9.
The working wavelengths of the pulsed laser 1 and the attenuation system 2 are both 1064 nm. The pulsed beam passes through the attenuation system 2 and is focused by the lens 3 onto the sample surface; the measured effective spot diameter on the sample surface is 2 mm. The first light trap 10 is arranged in the direction of the pulsed laser so that the transmitted pulsed light is absorbed. The detection laser source 6 is obliquely incident on the sample surface and its transmitted light is absorbed by the second light trap 11; by adjusting the position of the imaging plane of the CCD camera 8, scattered light from the rear surface of the optical element is collected by the optical microscope 7 to obtain a clear image of the rear surface.
The optical element 5 to be detected is placed on the two-dimensional translation stage 4. Because the field of view of the microscopic imaging system is limited, full-aperture scanning detection of the sample is realised through displacement of the translation stage 4. The image captured by the CCD camera 8 is then processed to determine whether damage has been produced by the pulsed-laser irradiation, and the size and position of the damage are further confirmed.
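The full-aperture scan via the translation stage amounts to tiling the element aperture with the microscope field of view. A sketch under assumed dimensions; the aperture size, field of view and overlap below are illustrative, not from the embodiment:

```python
import math

def scan_positions(aperture_mm, fov_mm, overlap_mm=0.5):
    """Stage positions (x, y) in mm so that the imaging field of view tiles
    the full square aperture; adjacent fields overlap by overlap_mm."""
    step = fov_mm - overlap_mm
    n = math.ceil(aperture_mm / step)
    return [(i * step, j * step) for j in range(n) for i in range(n)]

# e.g. a 10 mm aperture scanned with a 2 mm field of view and 0.5 mm overlap
pos = scan_positions(10.0, 2.0, 0.5)
print(len(pos))  # 49 positions (a 7 x 7 grid)
```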
In this embodiment, the pulsed laser light source is an Nd:YAG single-longitudinal-mode pulsed laser with an output wavelength of 1064 nm, a pulse width of 10 ns and a maximum repetition frequency of 10 Hz.
Since the maximum repetition frequency of the pulsed laser is 10 Hz, a conventional image processing algorithm cannot judge within the interval between two pulses whether damage exists. The rapid damage-judgment algorithm based on gray-distribution features can therefore be adopted: subtracting the image taken before laser irradiation from the image taken after irradiation quickly yields an image containing only the damaged region, from which the occurrence of damage is confirmed. An appropriate damage detection method may be selected according to the specific detection requirements.
The detection method comprises a standard detection mode or a quick detection mode.
As shown in fig. 2, an appropriate detection mode is selected according to the repetition frequency of the pulsed laser light source, and the two detection modes include the following steps:
Step 1: perform median filtering,

Y_{i,j} = med_{(r,s)∈A} X_{i+r, j+s}

where A is the filter window, X_{i,j} is a point on the image plane being processed, and Y_{i,j} is the median of the pixel gray values within the filter window centred on X_{i,j}, i.e. the output of the median filtering; this effectively suppresses the influence of irrelevant detail in the background.
Step 2: perform Otsu threshold segmentation. Suppose the total number of pixels in the damage image is N and the gray values lie in the range [0, L-1], with background-region pixels in [0, T-1] and damage-target pixels in [T, L-1]. Let n_i be the number of pixels with gray value i; the probability P_i of any gray value i in [0, L-1] is then

P_i = n_i / N

which satisfies

sum_{i=0}^{L-1} P_i = 1

The probabilities that a pixel lies in the background region and in the damage region are, respectively,

p_0 = sum_{i=0}^{T-1} P_i,  p_1 = sum_{i=T}^{L-1} P_i

from which it can be seen that p_0 + p_1 = 1.

Let μ be the mean gray level of the whole image; the mean gray levels of the background region and the damage region are, respectively,

μ_0 = (1/p_0) sum_{i=0}^{T-1} i·P_i,  μ_1 = (1/p_1) sum_{i=T}^{L-1} i·P_i

With σ² denoting the between-class variance of the background region and the damage region,

μ = p_0·μ_0 + p_1·μ_1
σ² = p_0(μ_0 - μ)² + p_1(μ_1 - μ)² = p_0·p_1·(μ_0 - μ_1)²

When the segmentation threshold T is traversed over 0 to L-1, the value that maximises the between-class variance σ² gives the optimal image segmentation effect;
Step 3: after binarization according to the Otsu threshold segmentation method, a label map of the connected domains of the input image is obtained, returning important attributes such as the coordinates, width and height of the upper-left corner of each connected domain's bounding rectangle, the total number of pixels it contains, and its centroid. The connected domains of the next image are then analysed and a labelled image created; traversing the connected domains of the next image, the similarity of the connected regions of the two images is computed, and a similarity threshold is set to generate the damage result image. The specific algorithm flow is shown in fig. 3.
D(i,j)=|F(i,j)-G(i,j)|
where F(i, j) is the element-surface image acquired after laser irradiation, G(i, j) is the element-surface image acquired before the same laser irradiation, and D(i, j) is the difference image of the two.
First, sliding mean filtering is applied to the gray-level distribution of the difference image to eliminate background noise. Height and width thresholds are set for damage judgment: if the height or width of a peak exceeds the corresponding threshold, damage is considered to exist. The height and width of the tallest and widest detected peak are therefore compared with the thresholds for detection.
Fig. 4 shows the gray-level distributions of difference images, where (a) is the case with damage and (b) the case without damage; the abscissa is the gray value and the ordinate is the pixel count. Comparing the two distributions, a peak whose height or width exceeds the set threshold is taken to indicate damage, so the height and width of the tallest and widest detected peak are compared with the thresholds. Normally the sampling interval of the array is 1; to improve detection efficiency and meet the detection-time requirement, the number of sampling points can be reduced to 1/2 or 1/4, with the threshold adjusted automatically in proportion. Experimental results show that the fast detection algorithm can effectively judge from the difference image whether damage has occurred, demonstrating that it meets the practical requirement for rapid damage detection.
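The fast detection path can be sketched end to end as follows (an illustrative NumPy version; the threshold values `height_thresh` and `width_thresh`, the 5-bin mean-filter kernel, and the exclusion of gray-level bin 0, which is dominated by unchanged pixels, are all assumptions for demonstration):

```python
import numpy as np

def fast_damage_check(before, after, height_thresh=50, width_thresh=20, kernel=5):
    """Difference the two frames, smooth the gray-level histogram with a
    sliding mean, and flag damage when a histogram peak is taller than
    height_thresh pixels or wider than width_thresh gray levels."""
    diff = np.abs(after.astype(int) - before.astype(int)).astype(np.uint8)
    hist = np.bincount(diff.ravel(), minlength=256).astype(float)
    hist[0] = 0                               # ignore unchanged pixels
    smooth = np.convolve(hist, np.ones(kernel) / kernel, mode="same")
    above = smooth > 0
    idx = 0
    while idx < 256:                          # scan runs of non-zero bins
        if above[idx]:
            start = idx
            while idx < 256 and above[idx]:
                idx += 1
            height = smooth[start:idx].max()  # tallest point of the peak
            width = idx - start               # gray-value span of the peak
            if height > height_thresh or width > width_thresh:
                return True
        idx += 1
    return False
```

Because only a histogram and one linear scan are needed, this check runs in time proportional to the image size, which is what allows it to fit between successive laser shots.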
The embodiments described above illustrate the technical solutions and advantages of the present invention. It should be understood that they are only specific embodiments and do not limit the invention; any modifications, additions, or equivalent substitutions made within the scope of the principles of the present invention shall fall within the protection scope of the invention.
Claims (1)
1. An optical element laser-induced damage online detection device is characterized by comprising an illumination light source module, a microscopic imaging module, a computer and a two-dimensional translation table for placing an optical element to be detected; the illumination light source module comprises a pulse laser light source, a detection laser light source, an attenuation system and a lens, and the microscopic imaging module comprises an optical microscope and a CCD camera;
pulse laser emitted by the pulse laser light source passes through the attenuation system, is then focused by the lens onto the optical element to be detected placed on the two-dimensional translation table, and the transmitted light passing through the optical element is absorbed by the first light trap; the pulse laser light source is a YAG single-longitudinal-mode pulsed laser with an output wavelength of 1064 nm, a pulse width of 10 ns, and a maximum repetition frequency of 10 Hz;
continuous laser emitted by the detection laser light source is obliquely incident on the optical element to be detected placed on the two-dimensional translation table, and the transmitted light passing through the optical element is absorbed by a second light trap;
the pulse laser light source and the detection laser light source are positioned on the same side of the two-dimensional translation table, and the microscopic imaging module is positioned on the opposite side of the two-dimensional translation table;
an optical microscope collects the scattered light from the rear surface of the optical element and images it onto the CCD camera; the position of the imaging plane of the CCD camera is adjustable, and the CCD camera sends the acquired two-dimensional image of the surface damage to the computer for processing;
the detection method of the online detection device for the laser-induced damage of the optical element comprises the following steps:
(1) installing a detection device, placing an optical element to be detected on a two-dimensional translation table, and selecting a standard detection mode or a quick detection mode according to the repetition frequency of a pulse laser light source;
(2) controlling the CCD camera to capture images of the optical element in the interval between two successive emissions of the pulse laser light source; in the standard detection mode proceed to step (3), and in the rapid detection mode proceed directly to step (6);
(3) performing background processing on the images acquired before and after laser irradiation, removing background noise through filtering while preserving image edge information; the specific steps are as follows: performing median filtering on the acquired image, traversing each pixel in the image and replacing it with the median of its neighborhood, thereby removing isolated noise points and retaining edge information while eliminating noise; the median filtering adopts a two-dimensional filtering window, with the formula as follows:
where A is the filter window, X(i,j) is the point on the image plane being processed, and Y(i,j) is the median of the pixel gray values within the filter window centered on X(i,j), i.e., the output of the median filtering;
(4) performing image segmentation processing on the image after background processing, and dividing all pixels in the image into a damaged area and a background area by adopting an Otsu self-adaptive threshold segmentation method; the specific process is as follows:
suppose the total number of pixels in the damage image is N and the gray values lie in the range [0, L-1], with background-region pixels in [0, T-1] and damage-target pixels in [T, L-1]; let the number of pixels with gray value i be n_i; then the probability P_i that a pixel in the image takes the gray value i, for any i in [0, L-1], is P_i = n_i / N
The above probabilities satisfy ∑_{i=0}^{L-1} P_i = 1 and P_i ≥ 0
then the probabilities that a pixel lies in the background region and in the damage region are p0 = ∑_{i=0}^{T-1} P_i and p1 = ∑_{i=T}^{L-1} P_i respectively, and p0 and p1 satisfy p0 + p1 = 1
let μ be the average gray level of the whole image; the average gray levels of the background region and the damage region are μ0 = (1/p0) ∑_{i=0}^{T-1} i·P_i and μ1 = (1/p1) ∑_{i=T}^{L-1} i·P_i respectively;
let the between-class variance of the background region and the damage region be σ²; then μ = p0·μ0 + p1·μ1 and σ² = p0(μ0 − μ)² + p1(μ1 − μ)² = p0·p1·(μ0 − μ1)²
when the segmentation threshold T is traversed over 0 to L−1, the T that maximizes the between-class variance σ² yields the optimal image-segmentation result;
(5) analyzing the similarity of the surface images acquired before and after laser irradiation; based on an image-similarity damage-extraction algorithm, the detected damage patterns are marked on the image and the damage points are extracted, completing the standard detection; the specific process is as follows:
performing binarization according to the Otsu adaptive threshold segmentation method to obtain a label map of the connected domains of the input image, while returning, for each connected domain in the image, the coordinates, width, and height of the upper-left corner of its bounding rectangle, the total number of pixels it contains, and its centroid; then analyzing the connected domains of the next image and creating a labeled image, traversing the connected domains of the next image, calculating the similarity of the connected domains of the two images, and setting a similarity threshold to generate the damage-result image;
(6) differencing the surface images of the optical element acquired before and after laser irradiation: the image acquired before irradiation is subtracted from the image acquired after irradiation, quickly yielding a difference image containing the damage region; whether damage has occurred is then determined from the gray-level distribution;
the differential image is calculated by the formula:
D(i,j)=|F(i,j)-G(i,j)|
where F(i, j) is the optical-element surface image acquired after laser irradiation, G(i, j) is the optical-element surface image acquired before the same laser irradiation, and D(i, j) is the difference image of the two;
first, sliding mean filtering is applied to the gray-level distribution of the obtained difference image to eliminate background noise; the peaks in the gray-level distribution are compared against the damage-judgment thresholds, and if the height or width of a detected peak exceeds the threshold, damage is considered to exist; in the gray-level distribution the abscissa is the gray value and the ordinate is the pixel count, so the height of a peak represents a number of pixels and its width a range of gray values.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011156466.2A CN112326685B (en) | 2020-10-26 | 2020-10-26 | Online detection device and detection method for laser-induced damage of optical element |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112326685A CN112326685A (en) | 2021-02-05 |
CN112326685B true CN112326685B (en) | 2022-02-01 |
Family
ID=74311687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011156466.2A Active CN112326685B (en) | 2020-10-26 | 2020-10-26 | Online detection device and detection method for laser-induced damage of optical element |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112326685B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113484326A (en) * | 2021-07-06 | 2021-10-08 | 南开大学 | Integrated laser damage surface observation system |
CN115326804B (en) * | 2022-09-02 | 2024-05-14 | 哈尔滨工业大学 | Automatic evaluation device and method for initiating surface damage and increasing damage of fused quartz element |
CN116660318A (en) * | 2023-07-25 | 2023-08-29 | 中国科学院长春光学精密机械与物理研究所 | Large-caliber optical element damage positioning device and repairing method |
CN117232792B (en) * | 2023-11-14 | 2024-01-30 | 南京木木西里科技有限公司 | Microscope defect detection system based on image information |
CN117745715B (en) * | 2024-02-06 | 2024-04-23 | 中科院南京耐尔思光电仪器有限公司 | Large-caliber telescope lens defect detection method based on artificial intelligence |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102841102A (en) * | 2012-09-05 | 2012-12-26 | 同济大学 | Recognition method and device for micro-scale damage point in damage threshold measurement |
CN105021627A (en) * | 2015-07-20 | 2015-11-04 | 中国科学院长春光学精密机械与物理研究所 | High-sensitivity fast on-line detection method of optical thin film and element surface laser-induced damage |
CN110749606A (en) * | 2019-11-14 | 2020-02-04 | 中国工程物理研究院激光聚变研究中心 | Laser damage detection method and system based on optical element |
Non-Patent Citations (1)
Title |
---|
Research on Online Detection of Optical Element Damage Based on Machine Vision; Chen Jing; China Master's Theses Full-text Database, Engineering Science and Technology II; 2017-11-15 (No. 11); pp. C030-32 *
Also Published As
Publication number | Publication date |
---|---|
CN112326685A (en) | 2021-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112326685B (en) | Online detection device and detection method for laser-induced damage of optical element | |
CN103506756B (en) | Laser lap welding gap detecting system and laser lap welding gap detecting method based on molten pool image visual sensing | |
US7141773B2 (en) | Image focusing in fluorescent imaging | |
US20110025838A1 (en) | Method and apparatus for inspecting defects in wafer | |
CN111899215B (en) | Method for extracting optical element body defect | |
CN111369511A (en) | Optical element surface weak scratch detection method based on spectral characteristics | |
JP4715016B2 (en) | Method for evaluating polysilicon film | |
JP2013029438A (en) | Defect inspection device and defect inspection method | |
CN111474179A (en) | Lens surface cleanliness detection device and method | |
Gao et al. | Feature extraction of laser welding pool image and application in welding quality identification | |
CN105195468A (en) | Method and device for online cleaning and detection of first lens of fusion device | |
CN102841055A (en) | Online detection method and device for laser injury in optical component body | |
JP2003017536A (en) | Pattern inspection method and inspection apparatus | |
Chen et al. | In-situ laser-induced surface damage inspection based on image super-resolution and adaptive segmentation method | |
US20080170772A1 (en) | Apparatus for determining positions of objects contained in a sample | |
TW201520669A (en) | Bevel-axial auto-focus microscopic system and method thereof | |
JP3940336B2 (en) | Surface inspection device | |
JP2008039444A (en) | Method and apparatus for inspecting foreign matter | |
KR20140144673A (en) | Method and apparatus for detecting defects | |
CN112581424B (en) | Classification extraction method for surface and subsurface defects of optical element | |
CN114383522A (en) | Method for measuring surface gap and surface difference of reflective difference workpiece | |
CN114820539A (en) | Robot laser detection method based on automobile welding production line | |
KR102114013B1 (en) | Device and method for detecting defect of display | |
EP0939294B1 (en) | Determining topographical values | |
JP2022014226A (en) | Workpiece inspection method and device, and workpiece processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||