CN110827256B - Optical and thermal infrared multi-stage imaging detection method and device for defects of transparent component - Google Patents

Info

Publication number
CN110827256B
CN110827256B (application CN201911054181.5A)
Authority
CN
China
Prior art keywords
optical
defect
imaging
infrared
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911054181.5A
Other languages
Chinese (zh)
Other versions
CN110827256A
Inventor
张国军
明五一
张臻
尹玲
陈志君
张红梅
廖敦明
卢亚
耿涛
沈帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Hust Industrial Technology Research Institute
Original Assignee
Guangdong Hust Industrial Technology Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Hust Industrial Technology Research Institute filed Critical Guangdong Hust Industrial Technology Research Institute
Priority to CN201911054181.5A
Publication of CN110827256A
Application granted
Publication of CN110827256B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/958Inspecting transparent materials or objects, e.g. windscreens
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N25/00Investigating or analyzing materials by the use of thermal means
    • G01N25/72Investigating presence of flaws
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077Imaging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Biochemistry (AREA)
  • Multimedia (AREA)
  • Analytical Chemistry (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Investigating Or Analyzing Materials Using Thermal Means (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention discloses an optical and thermal infrared multi-stage imaging detection method and device for defects of a transparent component. The method comprises the following steps: performing optical imaging of the 3C transparent component, acquiring an optical image, and preliminarily determining the position and size of any defect; heating the defect position with nitrogen and performing thermal infrared imaging to obtain a thermal infrared image; and fusing the optical and thermal infrared images and identifying the defect type through deep learning. The device comprises an optical detection module, a thermal infrared detection module, a motion control module, a data fusion module, a deep learning module, an auxiliary manipulator and a display alarm module, all interconnected through a bus. By combining optical imaging with thermal infrared imaging, the invention performs multi-physical-quantity, multi-stage detection of defective products; a convolutional neural network then fuses the optical and infrared multi-source information, improving the recognition rate of defect detection for 3C transparent components.

Description

Optical and thermal infrared multi-stage imaging detection method and device for defects of transparent component
Technical Field
The invention relates to defect detection for 3C transparent components, and in particular to an optical and thermal infrared multi-stage imaging method and device for detecting such defects.
Background
In China's 3C industry, transparent components are increasingly widely used; in particular, with the rapid advance of 5G technology, glass components have become commonplace in terminal communication products. However, the characteristics of transparent members make defect detection increasingly difficult. At present, many manufacturers rely mainly on manual inspection, with a low degree of automation. Manual inspection depends on worker experience, so standards are hard to unify; it is also highly labor-intensive, harms visual health, and skilled inspectors often leave the post long before retirement age.
Current defect detection of mainstream 3C transparent purchased parts is based mainly on manual visual inspection, supplemented by a small amount of low-difficulty automated equipment that relies on a single optical means (chiefly visible light). However, 3C transparent members are small, come in many varieties, and exhibit subtle defect features; identification based on conventional visible light alone leaves the detection accuracy in need of improvement.
Disclosure of Invention
In order to solve the technical problem, the invention provides an optical and thermal infrared multi-stage imaging detection method and device for defects of a transparent component.
In order to solve the technical problems, the invention adopts the following technical scheme:
the optical and thermal infrared multi-stage imaging detection method for the defects of the transparent member comprises the following steps:
carrying out optical imaging on the 3C transparent component, acquiring an optical image, and preliminarily judging the defect position and size of the 3C transparent component;
heating the defect position of the 3C transparent member by using nitrogen, and carrying out thermal infrared imaging to obtain a thermal infrared image;
and carrying out fusion processing on the optical image and the thermal infrared image, and recognizing the defect type of the 3C transparent component through deep learning.
The fusion treatment specifically comprises the following steps:
performing thermal infrared imaging of the 3C transparent component with two super-depth-of-field infrared imaging modules, sampling the two resulting videos at intervals, and extracting from each a 7-frame clip at 60 x 40 pixels, i.e. two 7-frame clips of 60 x 40 pixels: the first infrared video and the second infrared video;
extracting data features from the first and second infrared videos with a first convolutional neural network model and calculating the probability of each defect category, the categories being normal, crack, bubble, scratch and broken edge;
compressing the optical image to 60 x 40 pixels, and randomly extracting 2 frames of 60 x 40 pixels from the video of each of the two super-depth-of-field infrared imaging modules (4 infrared images in total), giving 5 images of 60 x 40 pixels;
extracting data features from these 5 images with a second convolutional neural network model and calculating the probability of each defect category;
fusing the defect-category probabilities calculated for the first and second infrared videos with those calculated for the 5 images, according to the following formula (an unweighted average over the seven sources):

P_A = (P_V1 + P_V2 + P_O + P_T1 + P_T2 + P_T3 + P_T4) / 7

wherein P_A is the fused predicted defect probability, P_V1 and P_V2 are the predicted defect probabilities of the first and second infrared videos, P_O is the predicted defect probability of the optical image, and P_T1, P_T2, P_T3 and P_T4 are the predicted defect probabilities of the 4 infrared images;
and finally judging the defect type, and calculating according to the following formula:
Y=arg max(PA)
where Y is the finally determined defect type of the current 3C transparent component under test.
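The fusion and arg-max steps above can be sketched in Python. The unweighted average is an assumption (the patent's exact fusion formula is not reproduced here), and all probability values below are illustrative:

```python
import numpy as np

# Defect categories from the patent; the order is illustrative.
CLASSES = ["normal", "crack", "bubble", "scratch", "broken edge"]

def fuse_probabilities(p_v1, p_v2, p_o, p_t):
    """Fuse per-source class-probability vectors into P_A.

    p_v1, p_v2 : predictions for the two infrared videos (first CNN)
    p_o        : prediction for the optical image (second CNN)
    p_t        : four predictions for the infrared frames (second CNN)
    An unweighted average is assumed here; the patent's actual
    weighting may differ.
    """
    sources = [p_v1, p_v2, p_o] + list(p_t)
    return np.mean(sources, axis=0)

def classify(p_a):
    # Y = arg max(P_A): pick the most probable defect category.
    return CLASSES[int(np.argmax(p_a))]

# Hypothetical per-source probability vectors (each sums to 1).
p_v1 = np.array([0.1, 0.6, 0.1, 0.1, 0.1])
p_v2 = np.array([0.2, 0.5, 0.1, 0.1, 0.1])
p_o  = np.array([0.1, 0.4, 0.2, 0.2, 0.1])
p_t  = [np.array([0.1, 0.5, 0.2, 0.1, 0.1])] * 4

p_a = fuse_probabilities(p_v1, p_v2, p_o, p_t)
print(classify(p_a))  # -> crack
```

Because every source vector sums to 1, the averaged vector P_A is itself a valid probability distribution.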
The first convolutional neural network comprises 1 hard-wiring layer H1, 3 convolutional layers C2, C4 and C6, 2 downsampling layers S3 and S5, 1 full-connection layer FC and 1 soft regression layer SR;
each cube convolved by a 3D kernel spans 7 consecutive frames of 60 x 40 pixels each; the hardwired layer H1 processes the input first and second infrared videos, extracting five channels of information per frame: gray level, gradients in the x and y directions, and optical flow in the x and y directions. The first three (gray level and the two gradients) are computed from individual frames, and the two optical flows from pairs of consecutive frames, giving 33 feature maps in total (7 + 7 + 7 + 6 + 6);
the 3 convolutional layers are obtained by convolving the preceding layer's data with kernels of 7x7x3, 7x6x3 and 7x4 respectively;
the 2 downsampling layers are obtained by 2x2 and 3x3 pooling respectively;
finally, 1 fully connected layer produces a 128-dimensional vector, and 1 soft regression layer connected to it yields the probabilities of the different defect categories.
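The layer sizes quoted above can be sanity-checked with a short sketch. Assuming valid (unpadded) convolutions and non-overlapping pooling (an assumption; the patent only lists kernel and pooling sizes), the spatial dimensions collapse from 60 x 40 to 1 x 1 before the fully connected layer:

```python
def conv_out(size, k):
    # A valid convolution shrinks each dimension by k - 1.
    return size - k + 1

def pool_out(size, k):
    # Non-overlapping k x k pooling divides each dimension by k.
    return size // k

h, w = 60, 40                            # input frame size (pixels)
h, w = conv_out(h, 7), conv_out(w, 7)    # C2: 7x7 spatial kernel -> 54 x 34
h, w = pool_out(h, 2), pool_out(w, 2)    # S3: 2x2 pooling        -> 27 x 17
h, w = conv_out(h, 7), conv_out(w, 6)    # C4: 7x6 spatial kernel -> 21 x 12
h, w = pool_out(h, 3), pool_out(w, 3)    # S5: 3x3 pooling        ->  7 x  4
h, w = conv_out(h, 7), conv_out(w, 4)    # C6: 7x4 kernel         ->  1 x  1
print(h, w)  # -> 1 1
```

The kernel sizes were evidently chosen so that each feature map is reduced to a single value, which is then concatenated into the 128-dimensional fully connected layer.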
The second convolutional neural network comprises 2 convolutional layers which are respectively a first convolutional layer and a second convolutional layer, 2 downsampling layers which are respectively a first downsampling layer and a second downsampling layer, 1 full-connection layer and 1 soft regression layer;
the 2 convolutional layers are obtained by convolving the preceding layer's data with 2x2 kernels, and the 2 downsampling layers by 2x2 pooling;
finally, 1 fully connected layer produces a 128-dimensional vector, and 1 soft regression layer connected to it yields the probabilities of the different defect categories;
the 1 optical image and 4 thermal infrared images of 60 x 40 pixels are input to the first convolutional layer of the second convolutional neural network model, and the defect-category probabilities of the 5 images are obtained from its soft regression layer.
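A similar sketch for the second network, again assuming valid 2x2 convolutions and non-overlapping 2x2 pooling, shows each feature map shrinking to 14 x 9 before the 128-dimensional fully connected layer; the soft regression layer can be read as a softmax over the five class scores (the raw scores below are hypothetical):

```python
import math

def softmax(scores):
    """Soft regression: normalize raw class scores to probabilities."""
    m = max(scores)                       # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Per-map spatial size through the second CNN, assuming valid
# (unpadded) 2x2 convolutions and non-overlapping 2x2 pooling.
h, w = 60, 40
for _ in range(2):           # conv layer then pooling layer, twice
    h, w = h - 1, w - 1      # 2x2 valid convolution
    h, w = h // 2, w // 2    # 2x2 pooling
print(h, w)                  # -> 14 9

# Hypothetical raw scores for the 5 defect categories.
p = softmax([0.2, 1.5, 0.1, 0.0, -0.3])
print(round(sum(p), 6))      # -> 1.0
```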
An optical and thermal infrared multi-stage imaging detection device for defects of transparent members, comprising:
the workbench is provided with a transparent objective table for placing a to-be-tested 3C transparent component;
the auxiliary manipulator is in communication connection with the computer through a bus and is used for loading, transferring and unloading;
the optical detection module is in communication connection with the computer through a bus and is used for optically imaging the 3C transparent component to be detected;
the thermal infrared detection module is in communication connection with the computer through a bus and is used for performing thermal infrared imaging of the 3C transparent component to be detected;
the motion control module is in communication connection with the computer through a bus and is used for controlling the movement of the optical detection module and the thermal infrared detection module;
the display alarm module is in communication connection with the motion control module through a bus and is used for displaying and alarming;
the data fusion module is in communication connection with the optical detection module and the thermal infrared detection module through a bus and is used for receiving and processing data of the optical detection module and the thermal infrared detection module;
and the deep learning module is in communication connection with the data fusion module through a bus and is used for processing the data transmitted from the data fusion module.
The optical detection module comprises a parallel light source, an optical lens and an imaging sensor. The parallel light source is arranged above the transparent stage to illuminate the 3C transparent component to be detected; the imaging sensor, connected to the optical lens, is located below the transparent stage, and the light transmitted through the stage is digitally imaged on the imaging sensor.
The worktable is a cuboid; openings are provided only at the part contacting the transparent stage and where the communication cables are laid, so that external light does not affect the digital imaging of the optical module.
The thermal infrared detection module comprises an electromagnetic clamp, a nitrogen spray gun, a power supply module, a drive control module, two super-depth-of-field infrared imaging modules and a support table. The infrared imaging modules and the drive control module are each in communication connection with the computer. The electromagnetic clamp, the infrared imaging modules and the drive control module are connected to the power supply module; the drive control module is connected to the motion control module; and the electromagnetic clamp, the infrared imaging modules and the nitrogen spray gun are connected to the drive control module. The nitrogen spray gun is mounted in the support table and connected to a moving module; the electromagnetic clamp is installed at the upper end of the support table to clamp the 3C transparent component to be detected, and the two infrared imaging modules are arranged above the support table to inspect the clamped component.
The two super-depth-of-field infrared imaging modules are mounted obliquely, inclined at 30-60 degrees to the horizontal.
The two super-depth-of-field infrared imaging modules have the same or different magnification ratios.
The invention has the following beneficial effects:
1. An optical and infrared multi-stage joint detection scheme: optical imaging provides preliminary detection and estimates the defect position and size, and thermal infrared imaging then analyses the defect further, so that joint detection improves the accuracy of the method;
2. High-temperature nitrogen rapidly heats the 3C transparent component, yielding dynamic heat-conduction information in thermal infrared video of the defect area; compared with the single image of traditional optical detection, this is far richer in information and more effective for defect identification;
3. Multi-source information fusion of the optical image information and the thermal infrared video information overcomes the limitations of any single detection technique;
4. After offline training on samples, a convolutional neural network automatically fuses optical images and thermal infrared video data online, extracting defect features automatically, replacing manual inspection, and achieving online, precise and efficient detection of 3C transparent components.
Drawings
FIG. 1 is a schematic block diagram of one embodiment of the present invention;
FIG. 2 is a schematic mechanical layout of the apparatus of the present invention;
FIG. 3 is a schematic diagram of an optical detection module;
FIGS. 4-1 and 4-2 are flow charts of the optical detection module;
FIG. 5 is a structural schematic diagram of the thermal infrared detection module;
FIGS. 6-1 and 6-2 are detection flow charts of the thermal infrared detection module;
FIGS. 7-1, 7-2, and 7-3 are schematic views of thermal infrared imaging of the region to be detected, respectively;
FIG. 8 is a schematic diagram of a first convolutional neural network structure for 3C transparent component detection and identification;
FIG. 9 is a schematic diagram of a second convolutional neural network structure for detection and identification of a 3C transparent component.
Detailed Description
For further understanding of the features and technical means of the present invention, as well as the specific objects and functions attained by the present invention, the present invention will be described in further detail with reference to the accompanying drawings and detailed description.
As shown in fig. 1-9, the present invention discloses an optical and thermal infrared multi-stage imaging detection device for defects of a transparent member, comprising:
and a stage 14 on which a transparent stage 13 is provided, the stage 13 placing a 3C transparent member 12 to be inspected thereon. Corresponding to-be-detected workbench inlets and to-be-classified workbench outlets can be arranged, and output of products is facilitated.
And the auxiliary mechanical arm 6 is connected and communicated with the computer through a bus and used for feeding, transferring and discharging.
And the optical detection module 1 is in communication connection with a computer through a bus and is used for carrying out optical imaging on the 3C transparent component to be detected.
And the thermal infrared detection module 2 is in communication connection with the computer through a bus and is used for performing thermal infrared imaging of the 3C transparent component to be detected.
And the motion control module is in communication connection with the computer through a bus and is used for controlling the movement of the optical detection module and the thermal infrared detection module.
And the display alarm module is in communication connection with the motion control module through a bus and is used for displaying and alarming so as to remind a worker.
And the data fusion module is in communication connection with the optical detection module and the thermal infrared detection module through buses and is used for receiving and processing data of the optical detection module and the thermal infrared detection module.
And the deep learning module is in communication connection with the data fusion module through a bus and is used for processing the data transmitted from the data fusion module.
The auxiliary manipulator 6 is mounted on a manipulator base between the worktable inlet for parts to be detected, the optical detection module 1, the thermal infrared detection module 2 and the worktable outlet for parts to be classified. After a 3C transparent component to be detected arrives at the inlet, manually or by conveyor belt, the auxiliary manipulator 6 carries it to the worktable 14 of the optical detection module 1; once optical detection is complete, it carries the part to the area of the thermal infrared detection module 2; and once thermal infrared detection is complete, the part is pushed to the outlet, again manually or by conveyor belt. Further, since the working time of the optical detection module 1 and the thermal infrared detection module 2 is longer than that of the auxiliary manipulator 6, the whole line can run in parallel through takt control.
As shown in fig. 3, the optical detection module 1 includes a parallel light source 11, an optical lens 15 and an imaging sensor 16. The parallel light source 11 is disposed above the transparent stage 13 to illuminate the 3C transparent member 12 to be detected; the imaging sensor 16, connected to the optical lens 15, is located below the transparent stage 13, and the light transmitted through the stage is digitally imaged on the sensor. The worktable 14 is a cuboid; openings are formed only where it contacts the transparent stage 13 and where the communication cables are laid, so that external light does not affect the digital imaging of the optical module.
The thermal infrared detection module 2 includes an electromagnetic clamp 21, a nitrogen spray gun 22, a power module 23, a drive control module 24, two super-depth-of-field infrared imaging modules 25 and a support table 26. The infrared imaging modules and the drive control module are each in communication connection with the computer. The electromagnetic clamp, the infrared imaging modules and the drive control module are connected to the power module; the drive control module is connected to the motion control module; and the electromagnetic clamp, the infrared imaging modules and the nitrogen spray gun are connected to the drive control module. The nitrogen spray gun is mounted in the support table and connected to a moving module; the electromagnetic clamp is installed at the upper end of the support table to clamp the 3C transparent component to be detected, and the two infrared imaging modules are arranged above the support table to inspect the clamped component. The nitrogen spray gun uses nitrogen as the heating gas source to rapidly heat the 3C transparent component. The moving module may be a prior-art linear motor, air cylinder or stepping motor, so that the nitrogen spray gun 22 can move and spray high-temperature or room-temperature nitrogen onto a defect area at any position of the component to be detected.
The drive control module 24 is connected with the computer 17 on one hand, and the other hand is connected with the stepping motor; the nitrogen spray gun can be provided with corresponding electromagnetic valves as control switches, the power module 23 provides power for the ultra-depth-of-field infrared imaging module 25, the electromagnetic clamp 21, the driving control module 24, the stepping motor and the electromagnetic valves, and the stepping motor and the electromagnetic valves are known product technologies and are not shown in the figure.
The two infrared imaging modules with the super field depth are obliquely arranged, and the inclination angle between the two infrared imaging modules with the super field depth and the horizontal line is 30-60 degrees. The amplification factor of the infrared lenses of the two super-depth-of-field infrared imaging modules 25 is set to be 10-100; in the infrared imaging process, the infrared lens magnifications of the two super-depth-of-field infrared imaging modules 25 may be set to be the same magnification, such as 20 times, or may be set to be different magnifications, such as one is set to be 10 times, and the other is set to be 50 times.
The data fusion module 4 is implemented in software on an independent data-fusion DSP chip and preprocesses the images and videos from the optical detection module 1 and the thermal infrared detection module 2: the optical image is denoised, filtered and enhanced to generate a new image, and the thermal infrared video is frame-extracted to obtain a simplified image sequence of the continuous heat-conduction process.
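The frame-extraction (interval sampling) step can be sketched as follows; `sample_frames`, the frame count and the video dimensions are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def sample_frames(video, n_frames=7):
    """Interval-sample n_frames evenly spaced from a (T, H, W) video."""
    t = video.shape[0]
    # Evenly spaced indices from the first to the last recorded frame.
    idx = np.linspace(0, t - 1, n_frames).astype(int)
    return video[idx]

# Hypothetical recording: 120 thermal frames of 40 x 60 pixels.
video = np.zeros((120, 40, 60))
clip = sample_frames(video)
print(clip.shape)  # -> (7, 40, 60)
```

The sampled 7-frame clip matches the input size expected by the first convolutional neural network described above.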
The deep learning module 5 uses convolutional neural networks, implemented on an independent convolutional-neural-network DSP chip. Multi-physical-field data of the same 3C transparent component to be detected are input in parallel to a group of neural networks: 2 channels of thermal infrared video, 1 channel of optical image and 4 channels of thermal infrared images, for further processing as described below.
As shown in figs. 4-1 and 4-2, after the auxiliary manipulator 6 places the 3C transparent member 12 to be detected on the transparent stage 13, it sends a completion signal to the motion control module 3, which forwards a message to the computer 17 over the CAN bus. The computer 17 switches the parallel light source 11 on, then triggers the imaging sensor 16 and receives the first imaging result. If, after evaluation and analysis, the imaging quality is judged too low, the computer controls the optical lens 15 to refocus and repeats the digital imaging analysis; after several acquisitions, the computer 17 sends the best imaging photo of the 3C transparent member back to the motion control module 3 over the CAN bus.
After the optical detection module 1 finishes the current detection task, the computer 17 sends a detection completion signal of the link to the motion control module 3 through a CAN bus, the computer 17 controls the light of the parallel light source 11 to be in a closed state, and the computer 17 controls the optical lens 15 to be in a focus position set by user parameters; the motion control module 3 sends a task of removing the optical detection module 1 to-be-detected 3C transparent component 12 to the auxiliary manipulator 6 through a CAN bus, and the auxiliary manipulator 6 carries the to-be-detected 3C transparent component 12 from the optical detection module 1 to the thermal infrared detection module 2; after the transportation is successful, the result is notified to the motion control module 3 through the CAN bus, and the motion control module 3 notifies the optical detection module 1 of the result.
As shown in figs. 6-1 and 6-2, after the auxiliary manipulator 6 places the 3C transparent member 12 to be detected in the working area of the electromagnetic clamp 21, it sends a completion signal to the motion control module 3, which passes the defect-position information preliminarily obtained by the optical detection module 1, together with the manipulator in-place message, to the computer 17 over the CAN bus. The computer 17 directs the drive control module 24 to move the nitrogen spray gun 22, driven by the stepping motor, below the defect position, and simultaneously to make the electromagnetic clamp 21 grip the edge of the member 12. The drive control module 24 then opens the valve of the nitrogen spray gun nearest the defect position, injecting high-temperature nitrogen at the defect area and rapidly heating the member 12. Next, the drive control module 24 commands the two super-depth-of-field infrared imaging modules 25 to record video of the heat-conduction process, capturing the heat-conduction dynamics near the defect area. Finally, the computer 17 sends the video data back to the motion control module 3 over the CAN bus.
After the thermal infrared detection module 2 finishes the current detection task, the computer 17 sends a detection-completion signal for this stage to the motion control module 3 over the CAN bus, and directs the driving control module 24 to stop the image acquisition of the two super-depth-of-field infrared imaging modules 25; the auxiliary manipulator 6 grasps the 3C transparent member 12 to be detected with a suction cup and informs the motion control module 3 that it is in place. Then, after the driving control module 24 receives the grasping-in-place information over the communication link, it releases the pair of electromagnetic clamps 21 and informs the auxiliary manipulator 6 over the communication link, and the 3C transparent member 12 to be detected is carried from the thermal infrared detection module 2 to the outlet of the workbench for classification. Once the transfer succeeds, the result is reported to the motion control module 3 over the CAN bus, and the motion control module 3 notifies the thermal infrared detection module 2 of the result; the computer 17 directs the driving control module 24 to move the nitrogen spray gun 22 back to its default position near the axis of symmetry of the 3C transparent member 12 to be detected.
Figs. 7-1, 7-2, and 7-3 are schematic views of thermal infrared imaging of the region to be detected of the 3C transparent member of the present invention, wherein A is a normal region. When the 3C transparent member 12 to be detected is rapidly heated by the nitrogen spray gun 22 and a bubble defect is present inside it, as shown at B in fig. 7-1, the gas in the bubble differs in density from the base material of the 3C transparent member 12 to be detected, and a physical heat-transfer boundary condition exists where the bubble contacts the substrate; the temperature rise at the bubble therefore differs from that of the surrounding normal base material, so the image colors differ under the imaging of the super-depth-of-field infrared imaging modules 25, enabling accurate identification of the defect. As shown in fig. 7-2, when a crack defect C exists inside the 3C transparent member 12 to be detected, the two sides of the crack are normal substrate material while the crack itself is a heat-transfer boundary, so the temperature of the crack differs from that of the substrate; the image colors again differ under the imaging of the super-depth-of-field infrared imaging modules 25, enabling accurate identification of the defect. As shown in fig. 7-3, when the 3C transparent member 12 to be detected is free of defects, the temperature change over the entire thermal infrared imaging area is insignificant, and there is no significant difference in image color under the imaging of the super-depth-of-field infrared imaging modules 25.
In addition, the invention also discloses an optical and thermal infrared multi-stage imaging detection method for defects of a transparent member, which comprises the following steps:
performing optical imaging on the 3C transparent member, acquiring an optical image, and preliminarily determining the position and size of any defect of the 3C transparent member;
heating the defect position of the 3C transparent member with nitrogen, and performing thermal infrared imaging to obtain a thermal infrared image;
fusing the optical image and the thermal infrared image, and recognizing the defect type of the 3C transparent member through deep learning, wherein the defect types comprise normal, crack, bubble, scratch, and edge chipping.
A specific example is as follows:
S1, performing thermal infrared imaging on the 3C transparent member with the two super-depth-of-field infrared modules, sampling the two resulting imaging videos at intervals, and extracting 7 frames of 60×40 pixels from each, yielding 2 video segments of 7 frames at 60×40 pixels, namely a first infrared video and a second infrared video;
S2, extracting data features from the first infrared video and the second infrared video with a first convolutional neural network (3D-CNN) model, and calculating the probabilities that each belongs to the different defect types, wherein the defect types comprise normal, crack, bubble, scratch, and edge chipping;
S3, compressing the optical image to 60×40 pixels, and randomly extracting 2 frames, each 60×40 pixels, from each of the imaging videos of the two super-depth-of-field infrared imaging modules (4 infrared images in total), thus obtaining 5 images of 60×40 pixels;
S4, extracting data features from the 5 images with a second convolutional neural network (2D-CNN) model, and calculating the probabilities of the different defect categories;
S5, fusing the defect-type probabilities calculated from the video and image features in steps S2 and S4, and finally calculating the defect type of the current 3C transparent member to be detected.
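The interval sampling of step S1 can be sketched in a few lines of NumPy. This is only an illustration under assumptions the patent does not state: the sampling positions are taken as evenly spaced, and the array names (`video_a`, `video_b`, `sample_frames`) are hypothetical.

```python
import numpy as np

def sample_frames(video, n_frames=7):
    """Interval-sample n_frames from a video array of shape (T, H, W).

    Evenly spaced indices are an assumption; the patent only specifies
    that 7 frames of 60x40 pixels are extracted from each video.
    """
    t = video.shape[0]
    idx = np.linspace(0, t - 1, n_frames).astype(int)  # 7 spaced frame indices
    return video[idx]

# Two imaging videos of 100 raw frames, each frame 40 rows x 60 columns
video_a = np.zeros((100, 40, 60))
video_b = np.zeros((100, 40, 60))
first_ir = sample_frames(video_a)   # first infrared video: 7 frames
second_ir = sample_frames(video_b)  # second infrared video: 7 frames
```

Each resulting segment then has shape (7, 40, 60), matching the "2 video segments of 7 frames at 60×40 pixels" described in step S1.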
The fusion calculation is performed as follows:
P_A = (P_V1 + P_V2 + P_O + P_T1 + P_T2 + P_T3 + P_T4) / 7
wherein P_A is the fused predicted defect probability; P_V1 and P_V2 are the predicted defect probabilities of the first infrared video and the second infrared video, respectively; P_O is the predicted defect probability of the optical image; and P_T1, P_T2, P_T3, P_T4 are the predicted defect probabilities of the 4 infrared images, respectively;
the defect type is finally determined according to the following formula:
Y = argmax(P_A)
where Y is the finally calculated defect type of the current 3C transparent member to be detected.
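The fuse-then-argmax decision above can be sketched as follows. Equal-weight averaging of the seven probability vectors is an assumption (the formula image in the original filing is not reproduced in the text), and the class names and variable names are illustrative only.

```python
import numpy as np

# Defect categories recognized by the device, per the description
CLASSES = ["normal", "crack", "bubble", "scratch", "chipped_edge"]

def fuse_and_classify(p_v1, p_v2, p_o, p_t):
    """Average the seven per-source class-probability vectors, take argmax.

    p_v1, p_v2: probabilities from the two infrared videos (3D-CNN)
    p_o:        probabilities from the optical image (2D-CNN)
    p_t:        4 rows of probabilities from the 4 infrared frames (2D-CNN)
    """
    p_a = (p_v1 + p_v2 + p_o + np.sum(p_t, axis=0)) / 7.0
    return CLASSES[int(np.argmax(p_a))], p_a

# Example inputs: most sources assign the highest probability to "bubble"
p_v1 = np.array([0.1, 0.1, 0.6, 0.1, 0.1])
p_v2 = np.array([0.2, 0.1, 0.5, 0.1, 0.1])
p_o  = np.array([0.2, 0.2, 0.4, 0.1, 0.1])
p_t  = np.array([[0.1, 0.1, 0.6, 0.1, 0.1]] * 4)
label, p_a = fuse_and_classify(p_v1, p_v2, p_o, p_t)  # label == "bubble"
```

Because each input vector sums to 1, the fused vector P_A also sums to 1, so the argmax is a valid maximum-probability decision.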
The first convolutional neural network (3D-CNN) comprises 1 hard-wired layer H1, 3 convolutional layers C2, C4, and C6, 2 downsampling layers S3 and S5, 1 fully connected layer FC, and 1 soft regression layer SR. Each cube convolved by a 3D convolution kernel spans 7 consecutive frames, each frame being 60×40 pixels. The hard-wired layer processes the input first infrared video and second infrared video, extracting five channels of information from each frame: gray scale, gradients in the x and y directions, and optical flow in the x and y directions; the first three (gray scale and the x- and y-direction gradients) are computed per frame, while the x- and y-direction optical flows are computed from two consecutive frames, giving 33 feature maps in total. The 3 convolutional layers are obtained by convolving the preceding layer's data with convolution kernels of 7×7×3, 7×6×3, and 7×4, respectively; the 2 downsampling layers are obtained by 2×2 pooling and 3×3 pooling, respectively. Finally, a 128-dimensional vector is obtained through the fully connected layer, which is connected to the soft regression layer to obtain the probabilities of the different defect types.
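The feature-map count of the hard-wired layer follows directly from the channel bookkeeping described above, which the short calculation below makes explicit (variable names are illustrative only):

```python
# Gray scale and the x/y gradients exist for every one of the 7 frames;
# x/y optical flow needs two consecutive frames, so only 6 maps each.
frames = 7
per_frame_channels = 3      # gray scale, gradient-x, gradient-y
between_frame_channels = 2  # optical-flow-x, optical-flow-y
total = frames * per_frame_channels + (frames - 1) * between_frame_channels
print(total)  # 7*3 + 6*2 = 33 feature maps, matching the text
```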
The second convolutional neural network (2D-CNN) comprises 2 convolutional layers (a first and a second convolutional layer), 2 downsampling layers (a first and a second downsampling layer), 1 fully connected layer, and 1 soft regression layer. The 2 convolutional layers are obtained by convolving the preceding layer's data with a 2×2 convolution kernel; the 2 downsampling layers are obtained by 2×2 pooling. Finally, a 128-dimensional vector is obtained through the fully connected layer, which is connected to the soft regression layer to obtain the probabilities of the different defect types. The 1 optical image of 60×40 pixels and the 4 thermal infrared images are input into the first convolutional layer of the second convolutional neural network model, and the defect category identification probabilities of the 5 images are obtained from its soft regression layer.
The sample library relied on by the deep learning module 5 resides in the motion control module 3 (for example, in an internal flash memory chip), and the motion control module 3 can update the corresponding deep convolutional neural network parameters in the background; the manufacturer or the user can add samples to the offline training sample library of the deep convolutional neural network. Accordingly, the detection coverage for 3C transparent members can be expanded or reduced according to the actual sample situation, improving the detection accuracy for transparent members of a specific specification and model.
The deep convolutional neural network can be trained and updated by the user during use, or updated periodically by the device manufacturer; the device supports multiple versions of the deep convolutional neural network, and the end user can select among them according to the actual application scenario.
Although the present invention has been described in detail with reference to the embodiments, it will be apparent to those skilled in the art that modifications, equivalent substitutions, and improvements can still be made to the technical solutions of the foregoing embodiments or to some of their technical features; all such modifications, equivalents, and improvements fall within the spirit and principle of the present invention.

Claims (3)

1. The optical and thermal infrared multi-stage imaging detection method for the defects of the transparent member comprises the following steps:
carrying out optical imaging on the 3C transparent component, acquiring an optical image, and preliminarily judging the defect position and size of the 3C transparent component;
heating the defect position of the 3C transparent member by using nitrogen, and carrying out thermal infrared imaging to obtain a thermal infrared image;
the optical image and the thermal infrared image are subjected to fusion processing, and the defect type of the 3C transparent component is recognized through deep learning;
the fusion treatment specifically comprises the following steps:
performing thermal infrared imaging on the 3C transparent member by adopting two super-depth-of-field infrared modules, sampling the two obtained imaging videos at intervals, and extracting 7 frames of 60×40 pixels from each, to obtain 2 video segments of 7 frames at 60×40 pixels, which are respectively a first infrared video and a second infrared video;
performing data feature extraction on the first infrared video and the second infrared video by using a first convolution neural network 3D-CNN model, and calculating the probability of the first infrared video and the second infrared video belonging to different defect types, wherein the defect types comprise normal, cracks, bubbles, scratches and edge breakage;
compressing the optical imaging image to 60×40 pixels, and randomly extracting 2 frames, each 60×40 pixels, from each of the imaging videos of the two super-depth-of-field infrared imaging modules to obtain 4 infrared images, so as to obtain 5 images of 60×40 pixels in total;
performing data feature extraction on the 5 pictures by using a second convolutional neural network 2D-CNN model, and calculating the probability of belonging to different defect categories;
fusing the defect-type probabilities calculated for the first infrared video and the second infrared video with the defect-type probabilities calculated for the 5 images, according to the following formula:
P_A = (P_V1 + P_V2 + P_O + P_T1 + P_T2 + P_T3 + P_T4) / 7
wherein P_A is the fused predicted defect probability; P_V1 and P_V2 are the predicted defect probabilities of the first infrared video and the second infrared video, respectively; P_O is the predicted defect probability of the optical image; and P_T1, P_T2, P_T3, P_T4 are the predicted defect probabilities of the 4 infrared images, respectively;
the defect type is finally determined according to the following formula:
Y = argmax(P_A)
where Y is the finally calculated defect type of the current 3C transparent member to be detected.
2. The method for optical and thermal infrared multi-stage imaging detection of transparent member defects according to claim 1, wherein the first convolutional neural network 3D-CNN comprises 1 hard-wired layer H1, 3 convolutional layers C2, C4 and C6, 2 downsampling layers S3 and S5, 1 fully connected layer FC, and 1 soft regression layer SR;
each cube convolved by the 3D convolution kernel spans 7 consecutive frames, each frame being 60×40 pixels; the hard-wired layer processes the input first infrared video and second infrared video, extracting five channels of information from each frame: gray scale, gradients in the x and y directions, and optical flow in the x and y directions; the first three (gray scale and the x- and y-direction gradients) are calculated per frame, and the x- and y-direction optical flows are calculated from two consecutive frames, giving 33 feature maps in total;
the 3 convolutional layers are obtained by convolving the preceding layer's data with convolution kernels of 7×7×3, 7×6×3 and 7×4, respectively;
the 2 downsampling layers are obtained by 2×2 pooling and 3×3 pooling, respectively;
finally, a 128-dimensional vector is obtained through the fully connected layer, which is connected to the soft regression layer to obtain the probabilities of the different defect types.
3. The method of claim 2, wherein the second convolutional neural network comprises 2 convolutional layers respectively being a first convolutional layer and a second convolutional layer, 2 downsampling layers respectively being a first downsampling layer and a second downsampling layer, 1 fully-connected layer and 1 soft regression layer;
the 2 convolutional layers are obtained by convolving the preceding layer's data with a 2×2 convolution kernel; the 2 downsampling layers are obtained by 2×2 pooling;
finally, a 128-dimensional vector is obtained through the fully connected layer, which is connected to the soft regression layer to obtain the probabilities of the different defect types;
the 1 optical image of 60×40 pixels and the 4 thermal infrared images are input into the first convolutional layer of the second convolutional neural network model, and the defect category identification probabilities of the 5 images are obtained from the soft regression layer of the second convolutional neural network model.
CN201911054181.5A 2019-10-31 2019-10-31 Optical and thermal infrared multi-stage imaging detection method and device for defects of transparent component Active CN110827256B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911054181.5A CN110827256B (en) 2019-10-31 2019-10-31 Optical and thermal infrared multi-stage imaging detection method and device for defects of transparent component


Publications (2)

Publication Number Publication Date
CN110827256A CN110827256A (en) 2020-02-21
CN110827256B true CN110827256B (en) 2022-04-26

Family

ID=69551837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911054181.5A Active CN110827256B (en) 2019-10-31 2019-10-31 Optical and thermal infrared multi-stage imaging detection method and device for defects of transparent component

Country Status (1)

Country Link
CN (1) CN110827256B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111696075A (en) * 2020-04-30 2020-09-22 航天图景(北京)科技有限公司 Intelligent fan blade defect detection method based on double-spectrum image
JP7476057B2 (en) 2020-09-11 2024-04-30 キオクシア株式会社 Defect Inspection Equipment
CN112730454B (en) * 2020-12-23 2024-07-16 中国人民解放军空军工程大学 Intelligent damage detection method for composite material based on fusion of optical, infrared thermal wave and ultrasonic wave
CN112757747B (en) * 2020-12-30 2022-07-22 广东华中科技大学工业技术研究院 Special-shaped glass part film pasting device and method based on thermal spraying gas infrared imaging
CN113111946A (en) * 2021-04-15 2021-07-13 宁波九纵智能科技有限公司 Quality control method and system integrating hands, eyes and brains
US20230052634A1 (en) * 2021-05-28 2023-02-16 Wichita State University Joint autonomous repair verification and inspection system
CN115965571B (en) * 2022-04-28 2023-08-22 锋睿领创(珠海)科技有限公司 Multi-source information fusion detection and model training method and medium for incremental autonomous learning
CN116843615B (en) * 2023-05-16 2024-04-12 西安邮电大学 Lead frame intelligent total inspection method based on flexible light path

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102129564A (en) * 2011-02-14 2011-07-20 西南交通大学 Contact network failure detection and diagnosis method based on unmanned aerial vehicle
CN105466945A (en) * 2015-12-30 2016-04-06 深圳市创科自动化控制技术有限公司 Infrared detecting method for automatically locating detecting position and detecting equipment
CN109115805A (en) * 2018-10-25 2019-01-01 广东华中科技大学工业技术研究院 Transparent component defect detecting device and method based on ultrasound and the double imagings of optics
CN209148563U (en) * 2018-10-25 2019-07-23 广东华中科技大学工业技术研究院 A kind of double imaging type transparent component defect detecting devices

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2815691C (en) * 2010-11-30 2020-09-15 Bloom Energy Corporation Non-destructive testing methods for fuel cell interconnect manufacturing
JP2016057187A (en) * 2014-09-10 2016-04-21 株式会社東芝 Analyzer


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Metal Surface Defect Detection System Based on Semiconductor Laser and Infrared Thermal Imaging";Zhijie Zhang 等;《2019 IEEE International Instrumentation and Measurement Technology Conference (I2MTC)》;20190909;全文 *
"基于可见光-红外光图像融合的苹果缺陷检测算法";陈乾辉 等;《食品与机械》;20180930;第34卷(第9期);135-138 *


Similar Documents

Publication Publication Date Title
CN110827256B (en) Optical and thermal infrared multi-stage imaging detection method and device for defects of transparent component
CN108765416B (en) PCB surface defect detection method and device based on rapid geometric alignment
CN103175847B (en) Grating surface defect detecting device
CN112881427B (en) Electronic component defect detection device and method based on visible light and infrared thermal imaging
CN114529510B (en) Automatic detection and classification method for cathode copper on-line quality
CN113267452A (en) Engine cylinder surface defect detection method and system based on machine vision
CN104483320A (en) Digitized defect detection device and detection method of industrial denitration catalyst
CN114881987B (en) Hot-pressing light guide plate defect visual detection method based on improvement YOLOv5
CN114280075B (en) Online visual detection system and detection method for surface defects of pipe parts
CN114113129B (en) Lens micro defect recognition and grabbing system and method
CN112033971A (en) Visual flaw detection system and method
CN102879404A (en) System for automatically detecting medical capsule defects in industrial structure scene
CN112630230A (en) Online surface defect detection method based on photometric stereo method
CN111505013A (en) Warm edge spacer machine vision detection device and method based on deep learning
CN115423785A (en) Defect detection system, method and device, electronic equipment and storage medium
CN106645185A (en) Method and device for intelligently detecting surface quality of industrial parts
CN110376475B (en) Device and method for rapidly detecting line defects on glass surface
CN117085969B (en) Artificial intelligence industrial vision detection method, device, equipment and storage medium
CN206146851U (en) Intellectual detection system industrial part surface quality's device
CN110838107B (en) Method and device for intelligently detecting defects of 3C transparent component by variable-angle optical video
CN106370673A (en) Automatic lens flaw detection method
CN112730442A (en) Automatic online product surface defect detection device and system based on machine vision
CN108375588A (en) A kind of online checking system of opaque bottle
CN117761060A (en) Visual detection system and detection method thereof
CN107248151A (en) A kind of LCD panel intelligent detecting method and system based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant