CN117455917A - Establishment of false alarm library of etched lead frame and false alarm on-line judging and screening method - Google Patents


Info

Publication number
CN117455917A
Authority
CN
China
Prior art keywords
image
images
library
false alarm
false positive
Prior art date
Legal status
Granted
Application number
CN202311788551.4A
Other languages
Chinese (zh)
Other versions
CN117455917B (en)
Inventor
肖波
肖贤军
朱为
王威
吴韬
何礼强
谢昌锋
涂丹
王林泉
文欢梅
Current Assignee
Shenzhen Ruiyang Jingshi Technology Co ltd
Original Assignee
Shenzhen Ruiyang Jingshi Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Ruiyang Jingshi Technology Co ltd
Priority to CN202311788551.4A
Publication of CN117455917A
Application granted
Publication of CN117455917B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/90 Determination of colour characteristics
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using neural networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for establishing a false alarm library of an etched lead frame and judging and screening false alarms online, which comprises the following process steps: S1, pre-training a network model; S2, modeling a product; S3, library-building detection image acquisition and unit image division; S4, matching the library-building unit images and extracting defect images; S5, screening false alarm images; S6, false positive image scaling; S7, false positive image conversion; S8, false alarm image normalization operation; S9, establishing a false positive image feature library; S10, online image-to-be-detected acquisition and unit image division; S11, detection unit image matching and defect image extraction; S12, scaling candidate defect images; S13, converting candidate defect images; S14, candidate defect image normalization operation and feature vector extraction; S15, calculating the characteristic distance; S16, judging and screening false alarm images; S17, online false alarm screening. The invention realizes online, automatic, efficient and high-precision screening of false alarm images and improves the detection efficiency and accuracy of the lead frame.

Description

Establishment of false alarm library of etched lead frame and false alarm on-line judging and screening method
Technical Field
The invention relates to the field of algorithms, and in particular to a method for establishing a false alarm library of an etched lead frame and judging and screening false alarms online.
Background
The lead frame is the carrier of the integrated circuit chip in the semiconductor industry; it is used extremely widely, and its quality directly influences the performance and service life of the final semiconductor product. The semiconductor industry belongs to high-precision manufacturing and imposes strict quality requirements on lead frames. Quality control of lead frames is completed through a lead frame inspection procedure, in which lead frames with defects such as dirty points and scratches are screened out and removed.
In the lead frame detection procedure, imaging is performed with a high-resolution camera, and a tiny defect occupies an imaging area of only a dozen or so pixels. The current mainstream detection method is template matching, in which the product image shot in real time is compared with a standard template image. Owing to the process characteristics of the product, various disturbances degrade the consistency of product imaging and cause false alarms of flaws and defects: fluctuations in product dimensional consistency, fluctuations in the complicated background texture of the product, fluctuations of the imaging system and the like all produce differences between the real-time product image and the standard template. Therefore, when an image detection method is used to screen out defective products, these factors cause a large number of false positives, so that defect-free products are falsely reported as defective. An efficient and accurate processing method is therefore needed to filter out the large number of false positives that arise in automatic detection.
The existing method for eliminating false alarms is to manually re-judge the machine inspection results and pick out the false alarms by eye. This is time-consuming, labor-intensive and inefficient; it also requires the re-judging personnel to meet a certain technical threshold and to be familiar with real defects and false alarm regions, and they must maintain a high level of attention throughout the re-judging process to achieve a good false alarm elimination effect.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a method for establishing a false alarm library of an etched lead frame and judging and screening false alarms online. A network model is first pre-trained on captured lead frame images to obtain a feature network of the lead frame; a limited number of library-building detection images are then captured, candidate defect images are screened out by comparing unit-image gray value differences, and false alarm images are selected from them; the false alarm images are input into the feature network of the lead frame, converted into feature vectors and stored to establish a false alarm feature library; after that, candidate defect images are screened out of the product images to be detected by the same unit-image gray value comparison and converted into feature vectors, their feature distances to the feature vectors in the false alarm feature library are calculated one by one, and whether an image is a false alarm image is judged according to the feature distance. The method thereby realizes online, automatic, efficient and high-precision screening of false alarm images and improves the detection efficiency and accuracy of the lead frame.
The technical scheme adopted by the invention is as follows: a method for establishing a false alarm library of an etched lead frame and judging and screening false alarms online, comprising the following steps:
s1, pre-training a network model: collecting a defect picture and a false alarm picture of the etched lead frame; after classifying and marking the collected defect pictures and false alarm pictures, training through a feature extraction network according to classification and marking categories to obtain a feature network of the etched lead frame;
s2, modeling a product: collecting a defect-free etched lead frame picture serving as a template picture through an optical camera platform; dividing the template map into X template unit images with the same area;
s3, library building detection image acquisition and unit image division: shooting and collecting at least two etched lead frame pictures as library building detection images, and dividing each collected library building detection image into X library building unit images of the same area as the template unit images in the step S2;
s4, matching the library building unit images and extracting defect images: analyzing and matching the library building unit image in the step S3 and the template unit image in the step S2 one by one, taking the pixel gray value of the unit image as the matching standard, and marking the library building unit image as an abnormal unit image when the pixel gray difference value between the library building unit image and the template unit image is larger than the difference threshold T1; when the abnormal unit image area counted after the library building detection image has been matched is larger than the abnormal threshold T2, extracting that library building detection image as a candidate defect image;
S5, screening false alarm images: repeating the steps S3 to S4 until all the library-building detection images are matched, extracting candidate defect images in all the library-building detection images, screening the candidate defect images one by one according to the existence of defects in the images, and screening the defect-free images as false alarm images;
s6, false positive image scaling: scaling the false positive images screened in the step S5 to 128 x 128 pixel size images one by one;
s7, false positive image conversion: inputting the scaled false alarm image in the step S6 into the feature network of the etched lead frame trained in the step S1, and outputting a high-dimensional feature vector a of the false alarm image; the false positive image high-dimensional feature vector a is a vector array comprising 128 dimensions;
s8, false alarm image normalization operation: carrying out a normalization operation on the false positive image high-dimensional feature vector a output in the step S7 to obtain a false positive image feature vector a; the false positive image feature vector a is a vector array with 128 dimensions; the normalization operation formula is a_i' = a_i / |a|, wherein a_i is the i-th vector value in the 128-dimensional vector array of the false positive image high-dimensional feature vector a, and |a| is the modulus of the vector a, namely |a| = sqrt(a_1^2 + a_2^2 + ... + a_128^2), the square root of the sum of the squares of the components of a;
S9, establishing a false positive image feature library: storing the false positive image feature vector a obtained by normalization operation in the step S8 to form a false positive image feature library;
s10, on-line image-to-be-detected acquisition and unit image division: shooting and collecting an image to be detected of the etched lead frame, and dividing the collected image to be detected into X detection unit images of the same area as the template unit images in the step S2;
s11, detection unit image matching and defect image extraction: analyzing and matching the detection unit image in the step S10 and the template unit image in the step S2 one by one, taking the pixel gray value of the unit image as the matching standard, and marking the detection unit image as an abnormal unit image when the pixel gray difference value between the detection unit image and the template unit image is larger than the difference threshold T1; when the abnormal unit image area counted after the image to be detected has been matched is larger than the abnormal threshold T2, extracting that image to be detected as a candidate defect image;
s12, scaling candidate defect images: scaling the candidate defect image screened in the step S11 to an image with the size of 128 x 128 pixels;
s13, converting candidate defect images: inputting the scaled candidate defect image in the step S12 into the feature network of the etched lead frame trained in the step S1, and outputting a candidate image high-dimensional feature vector b; the candidate image high-dimensional feature vector b is a vector array comprising 128 dimensions;
S14, candidate defect image normalization operation and feature vector extraction: carrying out a normalization operation on the candidate image high-dimensional feature vector b output in the step S13 to obtain a candidate image feature vector b; the candidate image feature vector b is a vector array with 128 dimensions; the normalization operation formula is b_i' = b_i / |b|, wherein b_i is the i-th vector value in the 128-dimensional vector array of the candidate image high-dimensional feature vector b, and |b| is the modulus of the vector b, namely |b| = sqrt(b_1^2 + b_2^2 + ... + b_128^2);
s15, calculating the characteristic distance: searching and comparing the candidate image feature vector b obtained after the normalization operation in the step S14 with the plurality of false positive image feature vectors a in the false positive feature library established in the step S9 one by one, calculating the feature distance between the candidate image feature vector b and each false positive image feature vector a by using the L2-norm distance formula, and taking the minimum distance value as the final evaluation value L of the image to be detected against the false positive image library; the L2-norm distance formula is L = sqrt( (a_1 - b_1)^2 + (a_2 - b_2)^2 + ... + (a_n - b_n)^2 ), wherein a represents a false positive image feature vector, b represents a candidate image feature vector, n = 128, and a_i and b_i represent the vector values of the i-th dimension of the features a and b respectively;
S16, false positive image judgment and screening: comparing the evaluation value L obtained in the step S15 with the false alarm screening threshold value T3; when L is less than or equal to T3, the image to be detected is judged to be a false alarm image, otherwise it is judged to be a defect image;
s17, online false alarm screening: repeating the steps S10 to S16, comparing the online images to be detected with the template through gray values to screen out candidate defect images, calculating the feature distances between the candidate image feature vector b of each candidate defect image and the plurality of false positive image feature vectors a in the false positive image library through the L2-norm distance formula, and comparing the selected evaluation value L with the threshold T3, thereby judging and screening online whether the candidate image is a false positive image.
Preferably, in the step S1, the defect image type includes a dirty image and a scratch image; the false positive image category comprises texture false positive images and edge false positive images.
Preferably, the feature extraction network in the step S1 is a PPLCNetV2 network; the network model pre-training step is to scale the acquired picture to an image with the pixel size of 128 x 128, input the image into a PPLCNetV2 network, and convert the image into a high-dimensional feature vector which is a vector data set comprising 128-dimensional features through the PPLCNetV2 network.
Preferably, the camera resolutions of the template image, the library-building detection image and the image to be detected in the step S2, the step S3 and the step S10 are the same.
Preferably, in the step S2, the step S3 and the step S10, the number X of unit images into which the template image, the library-building detection images and the online image to be detected are respectively divided ranges from 280 to 360.
Preferably, when the library-building unit image is matched against the template unit image based on pixel gray values in the step S4, the closer the pixel gray values of the two are, the smaller their pixel gray difference value is, and conversely the larger the difference value is; when the pixel gray difference value of a library-building unit image is larger than the set difference threshold T1, the library-building unit image is judged to be a difference image. Likewise, when the detection unit image is matched against the template unit image based on pixel gray values in the step S11, the closer the pixel gray values of the two are, the smaller their pixel gray difference value is, and conversely the larger the difference value is; when the pixel gray difference value of a detection unit image is larger than the set difference threshold T1, the detection unit image is judged to be a difference image. The difference threshold T1 is the critical pixel gray difference value between a library-building unit image or detection unit image and the template unit image; the value range of the difference threshold T1 is 15 to 35.
Preferably, the abnormal threshold T2 in the step S4 and the step S11 is the area threshold of difference images within a single library-building detection image or a single image to be detected; when the difference-image area of the library-building unit images or detection unit images in a single library-building detection image or single image to be detected is greater than T2, that library-building detection image or image to be detected is determined to be a candidate defect image; the value range of the abnormal threshold T2 is 15 to 25 pixel areas.
Preferably, in the step S15, the range value of the feature distance between the candidate image feature vector b and the plurality of false positive image feature vectors a is 0 to 1, which is calculated by the L2-norm distance formula through the candidate image feature vector b and the false positive image feature vector a in the false positive image library.
Preferably, the false positive screening threshold T3 in the step S16 is the critical determination value of the feature distance between the candidate image feature vector b and a false positive image feature vector a calculated by the L2-norm distance formula; the smaller that feature distance is, the closer the candidate image is to the false positive image; when the selected evaluation value L representing the minimum feature distance is smaller than T3, the candidate defect image is judged to be a false positive image, otherwise it is judged to be a defect image.
Preferably, the false positive screening threshold T3 has a range of values from 0.05 to 0.15.
The invention has the beneficial effects that:
the invention designs a method for establishing and on-line judging and screening the false alarm library of the etched lead frame, which aims at the defects and defects existing in the prior art, and designs the method for establishing and on-line judging and screening the false alarm library of the etched lead frame, wherein the method comprises the steps of firstly, carrying out network model pre-training by shooting a lead frame image to obtain a characteristic network of the lead frame, then shooting a limited library-building detection image, screening candidate defect images by unit image gray value difference contrast, screening false alarm images, converting the characteristic network of the false alarm images into characteristic vectors, storing and establishing a false alarm feature library, converting the product image to be detected into the characteristic vectors by the unit image gray value difference contrast, carrying out characteristic distance calculation with the characteristic vectors in the false alarm feature library one by one, and judging whether the characteristic distance is the false alarm image according to the characteristic distance, thereby realizing on-line automatic, efficient and high-precision screening of the false alarm image, and improving the detection efficiency and accuracy of the lead frame.
The invention is applied to the defect detection field of the lead frame, and has the functions of solving the false alarm problem in the defect detection process of the lead frame, realizing online automatic high-precision false alarm screening, reducing the workload of manual re-judgment and improving the detection efficiency. Specifically, the method integrally comprises four steps of feature network pre-training of a lead frame, product modeling, establishment of a false alarm feature library and online false alarm screening.
The feature network pre-training aims at acquiring limited lead frame images through pre-shooting, inputting the lead frame images into a PPLCNetV2 network after the lead frame images are scaled to 128 x 128 pixel sizes, and converting image information into 128-dimensional high-dimensional feature vectors (namely, data groups comprising 128 vector values) through the PPLCNetV2 network, so that a network model suitable for the lead frame is built through pre-training.
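The following is an illustrative sketch (not the patented implementation) of the feature-extraction interface described above, written in Python with OpenCV and NumPy. The function names image_to_feature and feature_net are assumptions, and the fixed random projection only stands in for the forward pass of the pre-trained PPLCNetV2 backbone so that the example runs end to end.

```python
import cv2
import numpy as np

# Stand-in for the trained 128-dimensional feature network of the lead frame.
# In the patent this is the pre-trained PPLCNetV2 model; a fixed random
# projection is used here only so the sketch is self-contained and runnable.
_rng = np.random.default_rng(0)
_projection = _rng.standard_normal((128, 128 * 128)).astype(np.float32)

def feature_net(gray_128: np.ndarray) -> np.ndarray:
    """Map a 128 x 128 grayscale image to a 128-dimensional feature vector."""
    x = gray_128.astype(np.float32).ravel() / 255.0
    return _projection @ x

def image_to_feature(path: str) -> np.ndarray:
    """Scale an acquired picture to 128 x 128 pixels and extract its feature
    vector (the operation of steps S6-S7 / S12-S13)."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.resize(img, (128, 128))
    return feature_net(img)
```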
The purpose of product modeling is to provide the reference standard for the pixel gray value comparison analysis, both when establishing the false alarm feature library and during online false alarm screening, so that candidate defect images can be screened preliminarily; the candidate defect images obtained from the pixel gray value comparison comprise actual defect images and false alarm images (images that are actually defect-free). In the product modeling process, one defect-free lead frame picture is shot and collected as the template image, and the template image is divided according to its image pixels into a plurality of template unit images of the same area (the number of template unit images can be set to 40 rows by 8 columns when the camera group shoots at 5 million pixels; the specific division number can be chosen comprehensively according to the image pixels, the detection precision requirements and the analysis and matching workload), so that the product modeling is completed.
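A minimal sketch of the unit division used in product modeling (steps S2/S3/S10), assuming a NumPy grayscale image and the 40-row by 8-column division mentioned above; any remainder pixels at the image borders are simply dropped in this simplified version, and the function name split_into_units is illustrative.

```python
import numpy as np

def split_into_units(image: np.ndarray, rows: int = 40, cols: int = 8) -> list:
    """Divide a template, library-building or to-be-detected image into
    rows * cols unit images of equal area (40 x 8 = 320 units for a 5 MP frame)."""
    h, w = image.shape[:2]
    uh, uw = h // rows, w // cols          # unit height and width in pixels
    units = []
    for r in range(rows):
        for c in range(cols):
            units.append(image[r * uh:(r + 1) * uh, c * uw:(c + 1) * uw])
    return units
```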
The purpose of establishing the false positive feature library is to build a library of feature vectors corresponding to false positive images that already exist, to serve as the reference object in the feature distance judgment: during online detection, the feature vector of the product image shot in real time is substituted, one by one with the feature vectors in the false positive feature library, into the L2-norm distance calculation formula, and whether the image is a false positive image is judged after the feature distance has been calculated. First, false positive images that may occur in the actual detection process (such as texture false positive images, edge false positive images and the like) are shot and collected; the number of images is limited and is selected according to the detection requirements, for example 10 images. The images are divided into unit images in the same way as in the product modeling process, and the unit images are compared and matched one by one with the template unit images according to pixel gray values: the closer the pixel gray values of the two are, the smaller their pixel gray difference value is, and conversely the larger the difference value is. When the pixel gray difference value of a unit image is larger than the set difference threshold T1 (the difference threshold T1 is a value determined from multiple field experiments and pixel matching experience, with a value range of 25 to 35), the unit image is judged to be a difference image; and when the difference area of the unit images in a single library-building detection image is larger than the set abnormal threshold T2 (the abnormal threshold T2 is determined from multiple field experiments, the number of image pixels, the unit division and other factors, with a value range of 15 to 25 pixel areas), that library-building detection image is judged to be a candidate defect image. Images without flaw defects, namely the false alarm images, are then screened out of the candidate defect images by manual re-judgment (the number of library-building detection images is limited, so the manual re-judging workload is limited). Each screened false positive image is scaled to 128 x 128 pixels and input into the network model of the lead frame, which converts the image information into a 128-dimensional high-dimensional feature vector (namely a data group comprising 128 vector values). If the high-dimensional feature vector were not normalized, the feature distance calculated by the L2-norm distance formula could range from 0 to infinity; such a distance value is uncontrollable and cannot be used as a judgment standard. The vector values of the high-dimensional feature vector are therefore normalized to values in the range of -1 to 1, giving the false positive image feature vector (namely a data set comprising 128 vector values, each in the range of -1 to 1), which is then stored, so that a false positive feature library comprising a plurality of feature vectors is established.
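A simplified sketch of the candidate-defect screening described above (steps S3-S5 / S10-S11), under the assumption that a pixel counts as abnormal when its gray difference from the template meets or exceeds T1 and that an image becomes a candidate defect image when the accumulated abnormal pixel area exceeds T2; whether a candidate is a real defect or a false alarm is still decided afterwards (manually during library building, by the feature library online). The function name is_candidate_defect is illustrative.

```python
import numpy as np

T1 = 25   # pixel gray difference threshold (claimed range 15 to 35)
T2 = 20   # abnormal area threshold in pixel area (claimed range 15 to 25)

def is_candidate_defect(detect_units, template_units, t1: int = T1, t2: int = T2) -> bool:
    """Compare each unit image with its template unit by pixel gray difference and
    flag the whole frame when the total abnormal pixel area exceeds T2."""
    abnormal_area = 0
    for det, tpl in zip(detect_units, template_units):
        diff = np.abs(det.astype(np.int16) - tpl.astype(np.int16))
        abnormal_area += int(np.count_nonzero(diff >= t1))
    return abnormal_area > t2
```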
Online false alarm screening is the final aim of the invention, namely finishing false alarm screening online, automatically and in real time without manual re-judgment. Once the network model of the lead frame has been pre-trained, product modeling has been finished and the false alarm feature library has been built, each lead frame product to be detected is shot and collected through the optical camera platform (such as a CCD lens assembly) to form an online image to be detected; the image is divided into units in the same way as the template image to form detection unit images. Similarly to the building of the false alarm feature library, the detection unit images are matched against the template unit images based on pixel gray values: when the pixel gray difference value of a detection unit image and the template unit image is larger than the set difference threshold T1, the detection unit image is judged to be a difference image, otherwise it is a normal image; when the difference area counted in a single online image to be detected is larger than the set abnormal threshold T2, the online image to be detected is judged to be a candidate defect image. The candidate defect image is then scaled to 128 x 128 pixels and input into the network model of the lead frame, which converts the image information into a 128-dimensional high-dimensional feature vector (namely a data group comprising 128 vector values). After the high-dimensional feature vector is obtained, it is normalized to obtain a feature vector whose vector values range from -1 to 1. The feature vector is then matched one by one with the feature vectors in the false alarm feature library and substituted into the L2-norm distance calculation formula; after the feature distances between it and the different feature vectors in the false alarm feature library are calculated, the minimum feature distance is taken as the final evaluation value L, and the evaluation value L is compared with the set false alarm screening threshold T3 (the false alarm screening threshold T3 is a value determined through multiple field experiments and ranges from 0.05 to 0.15). If the evaluation value L representing the minimum feature distance is less than T3, the candidate defect image is judged to be a false alarm image; otherwise it is judged to be a defect image. In this way, the false alarm images are screened a second time out of the candidate defect images that were screened the first time through pixel gray values.
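A minimal sketch of the online decision rule just described (steps S15-S17), assuming the candidate feature vector and the library vectors have already been normalized as in steps S8/S14; the names l2_distance and is_false_alarm are illustrative, not from the patent.

```python
import numpy as np

T3 = 0.1  # false alarm screening threshold (claimed range 0.05 to 0.15)

def l2_distance(a: np.ndarray, b: np.ndarray) -> float:
    """L2-norm distance between two 128-dimensional feature vectors."""
    return float(np.sqrt(np.sum((a - b) ** 2)))

def is_false_alarm(candidate_vec: np.ndarray, library: list, t3: float = T3) -> bool:
    """Take the minimum feature distance to the false alarm feature library as the
    evaluation value L and treat the candidate as a false alarm when L <= T3."""
    L = min(l2_distance(a, candidate_vec) for a in library)
    return L <= t3
```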
Drawings
Fig. 1 is a functional block diagram of an algorithm of the present invention.
FIG. 2 is a schematic diagram of the steps of modeling the product of the present invention.
FIG. 3 is a schematic diagram showing steps for extracting candidate defect images of a library-built inspection image or an online image to be inspected according to the present invention.
Fig. 4 is a schematic diagram of a feature network pre-training step of the lead frame of the present invention.
Fig. 5 is a schematic diagram of the network framework of the PPLCNetV2 network of the present invention.
FIG. 6 is a schematic representation of a defect-free template image acquired in accordance with the present invention.
Fig. 7 is a schematic diagram of the template unit image in fig. 6.
FIG. 8 is a schematic diagram of a library-building detection image or an online image to be detected collected by the present invention.
Fig. 9 is a schematic diagram of screening false alarm images of a library-building detection image according to the present invention.
Fig. 10 is a partially enlarged schematic illustration of the filtered false positive unit image of fig. 9.
FIG. 11 is a partially enlarged schematic view of a dirty point in an acquired image according to the present invention.
FIG. 12 is an enlarged partial schematic view of a scratch in an acquired image according to the present invention.
FIG. 13 is a schematic representation of a partial magnification of a texture false positive in an acquired image according to the present invention.
Fig. 14 is a schematic enlarged view of a part of an edge false alarm in an acquired image according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that all directional indicators such as up, down, left, right, front and rear in the embodiments of the present invention are merely used to explain the relative positional relationship between the components, the movement condition, etc. in a specific posture, and if the specific posture is changed, the directional indicator is changed accordingly.
In the present invention, unless explicitly specified and limited otherwise, the terms "connected," "fixed," and the like are to be construed broadly, and for example, "connected" may be either a fixed connection or a removable connection or integrated; can be mechanically or electrically connected; either directly or indirectly, through intermediaries, or both, may be in communication with each other or in interaction with each other, unless expressly defined otherwise. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
As shown in fig. 1 to 3, an etched lead frame false alarm library establishment and false alarm online judgment screening method includes the following steps:
s1, pre-training a network model: collecting a defect picture and a false alarm picture of the etched lead frame; after classifying and marking the collected defect pictures and false alarm pictures, training through a feature extraction network according to classification and marking categories to obtain a feature network of the etched lead frame;
S2, modeling a product: collecting a defect-free etched lead frame picture serving as a template picture through an optical camera platform; dividing the template map into X template unit images with the same area;
s3, library building detection image acquisition and unit image division: shooting and collecting at least two etched lead frame pictures as library building detection images, and dividing each collected library building detection image into X library building unit images of the same area as the template unit images in the step S2;
s4, matching the library building unit images and extracting defect images: analyzing and matching the library building unit image in the step S3 and the template unit image in the step S2 one by one, taking the pixel gray value of the unit image as the matching standard, and marking the library building unit image as an abnormal unit image when the pixel gray difference value between the library building unit image and the template unit image is larger than the difference threshold T1; when the abnormal unit image area counted after the library building detection image has been matched is larger than the abnormal threshold T2, extracting that library building detection image as a candidate defect image;
s5, screening false alarm images: repeating the steps S3 to S4 until all the library-building detection images are matched, extracting candidate defect images in all the library-building detection images, screening the candidate defect images one by one according to the existence of defects in the images, and screening the defect-free images as false alarm images;
S6, false positive image scaling: scaling the false positive images screened in the step S5 to 128 x 128 pixel size images one by one;
s7, false positive image conversion: inputting the scaled false alarm image in the step S6 into the feature network of the etched lead frame trained in the step S1, and outputting a high-dimensional feature vector a of the false alarm image; the false positive image high-dimensional feature vector a is a vector array comprising 128 dimensions;
s8, false alarm image normalization operation: carrying out a normalization operation on the false positive image high-dimensional feature vector a output in the step S7 to obtain a false positive image feature vector a; the false positive image feature vector a is a vector array with 128 dimensions; the normalization operation formula is a_i' = a_i / |a|, wherein a_i is the i-th vector value in the 128-dimensional vector array of the false positive image high-dimensional feature vector a, and |a| is the modulus of the vector a, namely |a| = sqrt(a_1^2 + a_2^2 + ... + a_128^2), the square root of the sum of the squares of the components of a;
s9, establishing a false positive image feature library: storing the false positive image feature vector a obtained by normalization operation in the step S8 to form a false positive image feature library;
s10, on-line image-to-be-detected acquisition and unit image division: shooting and collecting an image to be detected of the etched lead frame, and dividing the collected image to be detected into X detection unit images of the same area as the template unit images in the step S2;
S11, detection unit image matching and defect image extraction: analyzing and matching the detection unit image in the step S10 and the template unit image in the step S2 one by one, taking the pixel gray value of the unit image as the matching standard, and marking the detection unit image as an abnormal unit image when the pixel gray difference value between the detection unit image and the template unit image is larger than the difference threshold T1; when the abnormal unit image area counted after the image to be detected has been matched is larger than the abnormal threshold T2, extracting that image to be detected as a candidate defect image;
s12, scaling candidate defect images: scaling the candidate defect image screened in the step S11 to an image with the size of 128 x 128 pixels;
s13, converting candidate defect images: inputting the scaled candidate defect image in the step S12 into the feature network of the etched lead frame trained in the step S1, and outputting a candidate image high-dimensional feature vector b; the candidate image high-dimensional feature vector b is a vector array comprising 128 dimensions;
s14, candidate defect image normalization operation and feature vector extraction: carrying out a normalization operation on the candidate image high-dimensional feature vector b output in the step S13 to obtain a candidate image feature vector b; the candidate image feature vector b is a vector array with 128 dimensions; the normalization operation formula is b_i' = b_i / |b|, wherein b_i is the i-th vector value in the 128-dimensional vector array of the candidate image high-dimensional feature vector b, and |b| is the modulus of the vector b, namely |b| = sqrt(b_1^2 + b_2^2 + ... + b_128^2);
s15, calculating the characteristic distance: searching and comparing the candidate image feature vector b obtained after the normalization operation in the step S14 with the plurality of false positive image feature vectors a in the false positive feature library established in the step S9 one by one, calculating the feature distance between the candidate image feature vector b and each false positive image feature vector a by using the L2-norm distance formula, and taking the minimum distance value as the final evaluation value L of the image to be detected against the false positive image library; the L2-norm distance formula is L = sqrt( (a_1 - b_1)^2 + (a_2 - b_2)^2 + ... + (a_n - b_n)^2 ), wherein a represents a false positive image feature vector, b represents a candidate image feature vector, n = 128, and a_i and b_i represent the vector values of the i-th dimension of the features a and b respectively;
s16, false positive image judgment and screening: comparing the evaluation value L obtained in the step S15 with the false alarm screening threshold value T3; when L is less than or equal to T3, the image to be detected is judged to be a false alarm image, otherwise it is judged to be a defect image;
s17, online false alarm screening: repeating the steps S10 to S16, comparing the online images to be detected with the template through gray values to screen out candidate defect images, calculating the feature distances between the candidate image feature vector b of each candidate defect image and the plurality of false positive image feature vectors a in the false positive image library through the L2-norm distance formula, and comparing the selected evaluation value L with the threshold T3, thereby judging and screening online whether the candidate image is a false positive image.
As shown in fig. 6, a schematic diagram of a non-defective template image acquired according to the present invention is shown, after the non-defective template image is acquired, the image is divided into a plurality of template unit images, and one unit image is selected as a candidate template unit image for subsequent pixel gray value matching, as shown in fig. 7.
As shown in fig. 8, a library-building detection image or an online image to be detected collected by the present invention is divided into a plurality of unit images, and candidate defect images are screened out through pixel gray level matching.
As shown in fig. 9, a schematic diagram of screening false alarm images by using the library-building detection image according to the present invention is shown, when candidate defect images are screened out by using pixel gray level matching in the library-building detection image, each unit image of the candidate defect images is checked and re-judged one by one in fig. 9 by using a manual re-judgment mode, and the false alarm images are picked out from the unit images for subsequent establishment of a false alarm feature library.
As shown in fig. 10, which is a partially enlarged schematic view of the false alarm unit images screened in fig. 9, it can be seen that the reason these images were selected as candidate defect images after pixel gray value matching is that the difference in edge gray values is too large; in fact the images are defect-free, so they are screened as false alarm images for library building, scaled, and input to the feature network of the lead frame to convert the image information into high-dimensional vector features.
As shown in fig. 10, the high-dimensional vector features of the image corresponding to the false positive image schematic diagram 3 in fig. 10 output through the feature network are as follows: [ [0.00254897, -0.0198782,0.0622372,0.00438112,0.0145146, -0.00670561, -0.0349422, -0.0761061, -0.0330957, -0.00865035, -0.0483953,0.0217152, -0.0546154, … … ].
As shown in fig. 10, the high-dimensional vector features of the image corresponding to the false positive image schematic diagram 2 in fig. 10 output through the feature network are as follows: [ [0.0653486,0.0178405, -0.0237074, -0.0178318, -0.024009, -0.0821383, -0.0145345, -0.0486141, -0.0632687,0.00192454, -0.0225132,0.0817714, -0.0371344,0.0183654, -0.0287645, -0.0605773,0.0211897,0.0270865,0.0375279,0.072306,0.0014263, -0.0325393,0.0624365, -0.0299767, -0.0331617,0.0186456, -0.0790586,0.06878, -0.00176705,0.0435906,0.0043805,0.00493038,0.0047855,0.0426672,0.0568707,0.0441367,0.0211548,0.00891262, -0.0186768, -0.0466011,0.00561119, -0.00211063,0.00331003, -0.0656972,0.0586017, … … ].
As shown in fig. 10, the high-dimensional vector features of the image corresponding to the false positive image schematic diagram 1 in fig. 10 output through the feature network are as follows: [ -0.0394494,0.0142747, -0.028444, -0.00866088, -0.00438898, -0.0583859, -0.0765748, -0.041415, -0.0530525,0.0157711,0.00267087, -0.00918346, -0.0889935,0.076135,0.0344477, -0.0647142, -0.091354,0.0535637, -0.0474916,0.0225929, -0.0630671, -0.0183488, -0.00315991, -0.00258599,0.000261465,0.0212461, -0.0150233,0.0103629,0.0447965,0.005706,0.0418162,0.00130445, -0.092437, … ].
As shown in fig. 10, the high-dimensional vector features of the image corresponding to the false positive image schematic diagram 4 in fig. 10 output through the feature network are as follows: [, -0.0716673,0.0134239, -0.0457978, -0.00368738, -0.064066, -0.03132, -0.0924681,0.0357773,0.0140021,0.0148327, -0.00190882, -0.0980173,0.0466814,0.0441684, -0.0923377, -0.012295,0.0546425,0.000881547,0.0413368, -0.0358201, -0.0608816, -0.0155415, -0.0366058, -0.03519, -0.0217756, -0.0497719, -0.0294413, -0.109245, -0.0607234,0.00482594,0.0186923,0.00590011,0.0445973, -0.0397616,0.0436038, -0.0641756,0.0466428,0.0546191,0.00206646, -0.0731959, -0.0420972,0.0283895, … … ].
The false alarm image information is converted through the feature network and output as a high-dimensional feature vector, and the high-dimensional feature vector is then converted into the feature vector by a normalization calculation, so that the distance values obtained in the subsequent feature distance calculation are controlled within the range of 0 to 1.
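A one-function sketch of the normalization of steps S8/S14, assuming NumPy vectors; dividing each component by the modulus gives a unit-length feature vector so that the subsequent L2-norm distances stay in a controlled range. The function name normalize is illustrative.

```python
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    """Divide each component of the high-dimensional feature vector by the
    modulus |v| (the normalization operation of steps S8/S14)."""
    return v / np.linalg.norm(v)
```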
As shown in fig. 11, the feature vector extracted by the normalization operation of the high-dimensional vector feature output by the image corresponding to the dirt 1 in fig. 11 through the feature network is: [ [ -0.0199702,0.0298826,0.00574119, -0.0413988, -0.0175548, -0.0568687, -0.091874, -0.0173608,0.0412338, -0.0307102, -0.016219, -0.00166185, -0.054901,0.118219, -0.000705321, -0.0568975, -0.0796738,0.0438039, -0.0647409,0.04662, -0.0691875, -0.0243827, -0.00413231, -0.0298326, -0.0327811,0.0170793, -0.0420636,0.00543999,0.0443309,0.030534,0.0616699,0.0314932, -0.0662809,0.0558587, -0.0146055, -0.0201346, -0.00587449,0.088057,0.00853607, -0.0182423,0.0262702,0.0354289, -0.0401268,0.0212121, -0.0161035,0.0214663, -0.0388577,0.0436364,0.0452089, -0.0137521,0.0202017,0.00636449, -0.0492859,0.0536226, -0.0206718,0.00102175,0.0339028, … … ]
In fig. 10, the feature vector obtained by normalizing the feature of the high-dimensional vector output by the feature network of the image corresponding to the false alarm image schematic diagram 3 is: [ -0.0358201, -0.0608816, -0.0155415, -0.0366058, -0.03519, -0.0217756, -0.0497719, -0.0294413, -0.109245, -0.0607234,0.00482594,0.0186923,0.00590011,0.0445973, -0.0397616,0.0436038, -0.0641756,0.0466428,0.0546191,0.00206646, -0.0731959, -0.0420972,0.0283895, -0.0340378, -0.0202859,0.0290878,0.0878383,0.0353065, -0.0106767,0.0577252, -0.033487, -0.051565,0.025638, -0.0479884, -0.115865,0.041643,0.0532406,0.0945049, -0.0547059, -0.0479764,0.00388267, -0.0718287, -0.0155912,0.000541382,0.126608, -0.0270506, -0.0312805,0.0170672, -0.0600545, -0.0112602, -0.00436069,0.00198579,0.00489726, -0.0326198,0.0339323, … … ].
By analogy, after the feature vectors of the other three false positive images in fig. 10 are obtained, the feature vector of the image corresponding to dirt 1 in fig. 11 is substituted, together with the feature vectors corresponding to the four false positive images in fig. 10, into the L2-norm distance calculation formula; the minimum of the resulting distance values is 0.723 and is taken as the evaluation value L. With the false alarm screening threshold T3 set to 0.1, L is larger than T3, so the image is dissimilar to the images in the false positive library and cannot be filtered out, namely the image corresponding to dirt 1 is judged to belong to the defect images among the candidate defect images.
Similarly, when the feature vector corresponding to false positive image schematic diagram 3 in fig. 10 and the feature vectors corresponding to the other three false positive images in fig. 10 are substituted into the L2-norm distance calculation formula, the minimum of the resulting distance values is 0.054 and is taken as the evaluation value L. With the false alarm screening threshold T3 set to 0.1, L is smaller than T3, so the image is similar to the other three images and is judged to be a false positive image.
According to the invention, a network model is first pre-trained on captured lead frame images to obtain a feature network of the lead frame; a limited number of library-building detection images are then captured, candidate defect images are screened out by comparing unit-image gray value differences, and false alarm images are screened from them; the false alarm images are input into the feature network of the lead frame, converted into feature vectors and stored to establish a false alarm feature library; after that, candidate defect images are screened out of the product images to be detected by the same unit-image gray value comparison and converted into feature vectors through the feature network, their feature distances to the feature vectors in the false alarm feature library are calculated one by one, and whether an image is a false alarm image is judged according to the feature distance. The method thereby realizes online, automatic, efficient and high-precision screening of false alarm images and improves the detection efficiency and accuracy of the lead frame. The invention is applied to the field of lead frame defect detection, solves the false alarm problem in the lead frame defect detection process, realizes online automatic high-precision false alarm screening, reduces the workload of manual re-judgment and improves detection efficiency. Specifically, the method integrally comprises four stages: feature network pre-training of the lead frame, product modeling, establishment of a false alarm feature library and online false alarm screening.
The feature network pre-training aims at acquiring limited lead frame images through pre-shooting, inputting the lead frame images into a PPLCNetV2 network after the lead frame images are scaled to 128 x 128 pixel sizes, and converting image information into 128-dimensional high-dimensional feature vectors (namely, data groups comprising 128 vector values) through the PPLCNetV2 network, so that a network model suitable for the lead frame is built through pre-training.
The purpose of product modeling is to provide the reference standard for the pixel gray value comparison analysis, both when establishing the false alarm feature library and during online false alarm screening, so that candidate defect images can be screened preliminarily; the candidate defect images obtained from the pixel gray value comparison comprise actual defect images and false alarm images (images that are actually defect-free). In the product modeling process, one defect-free lead frame picture is shot and collected as the template image, and the template image is divided according to its image pixels into a plurality of template unit images of the same area (the number of template unit images can be set to 40 rows by 8 columns when the camera group shoots at 5 million pixels; the specific division number can be chosen comprehensively according to the image pixels, the detection precision requirements and the analysis and matching workload), so that the product modeling is completed.
The purpose of establishing the false positive feature library is to build a library of feature vectors corresponding to false positive images that already exist, to serve as the reference object in the feature distance judgment: during online detection, the feature vector of the product image shot in real time is substituted, one by one with the feature vectors in the false positive feature library, into the L2-norm distance calculation formula, and whether the image is a false positive image is judged after the feature distance has been calculated. First, false positive images that may occur in the actual detection process (such as texture false positive images, edge false positive images and the like) are shot and collected; the number of images is limited and is selected according to the detection requirements, for example 10 images. The images are divided into unit images in the same way as in the product modeling process, and the unit images are compared and matched one by one with the template unit images according to pixel gray values: the closer the pixel gray values of the two are, the smaller their pixel gray difference value is, and conversely the larger the difference value is. When the pixel gray difference value of a unit image is larger than the set difference threshold T1 (the difference threshold T1 is a value determined from multiple field experiments and pixel matching experience, with a value range of 25 to 35), the unit image is judged to be a difference image; and when the difference area of the unit images in a single library-building detection image is larger than the set abnormal threshold T2 (the abnormal threshold T2 is determined from multiple field experiments, the number of image pixels, the unit division and other factors, with a value range of 15 to 25 pixel areas), that library-building detection image is judged to be a candidate defect image. Images without flaw defects, namely the false alarm images, are then screened out of the candidate defect images by manual re-judgment (the number of library-building detection images is limited, so the manual re-judging workload is limited). Each screened false positive image is scaled to 128 x 128 pixels and input into the network model of the lead frame, which converts the image information into a 128-dimensional high-dimensional feature vector (namely a data group comprising 128 vector values). If the high-dimensional feature vector were not normalized, the feature distance calculated by the L2-norm distance formula could range from 0 to infinity; such a distance value is uncontrollable and cannot be used as a judgment standard. The vector values of the high-dimensional feature vector are therefore normalized to values in the range of -1 to 1, giving the false positive image feature vector (namely a data set comprising 128 vector values, each in the range of -1 to 1), which is then stored, so that a false positive feature library comprising a plurality of feature vectors is established.
Online false alarm screening is the final aim of the invention, namely finishing false alarm screening online, automatically and in real time without manual re-judgment. Once the network model of the lead frame has been pre-trained, product modeling has been finished and the false alarm feature library has been built, each lead frame product to be detected is shot and collected through the optical camera platform (such as a CCD lens assembly) to form an online image to be detected; the image is divided into units in the same way as the template image to form detection unit images. Similarly to the building of the false alarm feature library, the detection unit images are matched against the template unit images based on pixel gray values: when the pixel gray difference value of a detection unit image and the template unit image is larger than the set difference threshold T1, the detection unit image is judged to be a difference image, otherwise it is a normal image; when the difference area counted in a single online image to be detected is larger than the set abnormal threshold T2, the online image to be detected is judged to be a candidate defect image. The candidate defect image is then scaled to 128 x 128 pixels and input into the network model of the lead frame, which converts the image information into a 128-dimensional high-dimensional feature vector (namely a data group comprising 128 vector values). After the high-dimensional feature vector is obtained, it is normalized to obtain a feature vector whose vector values range from -1 to 1. The feature vector is then matched one by one with the feature vectors in the false alarm feature library and substituted into the L2-norm distance calculation formula; after the feature distances between it and the different feature vectors in the false alarm feature library are calculated, the minimum feature distance is taken as the final evaluation value L, and the evaluation value L is compared with the set false alarm screening threshold T3 (the false alarm screening threshold T3 is a value determined through multiple field experiments and ranges from 0.05 to 0.15). If the evaluation value L representing the minimum feature distance is less than T3, the candidate defect image is judged to be a false alarm image; otherwise it is judged to be a defect image. In this way, the false alarm images are screened a second time out of the candidate defect images that were screened the first time through pixel gray values.
Example 1
As shown in figs. 4 to 5, which illustrate the pre-training step of the lead frame network model and its feature extraction network, the feature extraction network in step S1 of the present invention is, in one embodiment, a PPLCNetV2 network; the network model pre-training step scales each acquired picture to an image with a pixel size of 128 x 128, inputs it into the PPLCNetV2 network, and converts it through the PPLCNetV2 network into a high-dimensional feature vector, namely a vector data set comprising 128-dimensional features.
Example 2
As shown in figs. 11 to 14, as an embodiment of the present invention, the defect image categories in step S1 of the present invention include dirty images and scratch images; the false positive image categories include texture false positive images and edge false positive images.
Example 3
As an embodiment of the present invention, in step S2, step S3 and step S10 of the present invention, the cameras used to capture the template image, the library-building detection images and the images to be detected have the same resolution, and each is a 5-megapixel camera.
Example 4
As an embodiment of the present invention, in step S2, step S3 and step S10 of the present invention, the number X of unit images into which the template image, the library-building detection image and the online image to be detected are divided is 40 rows by 8 columns, 320 unit images in total.
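As a purely illustrative sketch (not part of the patent text), the 40-row by 8-column unit division of this example could be implemented as follows; image dimensions are assumed to be divisible by the row and column counts, otherwise trailing pixels are simply dropped.

```python
import numpy as np

def split_into_units(image, rows=40, cols=8):
    """Divide an image (H x W array) into rows*cols unit images of equal area,
    matching the 40 x 8 = 320 unit images used in this example."""
    h, w = image.shape[:2]
    unit_h, unit_w = h // rows, w // cols
    units = []
    for r in range(rows):
        for c in range(cols):
            units.append(image[r * unit_h:(r + 1) * unit_h,
                               c * unit_w:(c + 1) * unit_w])
    return units   # list of 320 unit images
```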
Example 5
As an embodiment of the present invention, when the library-building unit image and the template unit image in step S4 of the present invention are matched and analyzed on the basis of pixel gray values, the closer the pixel gray values of the two are, the smaller their gray difference, and conversely the larger the difference; when the pixel difference of the two is greater than the set difference threshold T1, the library-building unit image is judged to be a difference image. Likewise, in step S11, when the detection unit image and the template unit image are matched and analyzed on the basis of pixel gray values, the closer their pixel gray values are, the smaller their gray difference, and conversely the larger the difference; when the pixel difference of the two is greater than the set difference threshold T1, the detection unit image is judged to be a difference image. The difference threshold T1 is the critical pixel gray difference between a library-building unit image or detection unit image and the template unit image; in this embodiment the difference threshold T1 takes the value 20.
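A minimal sketch of the unit-level decision described in this example follows. Comparing the mean gray levels of the two unit images is only one plausible reading of "pixel gray difference" and is an assumption here, not the patent's exact definition; T1 = 20 matches this example.

```python
import numpy as np

def is_difference_unit(unit_img, template_unit, t1=20):
    """Flag a unit image as a difference image when its pixel gray difference
    from the matching template unit image is greater than T1."""
    unit_gray = np.asarray(unit_img, dtype=np.float32)
    tmpl_gray = np.asarray(template_unit, dtype=np.float32)
    gray_difference = abs(unit_gray.mean() - tmpl_gray.mean())   # assumed: mean-gray comparison
    return gray_difference > t1
```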
Example 6
As an embodiment of the present invention, the abnormal threshold T2 in step S4 and step S11 of the present invention is an area threshold for the difference images in a single library-building detection image or a single image to be detected: when the difference-image area of the library-building unit images or detection unit images in a single library-building detection image or single image to be detected is greater than T2, that library-building detection image or image to be detected is judged to be a candidate defect image; the abnormal threshold T2 takes the value of 20 pixel areas. Specifically, after the two compared unit images are aligned on the basis of image contours, their difference binary image is calculated; the binary image consists of the two values 0 and 255, and the comparison rule is that when the gray difference of the two unit images is greater than or equal to the threshold T1, the corresponding position in the difference binary image is marked 255, and otherwise 0. Connected-domain analysis is then performed on the resulting binary image, and when the area of a pixel block marked 255 is greater than the threshold T2, the image is selected as a candidate defect image.
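The difference binary image and connected-domain analysis of this example map naturally onto standard OpenCV calls. The sketch below assumes the two unit images are already contour-aligned, single-channel 8-bit grayscale arrays; the alignment step itself is omitted, and the implementation is only an illustration of the technique.

```python
import cv2

def is_candidate_defect(unit_img, template_unit, t1=20, t2=20):
    """Build the difference binary image (values 0 / 255) between two aligned
    unit images and flag a candidate defect when any connected pixel block
    marked 255 has an area greater than T2."""
    diff = cv2.absdiff(unit_img, template_unit)                      # per-pixel gray difference
    _, binary = cv2.threshold(diff, t1 - 1, 255, cv2.THRESH_BINARY)  # >= T1 -> 255, else 0 (threshold at T1-1 for integer diffs)
    num, _, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    # stats[0] is the background component; check the area of every 255-marked block
    return any(stats[i, cv2.CC_STAT_AREA] > t2 for i in range(1, num))
```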
Example 7
As one embodiment of the present invention, in step S15 of the present invention, the feature distance between the candidate image feature vector b^ and each of the plurality of false positive image feature vectors a^ in the false positive library, calculated by the L2-norm distance formula, ranges from 0 to 1.
The false positive screening threshold T3 in step S16 is the critical judgment value for the feature distance between the candidate image feature vector b^ and a false positive image feature vector a^ under the L2-norm distance formula; the smaller the feature distance calculated between the candidate image feature vector b^ and a false positive image feature vector a^, the closer the candidate image is to that false positive image. When the selected evaluation value L, representing the minimum feature distance, is less than T3, the candidate defect image is judged to be a false positive image; otherwise it is judged to be a defect image.
The false alarm screening threshold T3 is 0.1.
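Because the stored feature vectors are L2-normalized, the screening threshold T3 has a direct cosine-similarity reading; the short derivation below is added for clarity and is not part of the patent text (a^ and b^ denote the normalized false positive and candidate feature vectors).

```latex
\|\hat{a}-\hat{b}\|_2^{2} \;=\; \|\hat{a}\|^{2}+\|\hat{b}\|^{2}-2\,\hat{a}\cdot\hat{b}
\;=\; 2\bigl(1-\hat{a}\cdot\hat{b}\bigr)
\quad\Longrightarrow\quad
L \le T_{3} \;\Longleftrightarrow\; \hat{a}\cdot\hat{b} \;\ge\; 1-\tfrac{T_{3}^{2}}{2}.
```

With T3 = 0.1 this corresponds to requiring a cosine similarity of at least 1 - 0.005 = 0.995 between the candidate feature vector and its nearest false positive feature vector.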
The examples of the present invention are presented only to describe specific embodiments thereof and are not intended to limit the scope of the invention. Certain modifications may be made by those skilled in the art in light of the teachings of this embodiment, and all equivalent changes and modifications that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (10)

1. A method for establishing a false alarm library of an etched lead frame and judging and screening false alarms online, characterized by comprising the following steps:
S1, pre-training a network model: collecting defect pictures and false alarm pictures of the etched lead frame; after classifying and marking the collected defect pictures and false alarm pictures, training a feature extraction network according to the classification and marking categories to obtain a feature network of the etched lead frame;
S2, product modeling: collecting a defect-free etched lead frame picture as a template picture through an optical camera platform; dividing the template picture into X template unit images with the same area;
S3, library-building detection image acquisition and unit image division: shooting and collecting at least two etched lead frame pictures as library-building detection images, and dividing each collected library-building detection image into X library-building unit images with the same area as the template unit images in step S2;
S4, library-building unit image matching and defect image extraction: analyzing and matching the library-building unit images from step S3 with the template unit images from step S2 one by one, taking the pixel gray value of the unit image as the matching standard, and marking a library-building unit image as an abnormal unit image when its pixel gray difference from the template unit image is greater than the difference threshold T1; when the abnormal unit image area counted after the library-building detection image has been matched is greater than the abnormal threshold T2, extracting the library-building detection image as a candidate defect image;
S5, screening false alarm images: repeating the steps S3 to S4 until all the library-building detection images are matched, extracting candidate defect images in all the library-building detection images, screening the candidate defect images one by one according to the existence of defects in the images, and screening the defect-free images as false alarm images;
S6, false positive image scaling: scaling the false positive images screened in step S5 to 128 x 128 pixel size images one by one;
S7, false positive image conversion: inputting the scaled false alarm images from step S6 into the feature network of the etched lead frame trained in step S1, and outputting a high-dimensional feature vector a of each false alarm image; the false positive image high-dimensional feature vector a is a vector array comprising 128 dimensions;
S8, false alarm image normalization operation: carrying out a normalization operation on the false positive image high-dimensional feature vector a output in step S7 to obtain a false positive image feature vector a^ (the normalized form of a); the false positive image feature vector a^ is a vector array with 128 dimensions; the normalization formula is a^_i = a_i / |a|, wherein a_i is the i-th vector value in the 128-dimensional vector array of the false positive image high-dimensional feature vector a, and |a| is the modulus of the vector a, namely |a| = sqrt(a_1^2 + a_2^2 + ... + a_128^2);
S9, establishing a false positive image feature library: storing the false positive image feature vectors a^ obtained by the normalization operation in step S8 to form a false positive image feature library;
S10, online image-to-be-detected acquisition and unit image division: shooting and collecting an image to be detected of the etched lead frame, and dividing the collected image to be detected into X detection unit images with the same area as the template unit images in step S2;
S11, detection unit image matching and defect image extraction: analyzing and matching the detection unit images from step S10 with the template unit images from step S2 one by one, taking the pixel gray value of the unit image as the matching standard, and marking a detection unit image as an abnormal unit image when its pixel gray difference from the template unit image is greater than the difference threshold T1; when the abnormal unit image area counted after the image to be detected has been matched is greater than the abnormal threshold T2, extracting the image to be detected as a candidate defect image;
S12, scaling candidate defect images: scaling the candidate defect image screened in step S11 to an image with a size of 128 x 128 pixels;
S13, converting candidate defect images: inputting the scaled candidate defect image from step S12 into the feature network of the etched lead frame trained in step S1, and outputting a candidate image high-dimensional feature vector b; the candidate image high-dimensional feature vector b is a vector array comprising 128 dimensions;
S14, candidate defect image normalization operation and feature vector extraction: carrying out a normalization operation on the candidate image high-dimensional feature vector b output in step S13 to obtain a candidate image feature vector b^ (the normalized form of b); the candidate image feature vector b^ is a vector array with 128 dimensions; the normalization formula is b^_i = b_i / |b|, wherein b_i is the i-th vector value in the 128-dimensional vector array of the candidate image high-dimensional feature vector b, and |b| is the modulus of the vector b, namely |b| = sqrt(b_1^2 + b_2^2 + ... + b_128^2);
S15, calculating the feature distance: searching and comparing the candidate image feature vector b^ obtained after the normalization operation in step S14 with the plurality of false positive image feature vectors a^ in the false positive feature library established in step S9 one by one, respectively calculating the feature distances between the candidate image feature vector b^ and the false positive image feature vectors a^ using the L2-norm distance formula, and taking the minimum distance value as the final evaluation value L of the image to be detected with respect to the false positive library; the L2-norm distance formula is L2(a^, b^) = sqrt( (a^_1 - b^_1)^2 + (a^_2 - b^_2)^2 + ... + (a^_n - b^_n)^2 ), wherein a^ represents a false positive image feature vector, b^ represents the candidate image feature vector, n = 128, i represents the i-th dimension in the 128-dimensional vector array, and a^_i and b^_i represent the vector values of the i-th dimension of a^ and b^ respectively;
S16, false positive image judgment and screening: comparing the evaluation value L selected in the step S15 with a false alarm screening threshold value T3, and judging that the image to be detected is a false alarm image when L is less than or equal to T3, otherwise, judging that the image to be detected is a defect image;
S17, online false alarm screening: repeating steps S10 to S15, screening candidate defect images online by comparing the images to be detected against the template image through gray values, calculating the feature distances between the candidate feature vector b^ of each candidate defect image and the plurality of false positive image feature vectors a^ in the false positive image library through the L2-norm distance formula, and comparing the selected evaluation value L with the threshold T3, thereby screening online whether the candidate image is a false positive image.
2. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: the defect image types in the step S1 comprise dirty images and scratch images; the false positive image category comprises texture false positive images and edge false positive images.
3. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: the feature extraction network in the step S1 is a PPLCNetV2 network; the network model pre-training step is to scale the acquired picture to an image with the pixel size of 128 x 128, input the image into a PPLCNetV2 network, and convert the image into a high-dimensional feature vector which is a vector data set comprising 128-dimensional features through the PPLCNetV2 network.
4. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: and the camera resolutions of the template image, the library-building detection image and the image to be detected which are shot and acquired in the step S2, the step S3 and the step S10 are the same.
5. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: in step S2, step S3 and step S10, the number X of unit images into which the template image, the library-building detection images and the online image to be detected are divided ranges from 280 to 360.
6. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: when the library-building unit image and the template unit image are matched and analyzed in step S4 on the basis of pixel gray values, the closer the pixel gray values of the two are, the smaller their pixel gray difference, and conversely the larger their pixel gray difference; when the pixel difference of the two is greater than the set difference threshold T1, the library-building unit image is judged to be a difference image;
likewise, in step S11, when the detection unit image and the template unit image are matched and analyzed on the basis of pixel gray values, the closer the pixel gray values of the two are, the smaller their pixel gray difference, and conversely the larger their pixel gray difference; when the pixel difference of the two is greater than the set difference threshold T1, the detection unit image is judged to be a difference image;
The difference threshold T1 is a pixel gray difference critical value of a library building unit image or a detection unit image and a template unit image; the value range of the difference threshold T1 is 15 to 35.
7. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: the abnormal threshold T2 in step S4 and step S11 is an area critical value for the difference images in a single library-building detection image or a single image to be detected, and when the difference-image area of the library-building unit images or detection unit images in a single library-building detection image or single image to be detected is greater than T2, that library-building detection image or image to be detected is judged to be a candidate defect image;
the value range of the abnormal threshold T2 is 15 to 25 pixel areas.
8. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 1, wherein the method comprises the following steps: in step S15, the feature distance between the candidate image feature vector b^ and each of the plurality of false positive image feature vectors a^ in the false positive library, calculated by the L2-norm distance formula, ranges from 0 to 1.
9. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 8, wherein the method comprises the following steps: the false positive screening threshold T3 in step S16 is the critical judgment value for the feature distance between the candidate image feature vector b^ and a false positive image feature vector a^ under the L2-norm distance formula; the smaller the feature distance calculated between the candidate image feature vector b^ and a false positive image feature vector a^ by the L2-norm distance formula, the closer the candidate image is to that false positive image; when the selected evaluation value L, representing the minimum feature distance, is less than T3, the candidate defect image is judged to be a false positive image, otherwise it is judged to be a defect image.
10. The etched lead frame false alarm library establishment and false alarm online judgment screening method according to claim 9, wherein the method comprises the following steps: the false positive screening threshold T3 ranges from 0.05 to 0.15.
CN202311788551.4A 2023-12-25 2023-12-25 Establishment of false alarm library of etched lead frame and false alarm on-line judging and screening method Active CN117455917B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311788551.4A CN117455917B (en) 2023-12-25 2023-12-25 Establishment of false alarm library of etched lead frame and false alarm on-line judging and screening method

Publications (2)

Publication Number Publication Date
CN117455917A true CN117455917A (en) 2024-01-26
CN117455917B CN117455917B (en) 2024-03-26

Family

ID=89595219

Country Status (1)

Country Link
CN (1) CN117455917B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117911732A (en) * 2024-03-19 2024-04-19 中船黄埔文冲船舶有限公司 Robot polishing rule template matching method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112014409A (en) * 2020-10-25 2020-12-01 西安邮电大学 Method and system for detecting defects of semiconductor etching lead frame die
CN113592866A (en) * 2021-09-29 2021-11-02 西安邮电大学 Semiconductor lead frame exposure defect detection method
JP2021180309A (en) * 2020-05-15 2021-11-18 清華大学Tsinghua University Two-dimensional pcb appearance defect real-time automatic detection technology based on deep learning
CN115205209A (en) * 2022-05-28 2022-10-18 浙江工业大学 Monochrome cloth flaw detection method based on weak supervised learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant