CN110111293B - Failure identification method and device for plastic package device - Google Patents


Info

Publication number
CN110111293B
CN110111293B (application CN201810082596.2A)
Authority
CN
China
Prior art keywords: image, area, failure, sample, local
Prior art date
Legal status
Active
Application number
CN201810082596.2A
Other languages
Chinese (zh)
Other versions
CN110111293A (en)
Inventor
王俊
Current Assignee
Guoke Saisi Beijing Technology Co ltd
Original Assignee
Guoke Saisi Beijing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guoke Saisi Beijing Technology Co., Ltd.
Priority application: CN201810082596.2A
Publication of application CN110111293A
Application granted; publication of granted patent CN110111293B
Legal status: Active

Classifications

    • G06T 5/30: Image enhancement or restoration by the use of local operators; erosion or dilatation, e.g. thinning
    • G06T 7/0008: Industrial image inspection checking presence/absence
    • G06T 7/12: Edge-based segmentation
    • G06T 7/187: Segmentation or edge detection involving region growing, region merging or connected component labelling
    • G06T 7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 2207/10061: Microscopic image from scanning electron microscope
    • G06T 2207/20081: Training; Learning
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/30148: Semiconductor; IC; Wafer


Abstract

The invention relates to the technical field of component defect detection, and in particular provides a failure identification method and device for plastic package devices, aiming to solve the technical problem of conveniently, effectively and accurately identifying whether a plastic package device is defective. To this end, the failure identification method of the invention mainly comprises the following steps: acquiring, through an ultrasonic microscope, an acoustic scan image containing a plurality of plastic package devices; and identifying the acquired acoustic scan image based on a pre-constructed recognition model to obtain the failed plastic package devices. Based on these steps, the method can automatically identify the acoustic scan image of plastic package devices with high identification accuracy. The device of the invention can likewise execute the failure identification method.

Description

Failure identification method and device for plastic package device
Technical Field
The invention relates to the technical field of component defect detection, in particular to a failure identification method and device for a plastic package device.
Background
A plastic package device is a semiconductor device packaged with resin polymer as the material; the plastic packaging technology effectively protects the semiconductor device. At present, ultrasonic scanning technology can be used to detect defects of plastic package devices, such as internal cracks, voids, delamination and the like. However, ultrasonic scanning can only collect image information of the plastic package device; a technician must observe the image with the naked eye and judge, based on the corresponding technical standards, whether the device is defective. This easily leads to misjudgment or missed detection, and when a large number of components must be inspected, naked-eye judgment is inefficient.
Disclosure of Invention
The invention aims to solve the above technical problem in the prior art, namely how to conveniently, effectively and accurately identify whether a plastic package device is defective. To this end, the invention provides a failure identification method and device for plastic package devices.
In a first aspect, the failure identification method for plastic package devices in the present invention includes:
acquiring, through an ultrasonic microscope, an acoustic scan image containing a plurality of plastic package devices;
identifying the acquired acoustic scan image based on a pre-constructed recognition model to obtain the failed plastic package devices;
wherein the recognition model is constructed based on a machine learning algorithm, and its training method comprises the following steps:
acquiring, through an ultrasonic microscope, an acoustic scan image containing a plurality of plastic package device samples;
extracting the local image of each plastic package device sample in the acoustic scan image;
screening the extracted local images to obtain the local images of potential failure samples;
extracting image features of the local images of the potential failure samples;
performing network training on the recognition model based on the extracted image features to obtain the optimized recognition model.
Further, a preferred technical solution provided by the present invention is:
the step of "extracting the local image of each plastic package device sample in the acoustic scanning image" specifically includes:
cutting to remove the boundary of the sound scanning image;
carrying out graying processing, filtering, binarization processing and secondary filtering processing on the cut sweep image in sequence;
expanding the acoustic scanning image subjected to the graying processing, the filtering, the binarization processing and the secondary filtering processing so as to enable a pin area of each plastic packaged device sample in the acoustic scanning image and a device main body area to form a communicated area;
acquiring a minimum rectangular frame containing each connected region, and selecting a rectangular frame of which the length is within a preset length range and the width is within a preset width range from the acquired rectangular frames;
and according to the selected rectangular frame, segmenting the original sound scanning image acquired by the ultrasonic microscope to obtain a local image of each plastic package device sample.
Further, a preferred technical solution provided by the present invention is:
the step of screening the extracted local images to obtain the local images of the potential failure samples specifically comprises the following steps:
primarily screening the local images to obtain local images containing potential defects;
performing secondary screening on the local images to remove the local images of which potential defects are noise points in the local images obtained by the primary screening; the noise points are potential defects which exist in isolation in the local image and have an area smaller than a preset threshold value.
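The noise-point rule above (isolated defect regions smaller than a preset area are discarded) can be sketched as follows. This is an illustrative implementation, not the patent's own code: connected regions are collected with a simple 4-neighbour flood fill, where an image-processing library would normally be used, and the threshold value is a placeholder.

```python
import numpy as np
from collections import deque

def remove_noise_points(mask: np.ndarray, min_area: int) -> np.ndarray:
    """Zero out connected defect regions whose area is below min_area."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    out = mask.copy()
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # flood fill to collect one connected region
                comp, q = [], deque([(y, x)])
                seen[y, x] = True
                while q:
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) < min_area:  # isolated small region -> noise point
                    for cy, cx in comp:
                        out[cy, cx] = 0
    return out
```

In practice the same effect is obtained from a connected-component labelling routine followed by an area filter.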
Further, a preferred technical solution provided by the present invention is:
Before the step of "extracting image features of the local image of the potential failure sample", the failure type of the potential failure sample is determined by the following steps:
performing graying and binarization on the local image of the potential failure sample in sequence;
extracting the device body contour of the potential failure sample from the local image after graying and binarization;
dilating the local image after graying and binarization, so that the layered regions of each pin in the potential failure sample form one connected region, and the device body region of the potential failure sample forms another connected region;
determining the failure type of the potential failure sample according to the positional relationship between the centroids of the connected regions and the device body contour: if the centroids of all connected regions lie outside the device body contour, the potential failure sample is of the first failure type; if the centroids of all connected regions lie inside the device body contour, it is of the second failure type; if centroids of connected regions lie both inside and outside the device body contour, it is of the third failure type.
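The three-way rule above reduces to testing each centroid against the device body contour. A minimal sketch, assuming the contour is approximated by an axis-aligned rectangle (x0, y0, x1, y1); a real implementation would test against the extracted contour polygon itself:

```python
def classify_failure(centroids, body_rect):
    """Return 1 if all centroids are outside the body contour,
    2 if all are inside, 3 if they occur on both sides."""
    x0, y0, x1, y1 = body_rect
    inside = [x0 <= cx <= x1 and y0 <= cy <= y1 for cx, cy in centroids]
    if all(inside):
        return 2
    if not any(inside):
        return 1
    return 3
```

The rectangle approximation is only for illustration; the decision logic is unchanged for an arbitrary contour given a point-in-polygon test.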
Further, a preferred technical solution provided by the present invention is:
The step of "extracting image features of the local image of the potential failure sample" specifically includes:
extracting the image features of the local image of a first-failure-type potential failure sample according to the following formula:

Per1_i = Area_i / Area_yi

wherein Per1_i is the image feature of the i-th defect region, Area_i is the area of the i-th defect region, Area_yi is the area of the pin region in which the i-th defect region lies, and Area_yi = Area_ypi · η1, where Area_ypi is the area of the connected region corresponding to that pin region and η1 is a correction factor given by

η1 = ε · Area_ol / Area_pol

wherein Area_ol is the sum of the areas of all pin regions in the local image before dilation, Area_pol is the sum of the areas of the connected regions corresponding to all pin regions in the local image after dilation, and ε is a preset empirical coefficient;
extracting the image features of the local image of a second-failure-type potential failure sample according to the following formula:

Per2 = Σ_j Area_j / (Area_pz · η2)

wherein Per2 is the ratio of the sum of the defect-region areas, Σ_j Area_j, to the corrected area of the connected region corresponding to the device body after dilation; Area_j is the area of the j-th defect region, Area_pz is the area of that connected region, and η2 is a correction factor given by

η2 = ε · Area_z / Area_pz

wherein Area_z is the area of the device body region before dilation;
extracting the image features of the local image of a third-failure-type potential failure sample by applying both of the above extraction methods.
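Numerically, the two features are area ratios with a dilation correction. The sketch below follows the variable definitions in the text; since the original formula images are not reproduced in this copy, the exact form of the correction factors (post-dilation area scaled back by ε times the pre/post area ratio) is an assumption, and all names are illustrative:

```python
def per1(area_defect, area_pin_dilated, area_pins_before, area_pins_after, eps):
    """Defect-to-pin area ratio for first-failure-type samples."""
    eta1 = eps * area_pins_before / area_pins_after  # dilation correction
    return area_defect / (area_pin_dilated * eta1)

def per2(defect_areas, area_body_dilated, area_body_before, eps):
    """Total-defect-to-body area ratio for second-failure-type samples."""
    eta2 = eps * area_body_before / area_body_dilated  # dilation correction
    return sum(defect_areas) / (area_body_dilated * eta2)
```

For a third-failure-type sample both functions would be applied, matching the text's instruction to reuse the first- and second-type extraction methods.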
Further, a preferred technical solution provided by the present invention is:
the step of performing network training on the recognition model based on the extracted image features to obtain the optimized recognition model specifically comprises the following steps:
performing network training on a plurality of pre-constructed recognition models based on the image features, and selecting the model with the highest recognition accuracy as the final optimized recognition model.
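The selection step above is plain best-of-N model selection. In this sketch, trivial threshold "models" stand in for the neural-network and SVM candidates named in the text, purely to show the selection logic; names and thresholds are illustrative:

```python
def accuracy(model, X, y):
    """Fraction of samples the model labels correctly."""
    return sum(model(x) == t for x, t in zip(X, y)) / len(y)

def select_best_model(candidates, X_val, y_val):
    """Keep the candidate with the highest validation accuracy."""
    return max(candidates, key=lambda m: accuracy(m, X_val, y_val))

# two hypothetical candidates: label "failed" (1) when the feature exceeds a threshold
model_a = lambda x: int(x > 0.3)
model_b = lambda x: int(x > 0.7)
```

With real candidates, each model would first be trained on the extracted image features and then scored on a held-out validation set in the same way.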
Further, a preferred technical solution provided by the present invention is:
The step of "determining the failure type of the potential failure sample according to the positional relationship between the centroid of the connected region and the device body contour" comprises the following steps:
calculating the distance between the centroid of the connected region and each boundary of the device body contour;
determining the positional relationship between the centroid of the connected region and the device body contour according to the calculated distances.
In a second aspect, the failure recognition apparatus for a plastic package device in the present invention includes:
the device comprises an image acquisition module, an identification model and a model training module, wherein the identification model is a model constructed based on a machine learning algorithm; the image acquisition module is configured to acquire a sound scanning image containing a plurality of plastic packaged devices through an ultrasonic microscope; the identification model is configured to identify the sweep image acquired by the image acquisition module to obtain a failed plastic package device; the model training module is configured to perform network training on the recognition model to obtain an optimized recognition model;
the model training module comprises an image acquisition sub-module, an image extraction sub-module, an image screening sub-module, a feature extraction sub-module and a training sub-module;
the image acquisition submodule is configured to acquire a sound scanning image containing a plurality of plastic package device samples through an ultrasonic microscope; the image extraction submodule is configured to extract a local image of each plastic package device sample in the acoustic scanning image acquired by the image acquisition submodule; the image screening submodule is configured to screen the local image extracted by the image extraction submodule to obtain a local image of a potential failure sample; the feature extraction submodule is configured to extract image features of the local image of the potential failure sample obtained by screening of the image screening submodule; and the training submodule is configured to perform network training on the recognition model based on the image features extracted by the feature extraction submodule to obtain the optimized recognition model.
Further, a preferred technical solution provided by the present invention is:
the image extraction submodule comprises an image preprocessing unit and a local image extraction unit;
the image preprocessing unit is configured to perform the following operations:
cutting to remove the boundary of the sound scanning image;
carrying out graying processing, filtering, binarization processing and secondary filtering processing on the cut sweep image in sequence;
expanding the acoustic scanning image subjected to the graying processing, the filtering, the binarization processing and the secondary filtering processing so as to enable a pin area of each plastic packaged device sample in the acoustic scanning image and a device main body area to form a communicated area;
the local image extraction unit is configured to perform the following operations:
acquiring a minimum rectangular frame containing the connected region obtained by each image preprocessing unit, and selecting a rectangular frame of which the length is within a preset length range and the width is within a preset width range from the acquired rectangular frames;
and according to the selected rectangular frame, segmenting the original sound scanning image acquired by the ultrasonic microscope to obtain a local image of each plastic package device sample.
Further, a preferred technical solution provided by the present invention is:
the image screening submodule comprises a primary screening unit and a secondary screening unit;
the preliminary screening unit is configured to preliminarily screen the local images to obtain local images containing potential defects;
the secondary screening unit is configured to perform secondary screening on the local images screened by the primary screening unit so as to remove the local images of which the potential defects are noise points; the noise points are potential defects which exist in isolation in the local image and have an area smaller than a preset threshold value.
Further, a preferred technical solution provided by the present invention is:
the feature extraction submodule comprises a potential failure sample type determination unit; the potential failure sample type determining unit comprises a first image preprocessing subunit, a device body contour extraction subunit, a second image preprocessing subunit and a sample failure type determining subunit;
the first image preprocessing subunit is configured to sequentially perform graying processing and binarization processing on the local image of the potential failure sample;
the device body contour extraction subunit is configured to extract the device body contour of the potentially failed sample in the local image obtained by the first image preprocessing subunit;
the second image preprocessing subunit is configured to perform expansion processing on the local image obtained by the first image preprocessing subunit, so that each layered region of the same pin in the potential failure sample forms a connected region, and a device body region in the potential failure sample forms a connected region;
the sample failure type determining subunit is configured to determine the failure type of the potential failure sample according to the position relationship between the centroid of the connected region and the device body outline: if the centroids of all the connected regions are outside the device body profile, the potential failure sample is of a first failure type; if the centroids of all connected regions are within the device body profile, the potential failure sample is of a second failure type; if both the interior and exterior of the device body profile contain the centroid of the connected region, then the potential failure sample is of a third failure type.
Further, a preferred technical solution provided by the present invention is:
the feature extraction sub-module comprises a first type image feature extraction unit, a second type image feature extraction unit and a third type image feature extraction unit;
the first-type image feature extraction unit is configured to extract image features of the local image of a first-failure-type potential failure sample according to the following formula:

Per1_i = Area_i / Area_yi

wherein Per1_i is the image feature of the i-th defect region, Area_i is the area of the i-th defect region, Area_yi is the area of the pin region in which the i-th defect region lies, and Area_yi = Area_ypi · η1, where Area_ypi is the area of the connected region corresponding to that pin region and η1 is a correction factor given by

η1 = ε · Area_ol / Area_pol

wherein Area_ol is the sum of the areas of all pin regions in the local image before dilation, Area_pol is the sum of the areas of the connected regions corresponding to all pin regions in the local image after dilation, and ε is a preset empirical coefficient;
the second-type image feature extraction unit is configured to extract image features of the local image of a second-failure-type potential failure sample according to the following formula:

Per2 = Σ_j Area_j / (Area_pz · η2)

wherein Per2 is the ratio of the sum of the defect-region areas, Σ_j Area_j, to the corrected area of the connected region corresponding to the device body after dilation; Area_j is the area of the j-th defect region, Area_pz is the area of that connected region, and η2 is a correction factor given by

η2 = ε · Area_z / Area_pz

wherein Area_z is the area of the device body region before dilation;
the third-type image feature extraction unit is configured to control the first-type and second-type image feature extraction units, respectively, to extract the image features of the local image of a third-failure-type potential failure sample.
Further, a preferred technical solution provided by the present invention is:
the training submodule comprises a model training unit;
and the model training unit is configured to respectively perform network training on a plurality of pre-constructed recognition models based on the image characteristics, and select the recognition model with the highest recognition accuracy as the finally optimized recognition model.
Further, a preferred technical solution provided by the present invention is:
the feature extraction submodule further comprises a positional relationship determination unit;
the positional relationship determination unit is configured to perform the following operations:
calculating the distance between the centroid of the connected region and each boundary of the device body contour;
determining the positional relationship between the centroid of the connected region and the device body contour according to the calculated distances.
Further, a preferred technical solution provided by the present invention is:
the failure recognition device further comprises a human-computer interaction module.
In a third aspect, the storage device in the present invention stores a plurality of programs, and the programs are suitable for being loaded and executed by a processor to implement the failure identification method for plastic packaged devices according to the above technical solution.
In a fourth aspect, a processing apparatus in the present invention comprises:
a processor adapted to execute various programs;
a storage device adapted to store a plurality of programs;
the method is characterized in that the program is suitable for being loaded and executed by a processor to realize the failure identification method of the plastic package device in the technical scheme.
Compared with the closest prior art, the technical scheme at least has the following beneficial effects:
1. According to the failure identification method of the invention, an acoustic scan image containing a plurality of plastic package devices is acquired through an ultrasonic microscope, and the acquired image is identified based on a pre-constructed recognition model to obtain the failed plastic package devices. Based on these steps, the method automatically identifies the acoustic scan image of plastic package devices with high identification accuracy.
2. The failure identification method accurately calculates the centroid of each defect region in a plastic package device, determines the failure type of the device according to the positional relationship between these centroids and the device body contour, and then extracts image features of devices of different failure types with different feature extraction methods. This process does not depend on the subjective judgment and experience of inspectors, accurately captures the defect characteristics of the device, and reduces the detection error rate.
3. When extracting the local image of each plastic package device from the acoustic scan image, the failure identification method performs cropping, graying, filtering, binarization, secondary filtering and dilation in sequence, which filters out part of the noise points in the image, makes the background of the plastic package devices smoother, and reduces interference with subsequent image feature extraction.
4. The failure identification method automatically identifies an acoustic scan image containing a plurality of plastic package devices, i.e., it identifies many devices at once and does not need a separate acoustic scan image for each device. This greatly improves identification efficiency and reduces time cost.
5. The recognition model is constructed based on a machine learning algorithm and has self-organizing, self-learning, adaptive, fault-tolerant and error-correcting properties; it can accurately identify plastic package devices without obvious failure signatures and reduces the detection error rate.
Drawings
Fig. 1 illustrates the main steps of a failure identification method for a plastic packaged device in an embodiment of the present invention;
FIG. 2 is a schematic view of a sonographic image including a plurality of plastic packaged devices according to an embodiment of the present invention;
FIG. 3 illustrates the main steps of a recognition model training method according to an embodiment of the present invention;
fig. 4 is a schematic partial image of a single plastic packaged device in an embodiment of the invention;
fig. 5 is a schematic outline view of a main body of a plastic package device according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of region partition of a partial image of a potentially failing sample according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a first failure type plastic encapsulated device in an embodiment of the invention;
fig. 8 is a schematic diagram of a second failure type plastic encapsulated device in an embodiment of the invention;
fig. 9 is a schematic diagram of a third failure-type plastic encapsulated device in an embodiment of the invention;
fig. 10 is a main structure of a failure recognition apparatus for a plastic package device according to an embodiment of the present invention.
Detailed Description
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and are not intended to limit the scope of the present invention.
Referring to fig. 1, fig. 1 schematically illustrates main steps of a failure identification method for a plastic packaged device in this embodiment. As shown in fig. 1, in this embodiment, a failed plastic package device can be identified according to the following steps:
step S101: and acquiring a sound scanning image containing a plurality of plastic packaged devices through an ultrasonic microscope.
Referring to fig. 2, which schematically shows an acoustic scan image containing a plurality of plastic package devices in this embodiment: the image obtained by the ultrasonic microscope contains 15 plastic package devices distributed in 3 rows and 5 columns. It should be noted that although this embodiment only discloses an acoustic scan image containing 15 plastic package devices in this particular arrangement, the scope of the invention is obviously not limited to this specific example. Without departing from the principle of the invention, a person skilled in the art may change the number and arrangement of the plastic package devices, and such changed solutions still fall within the protection scope of the invention.
Step S102: identifying the acquired acoustic scan image based on the pre-constructed recognition model to obtain the failed plastic package devices.
The recognition model in this embodiment refers to a model constructed based on a machine learning algorithm, and for example, the recognition model may be a recognition model based on Logistic multiple regression, a recognition model based on a self-organizing competitive neural network, a recognition model based on a probabilistic neural network, a recognition model based on a recurrent neural network, or a recognition model based on a support vector machine, and the like.
With continuing reference to FIG. 3, FIG. 3 illustrates the main steps of a recognition model training method in this embodiment. As shown in fig. 3, in this embodiment, the network training may be performed on the recognition model pre-constructed in the method shown in fig. 1 according to the following steps:
step S201: and acquiring a sound scanning image containing a plurality of plastic packaging device samples through an ultrasonic microscope. In the embodiment, the ultrasonic microscope scans the plurality of plastic package device samples at one time, so that a sweep image containing the plurality of plastic package device samples can be obtained, each plastic package device sample does not need to be scanned, and the workload of obtaining the sweep image is reduced.
Step S202: and extracting a local image of each plastic packaged device sample in the acoustic scanning image.
In this embodiment, the sweep image refers to an image including a plurality of plastic package device samples, and therefore, a local image of each plastic package device sample in the sweep image needs to be extracted according to the following steps:
step S2021: cropping removes the boundaries of the sweep image.
In the embodiment, the boundary of the acoustic scanning image is cut and removed, and only the device body part of the plastic package device sample is reserved, so that the size of the acoustic scanning image can be reduced, and the calculation amount of subsequent image processing can be reduced.
Step S2022: and carrying out graying processing, filtering, binarization processing and secondary filtering processing on the cut sweep image in sequence.
In this embodiment, the cropped acoustic scan image is first converted from an RGB image to a grayscale image. The grayscale image is then filtered to remove isolated noise points and smooth the image. Finally, the filtered grayscale image is converted into a binary image, and secondary filtering is applied to the binary image so that the contour of each plastic package device sample is clearly displayed; at this point the pin region and the device body region of each sample are still separated.
Step S2023: performing expansion processing on the acoustic scan image after the graying, filtering, binarization and secondary filtering, so that the pin area and the device body area of each plastic package device sample in the image form one connected region. As noted above, the pin portion and the device body portion of each sample are separated after the binarization and secondary filtering; the expansion processing (morphological dilation) in this embodiment therefore merges the pin area and the device body area of each plastic package device sample into a single connected region.
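The expansion processing is standard morphological dilation. A minimal sketch, assuming a 3×3 square structuring element (the patent does not name the element), shows how one dilation pass merges two regions separated by a single background pixel:

```python
import numpy as np

def dilate(binary, iterations=1):
    """Binary dilation with a 3x3 square structuring element."""
    out = binary.astype(bool)
    for _ in range(iterations):
        padded = np.pad(out, 1)          # zero-pad the border
        grown = np.zeros_like(out)
        for r in range(3):
            for c in range(3):           # OR together the nine shifted copies
                grown |= padded[r:r + out.shape[0], c:c + out.shape[1]]
        out = grown
    return out.astype(np.uint8)

# Two separated components (a "pin" and a "body") one background pixel apart
img = np.array([[1, 0, 1]], dtype=np.uint8)
merged = dilate(img)
```

More dilation iterations bridge wider gaps; the number of iterations would be tuned to the pin-to-body spacing of the device family.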
Step S2024: acquiring the minimal rectangular frame containing each connected region, and selecting, from the acquired rectangular frames, those whose length is within a preset length range and whose width is within a preset width range.
In this embodiment, before the minimal rectangular frame containing each connected region is acquired, isolated regions with small areas in the image need to be deleted. Selecting only rectangular frames whose length and width fall within the preset ranges then effectively removes noise regions whose rectangular frames are too large or too small, so that only the connected regions of the plastic package device samples are retained.
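Step S2024 amounts to finding connected components, taking each component's minimal bounding box, and keeping only boxes of plausible size. A pure-Python sketch under that reading follows; the preset length and width ranges are placeholder values, not the patent's:

```python
from collections import deque

def components(grid):
    """4-connected components of a binary grid; returns lists of (row, col) pixels."""
    h, w = len(grid), len(grid[0])
    seen, comps = set(), []
    for sr in range(h):
        for sc in range(w):
            if grid[sr][sc] and (sr, sc) not in seen:
                queue, comp = deque([(sr, sc)]), []
                seen.add((sr, sc))
                while queue:                      # breadth-first flood fill
                    r, c = queue.popleft()
                    comp.append((r, c))
                    for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                        if 0 <= nr < h and 0 <= nc < w and grid[nr][nc] \
                                and (nr, nc) not in seen:
                            seen.add((nr, nc))
                            queue.append((nr, nc))
                comps.append(comp)
    return comps

def plausible_boxes(grid, len_range, wid_range):
    """Minimal bounding box of each component, kept only if its size is plausible."""
    boxes = []
    for comp in components(grid):
        rows = [r for r, _ in comp]
        cols = [c for _, c in comp]
        length = max(cols) - min(cols) + 1
        width = max(rows) - min(rows) + 1
        if len_range[0] <= length <= len_range[1] and wid_range[0] <= width <= wid_range[1]:
            boxes.append((min(rows), min(cols), max(rows), max(cols)))
    return boxes

# A 2x3 device blob survives; the single noise pixel at (1, 4) is rejected.
grid = [
    [1, 1, 1, 0, 0],
    [1, 1, 1, 0, 1],
]
boxes = plausible_boxes(grid, len_range=(2, 4), wid_range=(2, 4))
```

Each surviving box would then be used to crop the original acoustic scan image, as step S2025 describes.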
Step S2025: and according to the selected rectangular frame, segmenting the original sound scanning image acquired by the ultrasonic microscope to obtain a local image of each plastic package device sample. In this embodiment, after the local image of each plastic package device sample is obtained, each local image may be further amplified, so as to perform image feature extraction.
Referring to fig. 4, fig. 4 schematically illustrates a partial image of a single plastic packaged device in this embodiment. As shown in fig. 4, a local image of a single plastic package device in this embodiment mainly includes a device body region and a lead region, where a device entity portion corresponding to the device body region is mainly an interface between a lead terminal bonding pad of the device and a molding compound, and a device entity portion corresponding to the lead region is mainly a lead frame of the device.
Step S203: and screening the extracted local images to obtain the local images of the potential failure samples. In this embodiment, the extracted local images may be screened according to the following steps to obtain a local image of a potential failure sample with a high failure probability:
Step S2031: performing primary screening on the local images to obtain the local images containing potential defects. In this embodiment, the ultrasonic microscope may highlight the potential defect regions of a plastic package device sample in the image; for example, potential defect regions are displayed in red while normal regions are displayed in other colors.
Step S2032: performing secondary screening on the local images to remove the local images of which the potential defects are noise points in the local images obtained by the primary screening; the noise points are potential defects which exist in isolation in the local image and have an area smaller than a preset threshold value.
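The secondary screening of step S2032 can be sketched as a filter over local images, where a potential defect counts as noise when it is an isolated region smaller than a preset area threshold; the threshold of 5 pixels below is an assumed placeholder, and the dictionary layout of a "local image" is an illustrative convention of this sketch.

```python
def screen_local_images(images, min_area=5):
    """Drop local images whose potential defects are all isolated regions
    smaller than min_area pixels (the noise criterion of step S2032)."""
    survivors = []
    for img in images:
        real = [d for d in img["defects"] if len(d) >= min_area]
        if real:
            survivors.append({"id": img["id"], "defects": real})
    return survivors

samples = [
    {"id": "A", "defects": [[(0, 0)]]},                                   # lone noise pixel
    {"id": "B", "defects": [[(1, 1), (1, 2), (2, 1), (2, 2), (3, 2)]]},   # real defect
]
kept = screen_local_images(samples)
```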
Step S204: image features of a partial image of a potentially failing sample are extracted.
In this embodiment, before extracting the image features, it is also necessary to determine the failure type of the potential failure sample, so as to extract the image features by adopting different methods for potential failure samples of different failure types. Specifically, the type of the potentially failing sample may be determined according to the following steps in this embodiment:
step S2041: and sequentially carrying out gray processing and binarization processing on the local image of the potential failure sample.
Step S2042: and extracting the device body outline of the potential failure sample in the local image after the graying processing and the binarization processing.
Referring to fig. 5, fig. 5 schematically illustrates a device body profile of a plastic encapsulated device sample in accordance with the present embodiment. As shown in fig. 5, the device body contour in this embodiment refers to a contour of a device body portion corresponding to the device body region in the partial image.
Step S2043: performing expansion processing on the local image after the graying processing and the binarization processing, so that each layered area of the same pin in the potential failure sample forms one connected region, and the device body area in the potential failure sample forms one connected region. In this embodiment, after the image is binarized, a pin region of the potential failure sample may appear in a layered, separated state; the layered regions of the same pin then need to be merged into a single connected region, i.e., an integral whole, by expansion processing. Likewise, a potential defect may exist in the device body region of the potential failure sample, so that the device body region is not a complete region; the device body region therefore also needs to be merged into a single connected region by expansion processing.
Step S2044: determining the failure type of the potential failure sample according to the positional relationship between the centroids of the connected regions and the device body contour: if the centroids of all connected regions are outside the device body contour, the potential failure sample is of a first failure type; if the centroids of all connected regions are within the device body contour, the potential failure sample is of a second failure type; if both the interior and the exterior of the device body contour contain centroids of connected regions, the potential failure sample is of a third failure type. In this embodiment, a Cartesian rectangular coordinate system can be constructed in the local image of the potential failure sample; the position coordinates of each boundary of the sample and the centroid coordinates of the connected regions are then obtained in this coordinate system, from which the failure type of the potential failure sample follows.
Referring to fig. 6, fig. 6 illustrates the region division of a local image of a potential failure sample in this embodiment. As shown in fig. 6, the local image of the potential failure sample may be divided into 9 regions, where region 5 is the region in which the device body of the potential failure sample is located and the other 8 regions are distributed around region 5 in a 3×3 grid. The parameter x denotes the smaller of the distances between the centroid of a connected region and the left and right boundaries of region 5, and the parameter y denotes the smaller of the distances between the centroid of a connected region and the upper and lower boundaries of region 5.

If the centroids of all connected regions in the local image are located in any one or more of regions 1, 2, 3, 4, 6, 7, 8 and 9, the corresponding potential failure sample is of the first failure type. If the centroids of all connected regions in the local image are located in region 5, the corresponding potential failure sample is of the second failure type. If the local image contains both a connected region whose centroid is located in region 5 and a connected region whose centroid is located in one of the other 8 regions, the corresponding potential failure sample is of the third failure type.

For example, if the centroid coordinates of the connected regions in a local image are (9, 11.89), (0, 12.45), (0, 14.06) and (6.10, 22.42), the corresponding potential failure sample is of the first failure type. If the centroid coordinates of the connected regions in a local image are all (0, 0), the corresponding potential failure sample is of the second failure type. If the centroid coordinates of the connected regions in a local image are (9, 11.89), (0, 12.45) and (0, 0), the corresponding potential failure sample is of the third failure type.
With continuing reference to fig. 7-9, fig. 7 exemplarily shows a first failure type plastic package device in the present embodiment, fig. 8 exemplarily shows a second failure type plastic package device in the present embodiment, and fig. 9 exemplarily shows a third failure type plastic package device in the present embodiment, wherein the black filling region is the connected region obtained through step S2043. As shown in fig. 7, in the present embodiment, the centroids of all connected regions in the local image of the first failure type plastic package device are outside the device body profile, that is, the defect region is located in the lead region. As shown in fig. 8, in the present embodiment, the centroids of all connected regions in the local image of the second failure type plastic packaged device are within the device body profile, that is, the defect region is located in the device body region. As shown in fig. 9, in the local image of the third failure type plastic package device in this embodiment, both the inside and the outside of the device body outline include centroids of connected regions, that is, defect regions are located in the device body region and the pin region at the same time.
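Under the simplifying assumption that the device body contour is approximated by its axis-aligned bounding rectangle (the patent works with the extracted contour itself), the three-way classification of step S2044 can be sketched as:

```python
def centroid(pixels):
    """Centroid of a connected region given as (row, col) pixels."""
    return (sum(p[0] for p in pixels) / len(pixels),
            sum(p[1] for p in pixels) / len(pixels))

def failure_type(regions, body_box):
    """Classify by where region centroids fall relative to the body rectangle.

    body_box = (top, left, bottom, right). Per the patent:
    all centroids outside -> type 1, all inside -> type 2, mixed -> type 3.
    """
    top, left, bottom, right = body_box
    inside = [top <= r <= bottom and left <= c <= right
              for r, c in map(centroid, regions)]
    if not any(inside):
        return 1
    if all(inside):
        return 2
    return 3
```

For a body rectangle spanning rows/cols 2..6, a single-pixel region at (0, 0) yields type 1, one at (4, 4) yields type 2, and both together yield type 3.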
Further, after determining the type of the potential failure sample, the image features of the local image of the potential failure sample can be extracted according to the following steps:
in this embodiment, the image features of the local image of the first failure type potential failure sample may be extracted according to a method shown in the following formula (1):
$$\mathrm{Per1}_i = \frac{\mathrm{Area}_i}{\mathrm{Area}_{yi}} \tag{1}$$

The meaning of each parameter in formula (1) is as follows:

Per1_i is the ratio of the area Area_i of the i-th defect region to the area Area_yi of the lead region where the i-th defect region is located, where

$$\mathrm{Area}_{yi} = \frac{\mathrm{Area}_{ypi}}{\eta_1}$$

Area_ypi is the area of the connected region corresponding to that lead region, and η1 is a correction factor, with

$$\eta_1 = \frac{\mathrm{Area}_{pol}}{\mathrm{Area}_{ol}} + \varepsilon$$

where Area_ol is the sum of the areas of all the lead regions in the partial image before expansion processing, Area_pol is the sum of the areas of the connected regions corresponding to all the lead regions in the partial image after expansion processing, and ε is a preset empirical coefficient.
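The Per1 feature divides each defect area by the pin-region area corrected back to its pre-expansion size via η1. The sketch below transcribes that computation; note that the exact form of η1 = Area_pol/Area_ol + ε is itself a reconstruction from the parameter definitions (the patent's formula image is not reproduced in the text), so treat this as a sketch under that assumption.

```python
def per1(defect_areas, pin_region_areas_dilated, area_ol, area_pol, eps=0.0):
    """Per1_i = Area_i / Area_yi, with Area_yi = Area_ypi / eta1 and
    eta1 = Area_pol / Area_ol + eps (reconstructed correction factor)."""
    eta1 = area_pol / area_ol + eps
    return [a_i / (a_ypi / eta1)
            for a_i, a_ypi in zip(defect_areas, pin_region_areas_dilated)]

# Dilation grew total pin area from 100 to 150 (eta1 = 1.5), so a dilated pin
# of area 60 corresponds to a true pin area of 40; a 30-pixel defect on it
# gives Per1 = 0.75.
features = per1([30.0], [60.0], area_ol=100.0, area_pol=150.0)
```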
In a preferred implementation of this embodiment, the lead region in which each defect region outside the device body region is located can be determined according to the following formula (2):

$$\min_k\left(|x_i - x_k| + |y_i - y_k|\right) \tag{2}$$

The meaning of each parameter in formula (2) is as follows: (x_i, y_i) are the centroid coordinates of the i-th defect region, and (x_k, y_k) are the centroid coordinates of the k-th lead region.
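Formula (2) assigns each defect outside the device body to the lead region whose centroid minimizes the Manhattan distance. A minimal sketch:

```python
def nearest_pin(defect_centroid, pin_centroids):
    """Index of the pin region minimizing |x_i - x_k| + |y_i - y_k| (formula (2))."""
    xi, yi = defect_centroid
    return min(range(len(pin_centroids)),
               key=lambda k: abs(xi - pin_centroids[k][0])
                             + abs(yi - pin_centroids[k][1]))

# A defect at (0, 10) is closest (in Manhattan distance) to the pin at (0, 9).
idx = nearest_pin((0, 10), [(0, 0), (0, 9), (5, 5)])
```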
In this embodiment, the image features of the local image of the second failure type potential failure sample may be extracted according to a method shown in the following formula (3):
$$\mathrm{Per2} = \frac{\sum_j \mathrm{Area}_j}{\mathrm{Area}_{pz}} \cdot \eta_2 \tag{3}$$

The meaning of each parameter in formula (3) is as follows:

Per2 is the ratio of the sum of the defect region areas, Σ_j Area_j, to the area Area_pz of the connected region corresponding to the device body region after expansion processing; Area_j is the area of the j-th defect region; η2 is a correction factor, with

$$\eta_2 = \frac{\mathrm{Area}_{pz}}{\mathrm{Area}_z}$$

where Area_z is the area of the device body region before expansion processing.
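The Per2 feature can be transcribed in the same way; as with η1, the exact form of η2 = Area_pz/Area_z is a reconstruction from the parameter definitions, so this is a sketch under that assumption (with this form, Per2 reduces to the total defect area over the pre-expansion body area).

```python
def per2(defect_areas, area_pz, area_z):
    """Per2 = (sum_j Area_j / Area_pz) * eta2, with eta2 = Area_pz / Area_z
    (reconstructed correction factor for formula (3))."""
    eta2 = area_pz / area_z
    return sum(defect_areas) / area_pz * eta2

# Body area grew from 100 to 200 under dilation (eta2 = 2); two 10-pixel
# defects give Per2 = 20 / 100 = 0.2 of the true body area.
feature = per2([10.0, 10.0], area_pz=200.0, area_z=100.0)
```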
In this embodiment, the image features of the local image of the third failure type potential failure sample may be extracted according to the image feature extraction method of the first failure type potential failure sample shown in formula (1) and the image feature extraction method of the second failure type potential failure sample shown in formula (3).
Step S205: and performing network training on the recognition model based on the extracted image characteristics to obtain the optimized recognition model.
Specifically, in this embodiment, network training may be performed on a plurality of pre-constructed recognition models respectively based on image features, and the recognition model with the highest recognition accuracy may be selected as the finally optimized recognition model. For example, a recognition model based on Logistic multiple regression, a recognition model based on self-organizing competitive neural network, a recognition model based on probabilistic neural network, a recognition model based on recurrent neural network, a recognition model based on support vector machine and other models are respectively subjected to network training, and the model with the highest recognition accuracy rate in the recognition models is selected as the finally optimized recognition model.
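The select-the-most-accurate-model strategy above can be sketched generically: train each candidate, score it on held-out data, and keep the best. The two toy candidates below are stand-ins for the patent's logistic-regression, neural-network and SVM recognizers; their names and the data are illustrative only.

```python
def select_best_model(models, train, val):
    """Train each candidate and keep the one with the highest validation accuracy.

    `models` maps a name to a fit(X, y) -> predict-callable factory."""
    Xtr, ytr = train
    Xva, yva = val
    best_name, best_acc, best_clf = None, -1.0, None
    for name, fit in models.items():
        predict = fit(Xtr, ytr)
        acc = sum(predict(x) == y for x, y in zip(Xva, yva)) / len(yva)
        if acc > best_acc:
            best_name, best_acc, best_clf = name, acc, predict
    return best_name, best_acc, best_clf

def fit_majority(X, y):
    """Baseline: always predict the most frequent training label."""
    major = max(set(y), key=y.count)
    return lambda x: major

def fit_threshold(X, y):
    """One-feature rule: predict 1 when feature 0 exceeds the training mean."""
    t = sum(v[0] for v in X) / len(X)
    return lambda x: int(x[0] > t)

train = ([[0.1], [0.2], [0.8], [0.9]], [0, 0, 1, 1])
val = ([[0.15], [0.85]], [0, 1])
name, acc, _ = select_best_model(
    {"majority": fit_majority, "threshold": fit_threshold}, train, val)
```

In a real pipeline the candidates would wrap actual learners, and model selection would use cross-validation rather than a single split.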
Although the foregoing embodiments describe the steps in the above sequential order, those skilled in the art will understand that, in order to achieve the effect of the present embodiments, the steps may not be executed in such an order, and may be executed simultaneously (in parallel) or in an inverse order, and these simple variations are within the scope of the present invention.
Based on the same technical concept as the method embodiment, the embodiment of the invention also provides a failure identification device of the plastic package device. The failure recognition device of the plastic package device is specifically described below with reference to the accompanying drawings.
Referring to fig. 10, fig. 10 illustrates the main structure of a failure recognition apparatus for plastic package devices in this embodiment. As shown in fig. 10, the failure recognition apparatus in this embodiment may include an image acquisition module 11, a recognition model 12 and a model training module 13, where the recognition model 12 is a model constructed based on a machine learning algorithm. The image acquisition module 11 may be configured to acquire, through an ultrasonic microscope, an acoustic scan image containing a plurality of plastic package devices. The recognition model 12 may be configured to recognize the acoustic scan image acquired by the image acquisition module to obtain the failed plastic package devices. The model training module 13 may be configured to perform network training on the recognition model to obtain an optimized recognition model.
The model training module 13 in this embodiment may include an image acquisition sub-module, an image extraction sub-module, an image screening sub-module, a feature extraction sub-module, and a training sub-module. The image acquisition sub-module can be configured to acquire a sound scanning image containing a plurality of plastic packaged device samples through the ultrasonic microscope. The image extraction sub-module may be configured to extract a local image of each plastic packaged device sample in the acoustic scan image acquired by the image acquisition sub-module. The image screening submodule can be configured to screen the local images extracted by the image extraction submodule to obtain the local images of the potential failure samples. The feature extraction submodule can be configured to extract image features of local images of potential failure samples obtained by screening of the image screening submodule, and the training submodule can be configured to perform network training on the recognition model based on the image features extracted by the feature extraction submodule to obtain the optimized recognition model.
Further, the image extraction sub-module in this embodiment may include an image preprocessing unit and a local image extraction unit.
The image preprocessing unit in this embodiment may be configured to perform the following operations: cropping removes the boundaries of the sweep image. And carrying out graying processing, filtering, binarization processing and secondary filtering processing on the cut sweep image in sequence. And expanding the acoustic scanning image subjected to graying processing, filtering, binarization processing and secondary filtering processing so as to enable a pin area of each plastic package device sample in the acoustic scanning image and a device main body area to form a communicated area.
The partial image extraction unit in the present embodiment may be configured to perform the following operations: and acquiring a minimum rectangular frame containing the connected region obtained by each image preprocessing unit, and selecting the rectangular frame with the length within a preset length range and the width within a preset width range from the acquired rectangular frames. And according to the selected rectangular frame, segmenting the original sound scanning image acquired by the ultrasonic microscope to obtain a local image of each plastic package device sample.
Further, the image filtering sub-module in this embodiment may include a primary filtering unit and a secondary filtering unit. The preliminary screening unit may be configured to perform preliminary screening on the local images to obtain local images containing potential defects. The secondary screening unit can be configured to perform secondary screening on the local images screened by the primary screening unit so as to remove the local images of which the potential defects are noise points; the noise points are potential defects which exist in isolation in the local image and have an area smaller than a preset threshold value.
Further, in this embodiment, the feature extraction sub-module may include a potential failure sample type determination unit, and the potential failure sample type determination unit may include a first image preprocessing sub-unit, a device body contour extraction sub-unit, a second image preprocessing sub-unit, and a sample failure type determination sub-unit. The first image preprocessing subunit can be configured to sequentially perform graying processing and binarization processing on the local image of the potentially failed sample. The device body contour extraction subunit may be configured to extract a device body contour of a potentially failing sample in the local image obtained by the first image preprocessing subunit. The second image preprocessing subunit may be configured to perform dilation processing on the local image obtained by the first image preprocessing subunit, so that each hierarchical region of the same pin in the potential failure sample forms a connected region, and the device body region in the potential failure sample forms a connected region. The sample failure type determination subunit may be configured to determine the failure type of the potential failure sample according to the positional relationship between the centroid of the connected region and the device body profile: if the centroids of all the communication areas are outside the outline of the device body, the potential failure sample is of a first failure type; if the centroids of all the communication areas are within the outline of the device body, the potential failure sample is of a second failure type; if both the interior and exterior of the device body profile contain the centroid of the connected region, the potential failure sample is of a third failure type.
Further, the feature extraction sub-module in this embodiment may include a first-type image feature extraction unit, a second-type image feature extraction unit, and a third-type image feature extraction unit. Wherein the first type image feature extraction unit may be configured to extract image features of the partial image of the first failure type potential failure sample according to a method shown in formula (1). The second-type image feature extraction unit may be configured to extract image features of the partial image of the second failure-type potential failure sample in accordance with a method shown in formula (3). The third-type image feature extraction unit may be configured to control the first-type image feature extraction unit and the second-type image feature extraction unit to extract image features of the partial image of the third failure-type potential failure sample, respectively.
Further, the training sub-module in this embodiment may include a model training unit, and the model training unit may be configured to perform network training on a plurality of pre-constructed recognition models respectively based on the image features, and select a recognition model with the highest recognition accuracy as the finally optimized recognition model.
Further, in this embodiment, the feature extraction sub-module may further include a position relationship determination unit, and the position relationship determination unit may be configured to perform the following operations: first, the distance between the centroid of the connected region and each boundary in the device body contour is calculated. Then, the positional relationship between the centroid of the connected region and the outline of the device body is determined based on the calculated distance.
Further, the failure recognition device for the plastic package device in this embodiment may further include a human-computer interaction module, where the human-computer interaction module may display a failure recognition process of the plastic package device, and may also display content specified by the control instruction according to the entered control instruction.
Those skilled in the art will appreciate that the above-described failure recognition apparatus for plastic package devices may also include other known structures, such as processors, controllers and memories, where the memories include but are not limited to RAM, flash memory, ROM, PROM, EPROM, non-volatile memory, serial and parallel memories, and registers, and the processors include but are not limited to CPLD/FPGA, DSP, ARM and MIPS processors; these known structures are not shown in fig. 10 so as not to unnecessarily obscure embodiments of the present disclosure.
It should be understood that the number of individual modules in fig. 10 is merely illustrative. The number of modules may be any according to actual needs.
Based on the above embodiment of the failure identification method for the plastic package device, an embodiment of the present invention further provides a storage device, where the storage device stores a plurality of programs, and the programs are suitable for being loaded and executed by a processor to implement the failure identification method for the plastic package device described in the above embodiment of the method.
Further, based on the above embodiment of the method for identifying the failure of the plastic package device, the embodiment of the present invention further provides a processing apparatus, where the processing apparatus includes a processor and a storage device, where the processor may be adapted to execute various programs, the storage device may be adapted to store a plurality of programs, and the programs may be adapted to be loaded and executed by the processor to implement the method for identifying the failure of the plastic package device according to the above embodiment of the method.
Those skilled in the art will appreciate that the modules in the devices of the embodiments can be adaptively changed and placed in one or more devices other than the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims of the present invention, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or digital signal processor (DSP) may be used in practice to implement some or all of the functionality of some or all of the components in a server, client, or the like according to embodiments of the present invention. The present invention may also be embodied as an apparatus or device program (e.g., a computer program and a computer program product) for carrying out a part or all of the methods described herein. Such a program implementing the invention may be stored on a computer-readable medium, or may be in the form of one or more signals. Such a signal may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.

Claims (15)

1. A failure identification method of a plastic package device is characterized by comprising the following steps:
acquiring a sound scanning image containing a plurality of plastic packaged devices through an ultrasonic microscope;
identifying the acquired sweep image based on a pre-constructed identification model to obtain a failure plastic package device;
the recognition model is a model constructed based on a machine learning algorithm, and the training method comprises the following steps:
acquiring a sound scanning image containing a plurality of plastic packaging device samples through an ultrasonic microscope;
extracting a local image of each plastic package device sample in the acoustic scanning image;
screening the extracted local images to obtain local images of potential failure samples;
extracting image features of a local image of the potentially failing sample;
performing network training on the recognition model based on the extracted image features to obtain an optimized recognition model;
the training method further comprises the following steps of determining the failure type of the potential failure sample before extracting the image characteristics of the local image of the potential failure sample:
sequentially carrying out gray processing and binarization processing on the local image of the potential failure sample;
extracting the device main body outline of the potential failure sample in the local image after the graying processing and the binarization processing;
expanding the local image subjected to the graying processing and the binarization processing, so that each layered area of the same pin in the potential failure sample forms a connected region and the device body area in the potential failure sample forms a connected region;
determining the failure type of the potential failure sample according to the positional relationship between the centroids of the connected regions and the device body contour: if the centroids of all the connected regions are outside the device body contour, the potential failure sample is of a first failure type; if the centroids of all the connected regions are within the device body contour, the potential failure sample is of a second failure type; if both the interior and the exterior of the device body contour contain centroids of connected regions, the potential failure sample is of a third failure type.
2. The failure identification method according to claim 1, wherein the step of "extracting a partial image of each plastic package device sample in the acoustic scan image" specifically comprises:
cutting to remove the boundary of the sound scanning image;
carrying out graying processing, filtering, binarization processing and secondary filtering processing on the cut sweep image in sequence;
expanding the acoustic scanning image subjected to the graying processing, the filtering, the binarization processing and the secondary filtering processing, so that the pin area and the device body area of each plastic package device sample in the acoustic scanning image form a connected region;
acquiring a minimum rectangular frame containing each connected region, and selecting a rectangular frame of which the length is within a preset length range and the width is within a preset width range from the acquired rectangular frames;
and according to the selected rectangular frame, segmenting the original sound scanning image acquired by the ultrasonic microscope to obtain a local image of each plastic package device sample.
3. The failure recognition method according to claim 1, wherein the step of screening the extracted partial images to obtain the partial images of the potential failure samples specifically comprises:
primarily screening the local images to obtain local images containing potential defects;
performing secondary screening on the local images to remove the local images of which potential defects are noise points in the local images obtained by the primary screening; the noise points are potential defects which exist in isolation in the local image and have an area smaller than a preset threshold value.
4. The failure recognition method according to claim 1, wherein the step of "extracting image features of the local images of the potential failure samples" specifically comprises:
extracting the image features of the local image of a first-failure-type potential failure sample according to the following formula:

Per1_i = Area_i / Area_yi

wherein Per1_i is the ratio of the area Area_i of the i-th defect region to the area Area_yi of the pin region in which the i-th defect region is located, with Area_yi = Area_ypi · η1, where Area_ypi is the area of the connected region corresponding to that pin region and η1 is a correction factor given by

η1 = ε · Area_ol / Area_pol

wherein Area_ol is the sum of the areas of all pin regions in the local image before the expansion processing, Area_pol is the sum of the areas of the connected regions corresponding to all pin regions in the local image after the expansion processing, and ε is a preset empirical coefficient;
extracting the image features of the local image of a second-failure-type potential failure sample according to the following formula:

Per2 = Σ_j Area_j / (Area_pz · η2)

wherein Per2 is the ratio of the sum Σ_j Area_j of the areas of the defect regions to the corrected area Area_pz · η2, Area_pz being the area of the connected region corresponding to the device body region after the expansion processing; Area_j is the area of the j-th defect region, and η2 is a correction factor given by

η2 = ε · Area_z / Area_pz

wherein Area_z is the area of the device body region before the expansion processing and ε is a preset empirical coefficient;
and extracting the image features of the local image of a third-failure-type potential failure sample according to the image feature extraction methods used for the first and second failure types.
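The two features of claim 4 can be computed as below. The original formula images (the "Figure FDA…" placeholders) are not reproduced in this text, so η1 = ε · Area_ol / Area_pol and η2 = ε · Area_z / Area_pz are reconstructions from the stated variable definitions, marked as such in the comments.

```python
def eta1(area_ol, area_pol, eps):
    """Correction factor for pin regions: reconstructed reading
    η1 = ε · Area_ol / Area_pol (pre- vs post-expansion pin area)."""
    return eps * area_ol / area_pol

def per1(defect_areas, pin_connected_areas, area_ol, area_pol, eps):
    """First failure type: Per1_i = Area_i / (Area_ypi · η1) for each
    defect i, where Area_ypi is the connected-region area of the pin
    region containing defect i."""
    e1 = eta1(area_ol, area_pol, eps)
    return [a_i / (a_ypi * e1)
            for a_i, a_ypi in zip(defect_areas, pin_connected_areas)]

def per2(defect_areas, area_pz, area_z, eps):
    """Second failure type: Per2 = Σ_j Area_j / (Area_pz · η2) with
    η2 = ε · Area_z / Area_pz (reconstructed reading)."""
    eta2 = eps * area_z / area_pz
    return sum(defect_areas) / (area_pz * eta2)
```

Note that Area_pz · η2 collapses to ε · Area_z, i.e. the feature is effectively the defect area relative to the pre-expansion body area scaled by the empirical coefficient, which is why the correction factor is needed at all: the expansion step inflates every region.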
5. The failure recognition method according to any one of claims 1 to 3, wherein the step of performing network training on the recognition model based on the extracted image features to obtain the optimized recognition model specifically comprises:
and performing network training on a plurality of pre-constructed recognition models, respectively, based on the image features, and selecting the recognition model with the highest recognition accuracy as the final optimized recognition model.
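The select-best strategy of claim 5 is a plain model-selection loop. The `fit`/`predict` interface below is an assumed duck-typed convention (the patent does not prescribe one), and validation accuracy stands in for "recognition accuracy".

```python
def train_and_select(models, x_train, y_train, x_val, y_val):
    """Train each pre-constructed recognition model and keep the one
    with the highest validation accuracy as the final optimized model."""
    best_model, best_acc = None, -1.0
    for model in models:
        model.fit(x_train, y_train)
        preds = model.predict(x_val)
        acc = sum(int(p == t) for p, t in zip(preds, y_val)) / len(y_val)
        if acc > best_acc:
            best_model, best_acc = model, acc
    return best_model, best_acc
```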
6. The failure recognition method according to claim 1, wherein the step of determining the type of the potential failure sample based on the positional relationship between the centroid of the connected region and the device body outline comprises:
calculating the distance between the centroid of the connected region and each boundary of the device body outline;
and determining the positional relationship between the centroid of the connected region and the device body outline according to the calculated distances.
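For an axis-aligned rectangular device body outline (an assumption for illustration; the claim does not restrict the outline's shape), the centroid-to-boundary distances and the resulting inside/outside decision might look like:

```python
def centroid(pixels):
    """Centroid of a connected region given as (row, col) pixel tuples."""
    ys = [p[0] for p in pixels]
    xs = [p[1] for p in pixels]
    return sum(ys) / len(ys), sum(xs) / len(xs)

def centroid_boundary_relation(pixels, body_rect):
    """Signed distance from the region centroid to each boundary of a
    rectangular body outline (top, left, bottom, right); the centroid is
    inside the outline iff every signed distance is non-negative."""
    cy, cx = centroid(pixels)
    top, left, bottom, right = body_rect
    dists = {"top": cy - top, "bottom": bottom - cy,
             "left": cx - left, "right": right - cx}
    inside = all(d >= 0 for d in dists.values())
    return dists, inside
```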
7. A failure recognition device for a plastic package device, characterized by comprising an image acquisition module, a recognition model and a model training module, wherein the recognition model is a model constructed based on a machine learning algorithm; the image acquisition module is configured to acquire, through an ultrasonic microscope, an acoustic scan image containing a plurality of plastic package devices; the recognition model is configured to recognize the acoustic scan image acquired by the image acquisition module to obtain failed plastic package devices; the model training module is configured to perform network training on the recognition model to obtain an optimized recognition model;
the model training module comprises an image acquisition sub-module, an image extraction sub-module, an image screening sub-module, a feature extraction sub-module and a training sub-module;
the image acquisition submodule is configured to acquire, through an ultrasonic microscope, an acoustic scan image containing a plurality of plastic package device samples; the image extraction submodule is configured to extract a local image of each plastic package device sample in the acoustic scan image acquired by the image acquisition submodule; the image screening submodule is configured to screen the local images extracted by the image extraction submodule to obtain local images of potential failure samples; the feature extraction submodule is configured to extract image features of the local images of the potential failure samples obtained by the image screening submodule; the training submodule is configured to perform network training on the recognition model based on the image features extracted by the feature extraction submodule to obtain the optimized recognition model;
the feature extraction submodule comprises a potential failure sample type determination unit; the potential failure sample type determining unit comprises a first image preprocessing subunit, a device body contour extraction subunit, a second image preprocessing subunit and a sample failure type determining subunit;
the first image preprocessing subunit is configured to sequentially perform graying processing and binarization processing on the local image of the potential failure sample;
the device body contour extraction subunit is configured to extract the device body contour of the potentially failed sample in the local image obtained by the first image preprocessing subunit;
the second image preprocessing subunit is configured to perform expansion processing on the local image obtained by the first image preprocessing subunit, so that each layered region of the same pin in the potential failure sample forms a connected region, and a device body region in the potential failure sample forms a connected region;
the sample failure type determining subunit is configured to determine the failure type of the potential failure sample according to the positional relationship between the centroids of the connected regions and the device body outline: if the centroids of all the connected regions are outside the device body outline, the potential failure sample is of the first failure type; if the centroids of all the connected regions are within the device body outline, the potential failure sample is of the second failure type; if centroids of connected regions lie both inside and outside the device body outline, the potential failure sample is of the third failure type.
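The three-way decision of the sample failure type determining subunit reduces to a check over per-centroid inside/outside flags; a minimal sketch (the integer type labels are the claim's "first/second/third failure type"):

```python
def failure_type(centroid_inside_flags):
    """Classify a potential failure sample from the centroids of its
    connected regions: flags[i] is True when centroid i lies inside
    the device body outline."""
    if not any(centroid_inside_flags):
        return 1  # all centroids outside the outline -> first failure type
    if all(centroid_inside_flags):
        return 2  # all centroids inside the outline  -> second failure type
    return 3      # centroids on both sides           -> third failure type
```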
8. The failure recognition device according to claim 7, wherein the image extraction sub-module includes an image preprocessing unit and a partial image extraction unit;
the image preprocessing unit is configured to perform the following operations:
cropping the acoustic scan image to remove its border;
carrying out graying processing, filtering, binarization processing and secondary filtering processing on the cropped acoustic scan image in sequence;
performing expansion processing on the acoustic scan image subjected to the graying processing, the filtering, the binarization processing and the secondary filtering processing, so that the pin regions and the device body region of each plastic package device sample in the acoustic scan image form a connected region;
the local image extraction unit is configured to perform the following operations:
acquiring the minimum rectangular frame containing each connected region obtained by the image preprocessing unit, and selecting, from the acquired rectangular frames, the rectangular frames whose length is within a preset length range and whose width is within a preset width range;
and segmenting, according to the selected rectangular frames, the original acoustic scan image acquired by the ultrasonic microscope to obtain a local image of each plastic package device sample.
9. The failure recognition device according to claim 7, wherein the image screening submodule includes a primary screening unit and a secondary screening unit;
the preliminary screening unit is configured to preliminarily screen the local images to obtain local images containing potential defects;
the secondary screening unit is configured to perform secondary screening on the local images screened by the primary screening unit so as to remove the local images of which the potential defects are noise points; the noise points are potential defects which exist in isolation in the local image and have an area smaller than a preset threshold value.
10. The failure recognition device according to claim 7, wherein the feature extraction sub-module includes a first type image feature extraction unit, a second type image feature extraction unit, and a third type image feature extraction unit;
the first-type image feature extraction unit is configured to extract the image features of the local image of a first-failure-type potential failure sample according to the following formula:

Per1_i = Area_i / Area_yi

wherein Per1_i is the ratio of the area Area_i of the i-th defect region to the area Area_yi of the pin region in which the i-th defect region is located, with Area_yi = Area_ypi · η1, where Area_ypi is the area of the connected region corresponding to that pin region and η1 is a correction factor given by

η1 = ε · Area_ol / Area_pol

wherein Area_ol is the sum of the areas of all pin regions in the local image before the expansion processing, Area_pol is the sum of the areas of the connected regions corresponding to all pin regions in the local image after the expansion processing, and ε is a preset empirical coefficient;
the second-type image feature extraction unit is configured to extract the image features of the local image of a second-failure-type potential failure sample according to the following formula:

Per2 = Σ_j Area_j / (Area_pz · η2)

wherein Per2 is the ratio of the sum Σ_j Area_j of the areas of the defect regions to the corrected area Area_pz · η2, Area_pz being the area of the connected region corresponding to the device body region after the expansion processing; Area_j is the area of the j-th defect region, and η2 is a correction factor given by

η2 = ε · Area_z / Area_pz

wherein Area_z is the area of the device body region before the expansion processing and ε is a preset empirical coefficient;
the third-type image feature extraction unit is configured to invoke the first-type image feature extraction unit and the second-type image feature extraction unit, respectively, to extract the image features of the local image of a third-failure-type potential failure sample.
11. The failure recognition device of any one of claims 7-9, wherein the training submodule comprises a model training unit;
and the model training unit is configured to perform network training on a plurality of pre-constructed recognition models, respectively, based on the image features, and to select the recognition model with the highest recognition accuracy as the final optimized recognition model.
12. The failure recognition device according to claim 7, wherein the feature extraction sub-module further includes a positional relationship determination unit;
the positional relationship determination unit is configured to perform the following operations:
calculating the distance between the centroid of the connected region and each boundary of the device body outline;
and determining the positional relationship between the centroid of the connected region and the device body outline according to the calculated distances.
13. The failure recognition device according to any one of claims 7-9, wherein the failure recognition device further comprises a human-machine interaction module.
14. A storage device in which a plurality of programs are stored, characterized in that the programs are adapted to be loaded and executed by a processor to implement the failure identification method for a plastic package device according to any one of claims 1-6.
15. A processing apparatus, comprising:
a processor adapted to execute programs; and
a storage device adapted to store a plurality of programs;
characterized in that the programs are adapted to be loaded and executed by the processor to implement the failure identification method for a plastic package device according to any one of claims 1-6.
CN201810082596.2A 2018-01-29 2018-01-29 Failure identification method and device for plastic package device Active CN110111293B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810082596.2A CN110111293B (en) 2018-01-29 2018-01-29 Failure identification method and device for plastic package device

Publications (2)

Publication Number Publication Date
CN110111293A CN110111293A (en) 2019-08-09
CN110111293B true CN110111293B (en) 2021-05-11

Family

ID=67483557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810082596.2A Active CN110111293B (en) 2018-01-29 2018-01-29 Failure identification method and device for plastic package device

Country Status (1)

Country Link
CN (1) CN110111293B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111678466A (en) * 2020-05-08 2020-09-18 苏州浪潮智能科技有限公司 Device and method for testing flatness of SMT (surface mount technology) surface mount connector
CN117078665B (en) * 2023-10-13 2024-04-09 东声(苏州)智能科技有限公司 Product surface defect detection method and device, storage medium and electronic equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101840572A (en) * 2010-04-13 2010-09-22 河海大学常州校区 QFP element position error visual inspection method based on region segmentation
CN102592268A (en) * 2012-01-06 2012-07-18 清华大学深圳研究生院 Method for segmenting foreground image
CN105092598A (en) * 2015-09-28 2015-11-25 深圳大学 Method and system for rapidly recognizing defects of big-breadth PCB on basis of connected areas
CN105303573A (en) * 2015-10-26 2016-02-03 广州视源电子科技股份有限公司 Method and system of pin detection of gold needle elements
CN105957059A (en) * 2016-04-20 2016-09-21 广州视源电子科技股份有限公司 Electronic component missing detection method and system
CN106127746A (en) * 2016-06-16 2016-11-16 广州视源电子科技股份有限公司 Circuit board component missing part detection method and system
CN106248803A (en) * 2016-07-07 2016-12-21 北京航空航天大学 A kind of flash memory plastic device method of determining defects based on acoustic scan
CN106952250A (en) * 2017-02-28 2017-07-14 北京科技大学 A kind of metal plate and belt detection method of surface flaw and device based on Faster R CNN networks
CN107516309A (en) * 2017-07-12 2017-12-26 天津大学 One kind printing panel defect visible detection method
CN107563999A (en) * 2017-09-05 2018-01-09 华中科技大学 A kind of chip defect recognition methods based on convolutional neural networks

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10650508B2 (en) * 2014-12-03 2020-05-12 Kla-Tencor Corporation Automatic defect classification without sampling and feature selection

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
A fast and robust convolutional neural network-based defect detection model in product quality control; Tian Wang et al.; The International Journal of Advanced Manufacturing Technology; 2017-08-15; Vol. 94; full text *
A study on using scanning acoustic microscopy and neural network techniques to evaluate the quality of resistance spot welding; Hsu-Tung Lee et al.; The International Journal of Advanced Manufacturing Technology; 2003-08-01; Vol. 22; full text *
Discriminating defects method for plastic encapsulated microcircuits by using scanning acoustic microscope; Jiaoying Huang et al.; 2016 Prognostics and System Health Management Conference; 2016-10-21; full text *
Using scanning acoustic microscopy and LM-BP algorithm for defect inspection of micro solder bumps; Fan Liu et al.; Microelectronics Reliability; 2017-12-31; Vol. 79; full text *
Research on defect detection of flip chips based on active infrared and ultrasonic scanning; Zha Zheyu; China Master's Theses Full-text Database, Information Science and Technology; 2012-07-15; Vol. 2012, No. 7; Sections 4.1-4.6 *
Acoustic scan inspection of special packaging defects in plastic-encapsulated semiconductor devices; Li Zhi et al.; Foreign Electronic Measurement Technology; 2016-10-31; Vol. 36, No. 10; full text *

Also Published As

Publication number Publication date
CN110111293A (en) 2019-08-09

Similar Documents

Publication Publication Date Title
CN108154508B (en) Method, apparatus, storage medium and the terminal device of product defects detection positioning
CN110060237B (en) Fault detection method, device, equipment and system
CN109684981B (en) Identification method and equipment of cyan eye image and screening system
CN111462049B (en) Automatic lesion area form labeling method in mammary gland ultrasonic radiography video
CN109741335B (en) Method and device for segmenting vascular wall and blood flow area in blood vessel OCT image
CN112017986A (en) Semiconductor product defect detection method and device, electronic equipment and storage medium
CN111862044A (en) Ultrasonic image processing method and device, computer equipment and storage medium
CN110264444A (en) Damage detecting method and device based on weak segmentation
CN110111293B (en) Failure identification method and device for plastic package device
CN110555875A (en) Pupil radius detection method and device, computer equipment and storage medium
CN111950812B (en) Method and device for automatically identifying and predicting rainfall
CN112819796A (en) Tobacco shred foreign matter identification method and equipment
CN113420673B (en) Garbage classification method, device, equipment and storage medium
CN114998324A (en) Training method and device for semiconductor wafer defect detection model
CN108889635B (en) Online visual inspection method for manufacturing defects of ring-pull cans
CN113392681A (en) Human body falling detection method and device and terminal equipment
CN113781477A (en) Calculus image identification method, device and equipment based on artificial intelligence
CN115731282A (en) Underwater fish weight estimation method and system based on deep learning and electronic equipment
CN110738702B (en) Three-dimensional ultrasonic image processing method, device, equipment and storage medium
CN111899253A (en) Method and device for judging and analyzing abnormity of fetal craniocerebral section image
CN112818946A (en) Training of age identification model, age identification method and device and electronic equipment
CN115937991A (en) Human body tumbling identification method and device, computer equipment and storage medium
CN114582012A (en) Skeleton human behavior recognition method, device and equipment
CN112308061A (en) License plate character sorting method, recognition method and device
Perng et al. A novel quasi-contact lens auto-inspection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant