US20190050978A9 - Defect inspection method and apparatus - Google Patents
- Publication number
- US20190050978A9 (application US15/287,418)
- Authority
- US
- United States
- Prior art keywords
- defect
- image
- inspection object
- inspection
- unit
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N29/00—Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
- G01N29/04—Analysing solids
- G01N29/06—Visualisation of the interior, e.g. acoustic microscopy
- G01N29/0654—Imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/9501—Semiconductor wafers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N29/00—Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
- G01N29/04—Analysing solids
- G01N29/06—Visualisation of the interior, e.g. acoustic microscopy
- G01N29/0654—Imaging
- G01N29/069—Defect imaging, localisation and sizing using, e.g. time of flight diffraction [TOFD], synthetic aperture focusing technique [SAFT], Amplituden-Laufzeit-Ortskurven [ALOK] technique
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N29/00—Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
- G01N29/22—Details, e.g. general constructional or apparatus details
- G01N29/26—Arrangements for orientation or scanning by relative movement of the head and the sensor
- G01N29/265—Arrangements for orientation or scanning by relative movement of the head and the sensor by moving the sensor relative to a stationary material
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N29/00—Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
- G01N29/44—Processing the detected response signal, e.g. electronic circuits specially adapted therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L22/00—Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
- H01L22/10—Measuring as part of the manufacturing process
- H01L22/12—Measuring as part of the manufacturing process for structural parameters, e.g. thickness, line width, refractive index, temperature, warp, bond strength, defects, optical inspection, electrical measurement of structural dimensions, metallurgic measurement of diffusions
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L22/00—Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
- H01L22/20—Sequence of activities consisting of a plurality of measurements, corrections, marking or sorting steps
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L22/00—Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
- H01L22/20—Sequence of activities consisting of a plurality of measurements, corrections, marking or sorting steps
- H01L22/24—Optical enhancement of defects or not directly visible states, e.g. selective electrolytic deposition, bubbles in liquids, light emission, colour change
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2291/00—Indexing codes associated with group G01N29/00
- G01N2291/02—Indexing codes associated with the analysed material
- G01N2291/023—Solids
- G01N2291/0231—Composite or layered materials
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2291/00—Indexing codes associated with group G01N29/00
- G01N2291/02—Indexing codes associated with the analysed material
- G01N2291/028—Material parameters
- G01N2291/0289—Internal structure, e.g. defects, grain size, texture
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2291/00—Indexing codes associated with group G01N29/00
- G01N2291/04—Wave modes and trajectories
- G01N2291/044—Internal reflections (echoes), e.g. on walls or defects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2291/00—Indexing codes associated with group G01N29/00
- G01N2291/10—Number of transducers
- G01N2291/101—Number of transducers one transducer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30148—Semiconductor; IC; Wafer
Definitions
- the present invention relates to an apparatus for inspecting a defect from an image of an inspection object obtained by using an ultrasonic wave, an x-ray, or the like, and specifically to an inspection method suitable for inspecting an inspection object having a multi-layer structure and a non-destructive inspection apparatus using the same.
- as a non-destructive inspection method for inspecting a defect from an image of an inspection object, there are a method of using an ultrasonic image generated by irradiating the inspection object with an ultrasonic wave and detecting a reflected wave therefrom, and a method of using an x-ray image obtained by irradiating the inspection object with an x-ray and detecting the x-ray transmitted therethrough.
- in ultrasonic inspection, a reflection property due to a difference in acoustic impedance is generally used.
- the ultrasonic wave propagates through a liquid or solid material and generates a reflected wave at an interface between materials having different acoustic impedances or at a cavity. Since a reflected wave from a defect is different from a reflected wave from a defect-free portion in its strength, it is possible to obtain an image that exposes the defect present in the inspection object by visualizing reflection intensities at inter-layer interfaces of the inspection object.
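The reflection behaviour described above follows the standard acoustic relation (not stated explicitly in the text): for normal incidence at an interface between materials with acoustic impedances $Z_1$ and $Z_2$, the amplitude reflection coefficient is

```latex
R = \frac{Z_2 - Z_1}{Z_2 + Z_1}
```

At a cavity $Z_2 \approx 0$, so $|R| \approx 1$ and nearly the whole wave is reflected, which is why a void or a stripping stands out strongly in the reflection-intensity image.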
- Determination of the presence of a defect in the obtained reflection-intensity image is often performed visually by an inspector, which may lead to variation in the evaluation result depending on the experience of each inspector.
- major inspection objects such as semiconductors and electronic devices are increasingly miniaturized, making it more difficult to visually distinguish a defect from a normal pattern.
- as multi-layer structures have become more common in response to the multi-functionalization and miniaturization of mounted products, a WLP (Wafer Level Package) method, which handles the product in wafer form until the final packaging process, is becoming mainstream in manufacturing.
- One conventional technique of automatically detecting a defect from an ultrasonic inspection image is the method described in Japanese Patent Laid-open No. 2007-101320 (Patent Document 1). It sequentially generates and displays ultrasonic inspection images and extracts a candidate defect based on the contiguity of the luminance distribution in each image. A defect and noise can be distinguished by how long the candidate defect repeats across consecutive images. Another method is described in Japanese Patent Laid-open No. 2012-253193 (Patent Document 2), in which the presence of a void in a TSV (Through Silicon Via) in a three-dimensional integration structure is estimated based on ultrasonic scanning.
- in the method of Patent Document 2, however, the pattern of the inspection object is limited to the TSV. To avoid the effect of structures that reduce the vertical resolution of the TSV (bump electrodes or wiring layers), a TEG (Test Element Group) region including only an etch stop layer and the TSV is formed, and the presence of a void in an active TSV is presumed by inspecting the TEG region for the void. This approach cannot inspect the whole wafer surface, which includes a mixture of various patterns.
- the present invention provides a defect inspection method of detecting a defect including the steps of: obtaining an image of an inspection object by imaging the inspection object having a pattern formed thereon; generating a reference image that does not include a defect from the obtained image of the inspection object; generating a multi-value mask for masking a non-defective pixel from the obtained image of the inspection object; calculating a defect accuracy by matching the brightness of the image of the inspection object and the reference image; and comparing the calculated defect accuracy with the generated multi-value mask.
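The claimed steps can be sketched as follows. All function names and the specific choices (a pixel-wise median as the defect-free reference, a per-pixel tolerance as the multi-value mask, mean-brightness matching) are illustrative assumptions, not the patent's prescribed implementation:

```python
import numpy as np

def generate_reference(images):
    # Pixel-wise median across same-group images suppresses isolated defects,
    # yielding a reference image that does not include a defect.
    return np.median(images, axis=0)

def generate_multivalue_mask(images, floor=10.0, gain=1.0):
    # Multi-value mask: a per-pixel tolerance. Pixels that vary a lot across
    # normal images get a larger mask value, so pattern-edge noise is masked.
    return floor + gain * np.std(images, axis=0)

def defect_accuracy(image, reference):
    # Match mean brightness, then take the absolute difference as the
    # per-pixel defect accuracy.
    offset = np.mean(reference) - np.mean(image)
    return np.abs((image + offset) - reference)

def detect_defects(images):
    ref = generate_reference(images)
    mask = generate_multivalue_mask(images)
    # A pixel is defective where its defect accuracy exceeds the mask value.
    return [defect_accuracy(img, ref) > mask for img in images]
```

The comparison against a multi-value (rather than binary) mask lets the sensitivity vary per pixel instead of masking regions out entirely.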
- the present invention also provides a defect inspection apparatus including: an image acquisition unit obtaining an image of an inspection object by imaging the inspection object having a pattern thereon; a reference image generation unit generating a reference image that does not include a defect from the image of the inspection object obtained by the image acquisition unit and generating a multi-value mask for masking a non-defective pixel from the obtained image of the inspection object; a feature amount computing unit calculating a defect accuracy by matching the brightness of the image of the inspection object obtained by the image acquisition unit and the reference image generated by the reference image generation unit; and a defect detection processing unit detecting the defect by comparing the defect accuracy calculated by the feature amount computing unit with the multi-value mask generated by the reference image generation unit.
- the present invention further provides an ultrasonic inspection apparatus including: a detection unit including an ultrasonic probe emitting an ultrasonic wave and a flaw detector detecting a reflected echo generated from an inspection object by the ultrasonic wave emitted from the ultrasonic probe; an A/D conversion unit A/D converting a signal output from the flaw detector having detected the reflected echo in the detection unit; and an image processing unit detecting the reflected echo from the flaw detector converted into a digital signal by the A/D conversion unit, processing the output signal, generating a sectional image in a plane parallel with a surface of the inspection object inside the inspection object, processing the generated internal sectional image, and thereby inspecting an internal defect of the inspection object, wherein the image processing unit includes: a sectional image generation unit detecting the reflected echo generated from the flaw detector, processing the output signal, and generating the sectional image of the inside of the inspection object; a reference image generation unit generating a reference image that does not include a defect from the section
- the present invention makes it possible to detect and output a fine defect near a normal pattern on an internal image of the inspection object including a mixture of aperiodic and complicated patterns.
- the present invention also makes it possible to detect the defect inside the inspection object by processing the sectional image of the inside of the inspection object detected using an ultrasonic wave.
- FIG. 1 is an exemplary flow chart of a process showing a concept of a method for inspecting an internal defect of a wafer carrying various devices thereon according to a first embodiment of the present invention
- FIG. 2 is a block diagram showing a concept of an ultrasonic inspection apparatus according to the first embodiment of the present invention
- FIG. 3 is a block diagram showing a configuration of the ultrasonic inspection apparatus according to the first embodiment of the present invention
- FIG. 4 is a perspective view of a wafer having a multi-layer structure used as an inspection object in the first embodiment of the present invention
- FIG. 5A is a sectional view of the multi-layer wafer showing a relation between the multi-layer wafer and an ultrasonic probe used as the inspection object in the first embodiment of the present invention
- FIG. 5B is a graph showing a reflected echo signal from the multi-layer wafer detected by using the ultrasonic probe used as the inspection object in the first embodiment of the present invention
- FIG. 6A is a plan view of the multi-layer wafer used as the inspection object in the first embodiment of the present invention.
- FIG. 6B is an image of the multi-layer wafer used as the inspection object in the first embodiment of the present invention.
- FIG. 7 is a plan view of the wafer with a label applied to each chip of the multi-layer wafer used as the inspection object in the first embodiment of the present invention
- FIG. 8 is a block diagram showing a configuration of a defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention.
- FIG. 9A is a block diagram showing a configuration of a reference image generation unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention.
- FIG. 9B is a process flow chart of the reference image generation unit in the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention.
- FIG. 10 shows an image and a graph showing a procedure of generating a multi-value mask by the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention
- FIG. 11 is a flow chart showing a defect detection process by the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention.
- FIG. 12A is a plan view of the wafer labeled with respect to each pattern group according to the first embodiment of the present invention.
- FIG. 12B is a plan view of chips on the wafer showing an example in which information of the defect detected with respect to each group is integrated and output by a defect information output unit in the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention
- FIG. 12C is a plan view of the wafer showing another example in which information of the defect detected with respect to each group is integrated and output by a defect information output unit in the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention
- FIG. 13A is a plan view of the wafer labeled with respect to each pattern group according to the first embodiment of the present invention.
- FIG. 13B is a flow chart showing a process by the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention but different from what is described with reference to FIG. 12B ;
- FIG. 14 shows a perspective view of the wafer and an image of chips showing an example of grouping on the multi-layer wafer used as the inspection object according to the first embodiment of the present invention
- FIG. 15A is a plan view of an IC tray used as an inspection object according to a second embodiment of the present invention.
- FIG. 15B is a flow chart showing a process for the IC tray used as the inspection object according to the second embodiment of the present invention.
- the present invention relates to a defect inspection method that makes it possible to separate the signals of a normal pattern from those of a defect on an inspection object including an aperiodic pattern structure, thereby detecting a fine defect, and an apparatus for the same. That is, even if the image obtained from the inspection object includes an aperiodic pattern, the present invention segments the image into regions each consisting of the same pattern group, groups the regions, and detects a defect in the partial images of the same group.
- the present invention is effective for an appearance inspection, a non-destructive inspection, and the like performed on such an inspection object having a complicated pattern structure.
- the present invention is configured to detect a defect in an internal image of the inspection object by segmenting an image into regions each consisting of the same pattern group, grouping the regions, and integrating features of the segmented internal images belonging to the group. Grouping is performed based on labels applied to segmented regions by a user in advance, or based on design data or an exposure recipe used when patterning each layer. Moreover, for detection of the defect, a reference segmented internal image is formed by integrating the features of the segmented internal images belonging to the same group, and the features are compared between the reference segmented internal image and each segmented internal image to calculate a defect accuracy.
- a multi-value mask is generated from the segmented internal image, masking is performed on the pixel having the defect accuracy using the multi-value mask, and the remaining pixels are determined to be defective.
- an implementation of the inspection method according to the present invention and an apparatus therefor is described with reference to FIGS. 1 to 14 .
- as a property of an ultrasonic wave, it propagates through the inspection object and is partially reflected wherever there is a boundary at which a material property (acoustic impedance) changes. Since a large part of the ultrasonic wave is reflected where there is a cavity, a defect such as a void or a stripping can be detected with high sensitivity from the reflection intensity, especially at a bonding surface of a wafer including multiple layers bonded together. Hereinbelow, a defect on the bonding surface of the multi-layer wafer is to be detected.
- FIG. 2 is a conceptual diagram showing the implementation of the ultrasonic inspection apparatus according to the present invention.
- the ultrasonic inspection apparatus according to the present invention includes a detection unit 1 , an A/D convertor 6 , an image processing unit 7 , and a total control unit 8 .
- the detection unit 1 includes an ultrasonic probe 2 and a flaw detector 3 .
- the flaw detector 3 drives the ultrasonic probe 2 by applying a pulse signal to the ultrasonic probe 2 .
- the ultrasonic probe 2 driven by the flaw detector 3 generates an ultrasonic wave and emits it toward the inspection object (sample 5 ).
- a reflected echo 4 is generated from the surface of the sample 5 or from the bonding surface of the wafer.
- the reflected echo 4 is then received by the ultrasonic probe 2 , processed by the flaw detector 3 as needed, and converted into a reflection intensity signal.
- the reflection intensity signal is then converted into digital waveform data by the A/D convertor 6 and input to the image processing unit 7 .
- the image processing unit 7 includes an image generation unit 7 - 1 , a defect detection unit 7 - 2 , and a data output unit 7 - 3 . The image generation unit 7 - 1 performs signal conversion, described later, on the waveform data input from the A/D convertor 6 to the image processing unit 7 , thereby generating a sectional image of a specific bonding surface of the sample 5 from the digital waveform data.
- the defect detection unit 7 - 2 performs a process to be described later based on the sectional image of the bonding surface generated by the image generation unit 7 - 1 to detect the defect.
- the data output unit 7 - 3 generates data to be output as an inspection result such as information about an individual defect detected by the defect detection unit 7 - 2 and an image for observation of the section, and outputs the data to the total control unit 8 .
- shown in FIG. 3 is a schematic diagram of an exemplary configuration of a specific ultrasonic inspection apparatus 100 implementing the configuration shown in FIG. 2 .
- denoted by 10 is a coordinate system having three orthogonal axes of X, Y, and Z.
- Reference numeral 1 in FIG. 3 corresponds to the detection unit 1 described with reference to FIG. 2 .
- included in the detection unit 1 are a scanner table 11 , a tank 12 arranged on the scanner table 11 , and a scanner 13 arranged so as to bridge over the tank 12 on the scanner table 11 and movable in the X, Y, and Z directions.
- the scanner table 11 is a base placed substantially horizontal.
- the tank 12 contains water 14 injected to the height indicated by a dotted line, and the sample 5 is placed on the bottom (in the water) of the tank 12 .
- the sample 5 is the semiconductor wafer including the multi-layer structure and the like, as described above.
- the water 14 is a medium required for effectively propagating the ultrasonic wave emitted by the ultrasonic probe 2 into the sample 5 .
- Denoted by 16 is a mechanical controller, which drives the scanner 13 in the X, Y, and Z directions.
- the ultrasonic probe 2 emits the ultrasonic wave from an ultrasonic output unit at its lower edge, and receives a reflected echo returned from the sample 5 .
- the ultrasonic probe 2 is attached to a holder 15 and movable in the X, Y, and Z directions by the scanner 13 driven by the mechanical controller 16 .
- the ultrasonic probe 2 can receive the reflected echo at a plurality of measurement points of the sample 5 set in advance while travelling in the X and Y directions, obtain a two-dimensional image of a bonding surface within a measurement range (X-Y plane), and thus inspect the defect.
- the ultrasonic probe 2 is connected via a cable 22 to the flaw detector 3 , which converts the reflected echo into a reflection intensity signal.
- the ultrasonic inspection apparatus 100 further includes the A/D convertor 6 that converts the reflection intensity signal output from the flaw detector 3 of the detection unit 1 into a digital waveform, the image processing unit 7 that processes an image signal having been A/D converted by the A/D convertor 6 , the total control unit 8 that controls the detection unit 1 , the A/D convertor 6 , and the image processing unit 7 , and the mechanical controller 16 .
- the image processing unit 7 processes the image signal having been A/D converted by the A/D convertor 6 and detects an internal defect of the sample 5 .
- the image processing unit 7 includes the image generation unit 7 - 1 , the defect detection unit 7 - 2 , the data output unit 7 - 3 , and a parameter setting unit 7 - 4 .
- the image generation unit 7 - 1 generates an image from the digital data obtained by A/D converting the reflected echo returned from the sample surface, each bonding surface, and the like within the measurement range of the sample 5 set in advance, together with the position information of the ultrasonic probe obtained from the mechanical controller 16 .
- the defect detection unit 7 - 2 processes the image generated by the image generation unit 7 - 1 and thereby reveals or detects the internal defect.
- the data output unit 7 - 3 outputs the inspection result obtained when the defect detection unit 7 - 2 reveals or detects the internal defect.
- the parameter setting unit 7 - 4 receives a parameter such as a measurement condition input from the outside, and sets the parameter to the defect detection unit 7 - 2 and the data output unit 7 - 3 .
- the parameter setting unit 7 - 4 is connected to a storage unit 18 that stores therein a database.
- the total control unit 8 includes a CPU (incorporated in the total control unit 8 ) that performs various controls and receives parameters and the like from the user. It is appropriately connected to a user interface unit (GUI unit) 17 , which includes a display means for showing information such as an image of the defect detected by the image processing unit 7 , the number of defects, and the coordinates and dimensions of each defect, as well as an input means, and to the storage unit 18 , which stores the feature amounts, images, and the like of the defects detected by the image processing unit 7 .
- the mechanical controller 16 drives the scanner 13 based on a control instruction from the total control unit 8 . It should be noted that the image processing unit 7 , the flaw detector 3 , and the like are also driven by the instruction from the total control unit 8 .
- FIG. 4 shows a configuration of an inspection object 400 as an example of the sample 5 .
- the inspection object 400 shown in FIG. 4 schematically represents appearance of a wafer including the multi-layer structure which is the main inspection object.
- the inspection object 400 is a laminated wafer formed by laminating and bonding wafers 41 to 45 of different types such as MEMS, CPU, memory, CMOS, and the like.
- the number of laminated wafers is not limited to five and may be any number greater than one.
- the ultrasonic inspection apparatus 100 according to the present invention is used to inspect whether the wafers 41 to 45 in the inspection object 400 are properly bonded together over the whole lamination surface (bonding surface) without any unbonded region such as a void or a stripping.
- FIG. 5A is an example schematically showing a vertical structure of the inspection object 400 having the multi-layer structure shown in FIG. 4 .
- the ultrasonic wave 50 propagates through the inspection object 400 and is reflected from the inspection object surface 401 and the bonding surfaces 402 , 403 , 404 , 405 between the wafers due to differences in acoustic impedance, and the ultrasonic probe 2 receives these reflections as a single reflected echo.
- a graph 51 in FIG. 5B shows an exemplary reflected echo from the inspection object received by the ultrasonic probe 2 , with its abscissa representing time and ordinate representing reflection intensity. Time also indicates the depth of the inspection object 400 .
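Since the echo travels to the reflecting surface and back, time on the abscissa maps to depth by the usual round-trip relation (with $v$ the speed of sound in the material, an assumption not spelled out in the text):

```latex
d = \frac{v\,t}{2}
```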
- a visualization gate 52 (hereinbelow, simply referred to as “gate 52 ”) is set so that the desired time domain is cut out and a peak value within the gate 52 is detected.
- the image generation unit 7 - 1 of the image processing unit 7 detects the peak value in each scanning position from the reflected echo obtained while scanning the measurement range (X-Y plane) by the scanner 13 and converts the peak value into a gray value (for example, 0 to 255 in a case of generating a 256-tone image), thereby generating the sectional image of the bonding surface (an image of a section (a plane parallel to the wafer surface) in a depth direction from the wafer surface) from the gray value information at each scanning position.
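The gating and peak-to-gray conversion above can be sketched as follows, assuming each scan position yields a sampled A-scan waveform; the function names, the dict-of-waveforms input, and the normalisation by a known maximum amplitude are illustrative assumptions:

```python
import numpy as np

def echo_to_gray(waveform, gate_start, gate_end, max_amp=1.0):
    # Cut out the gated time domain and take the peak reflection intensity.
    peak = np.abs(waveform[gate_start:gate_end]).max()
    # Map the peak to a gray value in a 256-tone image.
    return int(round(255 * min(peak / max_amp, 1.0)))

def build_sectional_image(waveforms, gate_start, gate_end, max_amp=1.0):
    # waveforms: dict mapping (x, y) scan position -> sampled A-scan array.
    xs = sorted({x for x, _ in waveforms})
    ys = sorted({y for _, y in waveforms})
    img = np.zeros((len(ys), len(xs)), dtype=np.uint8)
    for (x, y), w in waveforms.items():
        img[ys.index(y), xs.index(x)] = echo_to_gray(w, gate_start, gate_end, max_amp)
    return img
```

Moving the gate to the time domain of a different bonding surface yields the sectional image of that surface from the same scan data.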
- the inspection object has the multi-layer structure like the inspection object 400 and has a plurality of bonding surfaces (such as 402 to 405 ) to be inspected, it is possible to set the gate 52 to the reflected echo in the time domain corresponding to each bonding surface and generate the sectional image of each bonding surface.
- shown in FIGS. 6A and 6B are exemplary generated sectional images of the bonding surface.
- FIG. 6A schematically shows a top view of a laminated wafer 60 that is the inspection object.
- the laminated wafer 60 is eventually diced along straight lines shown in FIG. 6A to become a finished product.
- hereinbelow, “chip” is used to refer to the diced product.
- Denoted by 62 in (a) of FIG. 6B is an exemplary sectional image of the bonding surface obtained from a region 61 delimited by a broken line and including three chips on the laminated wafer 60 .
- Denoted by 63 , 64 , and 65 in (b) of FIG. 6B are partial sectional images made by segmenting the sectional image 62 in (a) of FIG. 6B . Because the partial sectional images 63 and 65 come from chips of the same type, the pattern configurations they contain (hereinbelow referred to as “pattern group”) are also the same, while the left half of the partial sectional image 64 in (b) of FIG. 6B is constituted by two patterns, indicating that its pattern group is different from that of the partial sectional images 63 and 65.
- the sectional images are grouped with respect to each region having the same pattern group (for example, the partial sectional images 63 and 65 belong to a group A and the partial sectional image 64 belong to a group B), and the defect detection process is performed with respect to each group.
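The grouping and per-group reference formation can be sketched as below; the `(label, image)` input pairs, the helper names, and the pixel-wise median reference are assumptions for illustration:

```python
from collections import defaultdict
import numpy as np

def group_partial_images(partials):
    # partials: list of (group_label, image) pairs,
    # e.g. ("A", partial sectional image of a chip).
    groups = defaultdict(list)
    for label, img in partials:
        groups[label].append(img)
    return groups

def per_group_references(groups):
    # One defect-free reference per pattern group (pixel-wise median of all
    # partial images belonging to that group).
    return {label: np.median(np.stack(imgs), axis=0) for label, imgs in groups.items()}
```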
- FIG. 1 is the conceptual diagram of this case.
- Denoted by 101 is an appearance of a wafer including a mixture of various devices thereon as an example of the inspection object.
- the inspection object (wafer) 101 includes chips formed thereon in a grid shape, and the different hatch patterns indicate different types of the devices constituting the chip. In other words, basically the inspection images constituted by the same pattern group are obtained from the regions of the same hatch pattern.
- the detection unit 1 obtains a surface image or an internal sectional image of the inspection object 101 (S 11 ), and the image processing unit 7 first extracts partial images constituted by the same pattern group from the obtained surface image or the internal sectional image of the inspection object 101 (S 12 ).
- The partial images corresponding to the regions 103 , 104 , and so on of the wave hatch pattern in the inspection object 101 are then collected from the extracted partial images and aligned as shown by 102 (S 13 ).
- The image alignment means performing a position correction so that regions of the same pattern are present at the same coordinates in each image, which is possible because the extracted partial images 103 to 108 have the same pattern group.
- FIG. 7 shows an example thereof.
- Denoted by 60 in (a) of FIG. 7 is a layout of chips formed on the wafer 101 used as the inspection object. This layout is displayed on a screen by the user interface unit 17 shown in FIG. 3 , and the parameter setting unit 7 - 4 receives the labels applied to the individual chips on the screen by the user. In this process, the inspection object 101 is grouped based on the labels applied by the user.
- Denoted by 701 in (b) of FIG. 7 is an example of the result, which is formed by segmenting the wafer 101 used as the inspection object into partial images in the unit of chips and grouping the partial images into four categories of A to D based on the labels applied by the user.
- Even when there is no user setting, an automatic setting is also possible using the exposure recipe.
- the exposure recipe includes exposure position information indicative of where to print a circuit pattern on the substrate, exposure order, and the like, from which the information about the pattern to be formed at each position can be obtained.
- FIG. 8 shows an example thereof.
- the defect detection process is performed using the partial images constituted by the same pattern group.
- An inspection recipe 801 constituted by various parameter values used for the processing, and an image of the wafer whole surface 802 are input.
- the defect detection unit 7 - 2 generally includes a partial image group generation unit 81 , a reference image generation unit 82 , a defect detection processing unit 83 , and a defect information output unit 84 .
- A plurality of partial images applied with the same label (for example, 103 to 108 in FIG. 1 ) are collected by the partial image group generation unit 81 .
- the reference image generation unit 82 generates a reference partial image 804 and a multi-value mask 805 .
- the reference partial image 804 means the normal image constituted by the same pattern group as that of the input partial image.
- Shown in FIGS. 9A and 9B is an example of a method of generating the reference partial image.
- Denoted by 90 a , 91 a, 92 a, . . . in FIG. 9B are the partial images of the same label cut out of the inspection object 101 .
- These partial images include the same pattern group (denoted herein by three different hatch patterns 911 to 913 ).
- the defects 921 to 923 may possibly be included.
- There also may be a positional shift of the pattern due to a slight difference in the position of obtaining the image when scanning (sampling error) (indicated by difference in positions of the hatch patterns 911 to 913 with respect to the black background).
- Correction of the position of each image, namely inter-image position correction, is performed so as to align the coordinates of the hatch patterns 911 to 913 with respect to the black background (S 901 ).
- the position correction between the partial images at Step S 901 is performed using a general matching method such as: specifying one partial image; calculating a shift amount that makes the minimum sum of squares of the luminance difference between the specified image and other partial images to be corrected while shifting the partial image to be corrected with respect to the specified image, or calculating a shift amount that makes the maximum normalized cross-correlation coefficient; and shifting the partial image by the calculated shift amount.
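The matching step above can be sketched as follows. This is a minimal Python/NumPy illustration of the sum-of-squares search over integer shifts; the function names, the wrap-around `np.roll` shifting (a real implementation would pad or crop borders), and the ±3-pixel search window are illustrative assumptions, not details from the patent.

```python
import numpy as np

def best_shift(reference, target, max_shift=3):
    """Find the integer (dy, dx) shift of `target` that minimizes the
    sum of squared luminance differences against `reference`."""
    best, best_ssd = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(target, dy, axis=0), dx, axis=1)
            ssd = np.sum((reference.astype(float) - shifted.astype(float)) ** 2)
            if ssd < best_ssd:
                best_ssd, best = ssd, (dy, dx)
    return best

def align(reference, target, max_shift=3):
    """Shift `target` by the best-matching amount (the position correction)."""
    dy, dx = best_shift(reference, target, max_shift)
    return np.roll(np.roll(target, dy, axis=0), dx, axis=1)
```

The normalized cross-correlation variant mentioned in the text would replace the SSD criterion with a correlation coefficient to be maximized instead.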
- 90 b, 91 b, 92 b, . . . in FIG. 9B are the partial images after the position correction.
- the features of the pixels in the partial images 90 b , 91 b, 92 b, . . . after the position correction are then calculated (S 902 ).
- The feature may be any value representing the property of the pixel, such as a contrast in each pixel (luminance gradient with peripheral pixels, Equation 1), a luminance average including proximate pixels (Equation 2), a luminance dispersion value (Equation 3), or the increase or decrease of the brightness and its maximum gradient direction with respect to the proximate pixels.
- f(x, y) is the luminance value of the coordinate (x, y) in the partial image.
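Equations 1 to 3 are not reproduced in this text, so the following Python/NumPy sketch assumes common definitions over a 3×3 neighborhood: contrast as the maximum absolute gradient to the neighbors, plus the local luminance mean and variance. The function names and the neighborhood size are illustrative choices.

```python
import numpy as np

def neighborhood_stack(img):
    """Stack the 3x3 neighborhood of every pixel (edges replicated)."""
    p = np.pad(img.astype(float), 1, mode="edge")
    return np.stack([p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                     for dy in range(3) for dx in range(3)])

def pixel_features(img):
    """Per-pixel features in the spirit of Equations 1-3: contrast
    (max absolute gradient to the neighbors), local luminance mean,
    and local luminance variance."""
    nb = neighborhood_stack(img)
    center = img.astype(float)
    contrast = np.max(np.abs(nb - center), axis=0)
    mean = nb.mean(axis=0)
    var = nb.var(axis=0)
    return contrast, mean, var
```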
- the feature of each pixel (x, y) calculated for each partial image is integrated between the partial images (S 903 ) to generate the reference partial image 804 .
- This processing method includes: collecting features Fi(x, y) of the corresponding coordinate (x, y) between partial images (i is the number designated to the partial image), and thereby statistically determining the reference feature value S(x, y) of the feature of each pixel as represented by Equation 4.
- The luminance value of the partial image whose feature equals the reference feature value is used as the luminance value of the reference partial image. In this manner, the reference partial image 804 exclusive of influences from a defect is generated.
- Median: Function outputting a median value of the feature of each partial image
- Fi(x, y): Feature value of the partial images 90 b , 91 b, 92 b, . . . after position correction
- the statistical processing may be performed by calculating an average of the feature at the corresponding coordinate between images and using the luminance value of the partial image having its feature closest to the average as the luminance value of the reference partial image.
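The per-pixel statistical selection (Equations 4 and 5) can be sketched as below: for each pixel, take the median of the feature across the aligned partial images and adopt the luminance of the image whose feature is closest to that median. The "closest to the median" tie-breaking and the use of the luminance itself as the feature in the test are assumptions for illustration.

```python
import numpy as np

def reference_image(images, features):
    """Generate the reference partial image: for each pixel (x, y),
    pick the luminance of the partial image whose feature Fi(x, y)
    is closest to the median feature S(x, y) across images."""
    imgs = np.stack([i.astype(float) for i in images])      # (N, H, W)
    feats = np.stack([f.astype(float) for f in features])   # (N, H, W)
    med = np.median(feats, axis=0)                          # S(x, y), Eq. 4
    pick = np.argmin(np.abs(feats - med), axis=0)           # winning image index
    h, w = med.shape
    yy, xx = np.mgrid[0:h, 0:w]
    return imgs[pick, yy, xx]
```

Because a defect appears in only a minority of the partial images, the median selection excludes its influence, as the text describes.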
- the reference image generation unit 82 generates, in addition to the reference partial image, the multi-value mask 805 for eliminating (masking) a non-defective pixel between images.
- the multi-value mask according to this embodiment is set by calculating multiple values (0 to 255) with respect to each pixel in the image.
- the luminance value f(x, y) of the corresponding pixel is integrated, and the dispersion value of the luminance values is calculated as the feature according to Equation 6.
- A graph 1001 shows a distribution of luminance values of a coordinate indicated by a white square 1011 in the partial images 90 b, 91 b, 92 b , . . . , showing that the dispersion value σ1 is calculated by integrating the luminance values between the images.
- A graph 1002 shows the distribution of the luminance values of the coordinate indicated by a black square 1012 in the partial images 90 b, 91 b, 92 b, . . . , showing that the dispersion value σ2 is calculated by integrating the luminance values between the images.
- The dispersion value σ is calculated for all the pixels within the partial images in the same manner.
- Reference numeral 1003 shows a pattern near the coordinate indicated by the black square 1012 .
- A curve 1021 in the graph 1020 shows a luminance profile of a location indicated by an arrow 1005 on the longitudinal pattern 1004 in the pattern 1003 .
- A curve 1022 shows a luminance profile when the longitudinal pattern 1004 in the pattern 1003 is shifted by an amount Δ.
- ε in the graph 1020 indicates the luminance difference caused by the positional shift by the amount Δ.
- The ε is regarded as the second feature of the pixel indicated by the black square 1012 .
- The luminance difference ε is calculated for all the pixels within the partial images in the same manner.
- a multi-value mask value M is calculated according to Equation 7.
- An aspect 1031 in the three-dimensional graph 1030 corresponds to the value M of the multi-value mask calculated from σ and ε.
- The value M of the multi-value mask is calculated separately with respect to each pixel according to σ and ε. A fabrication tolerance or a sampling error at the time of image acquisition may cause a difference in the pattern luminance values between partial images despite the same pattern group, and this difference is reflected on the mask.
- The parameters Δ (described in FIG. 10 ), k, m, and n are set in advance, and the distribution of the multi-value mask M indicated by the aspect 1031 in the three-dimensional graph 1030 can be adjusted through these parameters.
- Although the multi-value mask M was calculated here based on σ and ε obtained by integrating the features of each pixel between the partial images, any feature indicative of the property of the pixel can be used, and the way of integrating the feature may be changed accordingly.
- The number of the features to be integrated is not limited to two; the multi-value mask M can be calculated from any plural number of integrated features.
- Although the value of n was described as a fixed value, it can also be set with respect to each pixel in the partial image.
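Equations 6 and 7 are not reproduced in this text, so the sketch below makes two labeled assumptions: σ is the per-pixel luminance standard deviation across the aligned partial images, ε is approximated by the luminance change caused by shifting the average pattern by Δ pixels (using wrap-around `np.roll` for brevity), and the combination M = min(n, k·σ + m·ε) is one plausible linear form of Equation 7, not the patent's actual formula.

```python
import numpy as np

def multi_value_mask(images, delta=1, k=2.0, m=1.0, n=255):
    """Sketch of the multi-value mask M(x, y).
    sigma: luminance dispersion across aligned partial images (Eq. 6).
    eps:   luminance difference from a positional shift by `delta`
           pixels (sampling-error tolerance).
    M:     assumed combination k*sigma + m*eps, clipped at n."""
    stack = np.stack([i.astype(float) for i in images])
    sigma = stack.std(axis=0)
    mean = stack.mean(axis=0)
    eps = np.abs(mean - np.roll(mean, delta, axis=1))  # shift along x
    return np.minimum(float(n), k * sigma + m * eps)
```

Pixels with large σ (tolerance noise) or large ε (pattern edges sensitive to sampling error) receive a large mask value and are thus harder to flag as defective, which matches the mask's described purpose.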
- the reference image may also be generated by cutting out partial images constituted by the same pattern group from a good sample guaranteed to be free of defect.
- Next, the defect detection processing unit 83 in FIG. 8 , which detects a defect from the partial images 103 to 108 using the reference partial image 804 and the multi-value mask 805 , is described.
- FIG. 11 shows an example of the process performed by the defect detection processing unit 83 .
- the reference partial image 804 and the multi-value mask 805 output from the reference image generation unit 82 and a partial image group 803 of the inspection object (partial images in the same group) are input, which images have been subjected to an inter-image position correction, as described with reference to FIG. 9 .
- each image in the partial image group used as the inspection object is matched with the reference partial image for the brightness, as needed (S 1101 ).
- One example of the method thereof described herein includes the step of correcting the brightness of the partial image to match that of the reference partial image 804 based on the least squares approximation.
- The coefficients a and b are calculated so that Equation 9 takes the minimum value, and they are used as the correction coefficients "gain" and "offset".
- The brightness correction represented by Equation 10 is then applied to all the pixel values f(x, y) of each partial image in the partial image group 803 , using the correction coefficients determined on the pixels corresponding to the reference partial image 804 .
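The gain/offset fit of Equations 9 and 10 is an ordinary linear least-squares problem and can be sketched as follows; the function name and the use of `np.linalg.lstsq` are implementation choices, not from the patent.

```python
import numpy as np

def match_brightness(partial, reference):
    """Least-squares brightness matching (Eqs. 9-10): find gain a and
    offset b minimizing sum((a*f + b - g)^2) over corresponding pixels
    f (partial) and g (reference), then apply f' = a*f + b."""
    f = partial.astype(float).ravel()
    g = reference.astype(float).ravel()
    A = np.column_stack([f, np.ones_like(f)])
    (gain, offset), *_ = np.linalg.lstsq(A, g, rcond=None)
    return gain * partial.astype(float) + offset, gain, offset
```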
- the defect accuracy is then calculated for each pixel in a partial image 1110 (S 1102 ).
- An exemplary defect accuracy is defined by a value indicative of an appearance at the normal time, namely a degree of deviation from the luminance value of the reference partial image 804 , which is calculated according to Equation 11.
- the masking process is performed on the defect accuracy calculated according to Equation 11 using the multi-value mask 805 for each pixel, and the remaining pixels are detected as defective (S 1103 ).
- the masking process detects the defect when the defect accuracy exceeds a mask value as represented by Equation 12.
- the multi-value mask 805 can mask the pixels including a noise of the fabrication tolerance or the sampling error in the defect accuracy calculated according to Equation 11.
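Steps S 1102 and S 1103 can be sketched together as below. Since Equations 11 and 12 are not reproduced in this text, the defect accuracy is taken here as the absolute luminance deviation from the reference partial image, and a pixel is flagged as defective when that accuracy exceeds the per-pixel multi-value mask value; both are assumptions consistent with the surrounding description.

```python
import numpy as np

def detect_defects(corrected, reference, mask):
    """Defect accuracy as deviation from the reference (in the spirit
    of Eq. 11) and masking with the multi-value mask (Eq. 12).
    Returns the accuracy map and a boolean map of defective pixels."""
    accuracy = np.abs(corrected.astype(float) - reference.astype(float))
    return accuracy, accuracy > mask
```

Pixels whose deviation stems from fabrication tolerance or sampling error carry a large mask value and are suppressed, while a true defect exceeds its local mask value and survives.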
- The defect features of the detected defective pixels are then calculated for determining whether they are truly defective or not (S 1104 ).
- the process steps S 1101 to S 1104 by the defect detection processing unit 83 described above are performed on the partial images constituted by the same pattern group after grouping, and the same is performed on each group.
- the information about the defect detected by the process per group is then rearranged into a chip array on the inspection object by the defect information output unit 84 . Its concept is shown in FIGS. 12A and 12B .
- a wafer 120 shown in FIG. 12A is an inspection object segmented into regions constituted by the same pattern group and labeled. Based on this, it is assumed that the defect detection process is performed on each of the groups A to D to detect a defect 1202 a in a region 1202 of the group A, a defect 1201 a in a region 1201 of the group B, a defect 1203 a in a region 1203 of a group C, and a defect 1204 a in a region 1204 of a group D, as shown in FIG. 12B .
- The defect information output unit 84 in FIG. 8 rearranges the output result from the segmented partial images based on region arrangement information of the inspection object (wafer) 120 . That is, it maps the detected results 1201 a to 1204 a at the positions of the regions 1201 to 1204 in FIG. 12B , generates a defect distribution image 121 on the wafer, and outputs the defect distribution image 121 .
- the defects at 1202 a and 1203 a detected in separate processes are thus output as a single defect.
- the coordinate indicative of the defect position in the partial image is converted into the coordinate system of the inspection object 101 , and separately calculated defect features (area, maximum length, etc.) are also integrated.
- The defect information after the conversion and integration is output to the data output unit 7 - 3 and displayed by a display means such as a display unit via the user interface unit (GUI unit) 17 . It is also possible to simultaneously determine whether the chip is good or defective based on the defect features and display the result. For example, the number of defects, the maximum defective area, and the ratio of the defective pixels in the chip are measured, and a chip exceeding a judgement condition input as the inspection recipe is output and displayed as a faulty chip.
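The good/faulty chip judgement just described can be sketched as a simple threshold check; the defect-feature and recipe field names (`area`, `max_count`, `max_area`) are hypothetical placeholders for whatever the inspection recipe actually defines.

```python
def judge_chip(defects, limits):
    """Judge a chip faulty when any measured statistic exceeds the
    judgement condition from the inspection recipe. `defects` is a
    list of per-defect feature dicts (hypothetical field names)."""
    count = len(defects)
    max_area = max((d["area"] for d in defects), default=0)
    return count > limits["max_count"] or max_area > limits["max_area"]
```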
- FIG. 12B shows an example of mapping the detected result and outputting the defect distribution image on the wafer as denoted by 121
- the defect detection process also has a plurality of detection methods other than using the luminance difference compared with the reference image as the defect accuracy, as described above. Its concept is shown in FIGS. 13A and 13B .
- FIG. 13A shows a wafer 130 used as the inspection object segmented into regions constituted by the same pattern group and labeled.
- the group A includes seven regions
- the group B includes nine regions
- the group C includes three regions
- the group D includes two regions.
- the defect detection process can change the method of detecting the defect depending on the number of the regions having the same label.
- the reference partial image with the influence by the defect removed is statistically generated by integrating the features of each partial image.
- When the number of the partial images is small, reliability of the statistical processing decreases. Therefore, when the number of the regions is smaller than a certain number (for example, less than four regions), the statistical processing is not performed; instead, comparison between actual subjects, comparison with a model, comparison with a fixed threshold, and the like may be performed.
- An exemplary processing in the case of three partial images like the group C is as follows.
- Denoted by 131 , 132 , 133 in FIG. 13B are partial images generated by cutting out the regions corresponding to the label C in the inspection object (wafer) 130 and performing position correction and brightness matching, where the partial images (hereinbelow, referred to simply as images) 132 and 133 include defects 1321 and 1331 , respectively.
- differences among them are computed.
- a difference image 131 a takes a difference between the images 131 and 132
- a difference image 132 a takes a difference between the images 132 and 133
- a difference image 133 a takes a difference between the images 133 and 131 .
- the defective portion becomes apparent.
- the defect accuracy is calculated by taking the minimum value from the differences between two images. That is, a difference image 131 b is the minimum value between the difference image 133 a and the difference image 131 a, a difference image 132 b is the minimum value between the difference image 131 a and the difference image 132 a, and a difference image 133 b is the minimum value between the difference image 132 a and the difference image 133 a; the difference image 131 b is the defect accuracy of the image 131 , the difference image 132 b is the defect accuracy of the image 132 , and the difference image 133 b is the defect accuracy of the image 133 .
- the defects 1321 and 1331 are detected by masking them with the fixed value or the multi-value mask.
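The three-image scheme above can be sketched directly from the description: form the three pairwise absolute-difference images, then take, for each image, the per-pixel minimum of the two differences that involve it, so that only a deviation present against both partners (a real defect) survives.

```python
import numpy as np

def defect_accuracy_three(images):
    """Defect accuracy for three aligned, brightness-matched images of
    the same group (131-133 in FIG. 13B). Pairwise differences 131a-133a,
    then the per-image minimum of its two differences (131b-133b)."""
    a, b, c = (i.astype(float) for i in images)
    d_ab, d_bc, d_ca = np.abs(a - b), np.abs(b - c), np.abs(c - a)
    return (np.minimum(d_ca, d_ab),   # accuracy of image a (131b)
            np.minimum(d_ab, d_bc),   # accuracy of image b (132b)
            np.minimum(d_bc, d_ca))   # accuracy of image c (133b)
```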
- As the processing when there are two or fewer partial images like the group D, it is also possible to detect the defect by performing a processing similar to that shown in FIG. 11 using the reference partial image extracted from a good sample as an input. As another example, it is also possible to detect the defect by regarding the luminance value itself in an unmasked area as the defect accuracy, using a binary mask (designed in advance) that can fully mask a non-inspection area, based on a given threshold.
- As described above, this embodiment is characterized in detecting the defect with respect to each group after grouping the whole region of the inspection object by the pattern group constituting each region. This enables detection of the defect with a high accuracy even on a wafer not constituted by regular pattern groups. Furthermore, it is also effective when the inspection object is a multi-layer bonded wafer, especially with each layer having irregular pattern groups.
- Reference numerals 141 , 142 , 143 in FIG. 14 schematically show arrays of each layer in a three-layer bonded wafer used as the inspection object.
- Each layer of the wafer is constituted by chips having a plurality of different pattern groups (indicated by different hatch patterns).
- Lines 144 , 145 in FIG. 14 indicate the chips superimposed in the depth direction.
- On the first layer of the wafer 141 , the same patterns are formed on a chip 1441 on the line 144 and a chip 1451 on the line 145 , whereas the patterns formed on a chip 1442 on the line 144 and a chip 1452 on the line 145 on the second layer of the wafer 142 are different, and the patterns formed on a chip 1443 on the line 144 and a chip 1453 on the line 145 on the third layer of the wafer 143 are also different.
- the group A 146 and the group B 147 in FIG. 14 show the label information of each chip when the chips on the lines 144 and 145 are superimposed in the depth direction.
- New labels may be assigned depending on the combination of the per-layer labels: the region having the chip combination with the same patterns as the chips on the line 144 is designated as a label A, the region having the chip combination with the same patterns as the chips on the line 145 as a label B, and this is stored as the label information uniquely determined for the bonded wafer.
- the label information is automatically set according to the combination pattern in the depth direction based on the label information of each layer of the wafer.
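The automatic grouping by depth-direction label combination can be sketched as below: per-layer labels at each chip position are combined into a tuple, and identical tuples receive one new label for the bonded wafer. The dict-based data layout and the A, B, C label naming are illustrative assumptions.

```python
def depth_labels(layers):
    """Combine per-layer chip labels into one label per chip position.
    `layers` is a list of dicts mapping position -> per-layer label;
    identical label combinations form one group for the bonded wafer."""
    combos = {}   # label combination -> new group label
    groups = {}   # position -> new group label
    for pos in layers[0]:
        combo = tuple(layer[pos] for layer in layers)
        if combo not in combos:
            combos[combo] = chr(ord("A") + len(combos))
        groups[pos] = combos[combo]
    return groups
```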
- Even when the image obtained from the inspection object includes aperiodic patterns, it is possible to segment and group such an image into regions having the same pattern group and detect the defect within the partial images belonging to the same group.
- The implementation of the inspection method according to the present invention and the apparatus thereof is described above taking as the inspection object an example of a substrate having a multi-layer structure and a complicated pattern, such as a semiconductor wafer or a MEMS (Micro Electro Mechanical Systems) wafer, and it is also applicable to an inspection of an IC package mounted on an IC tray or the like.
- FIGS. 15A and 15B One example is shown in FIGS. 15A and 15B .
- Denoted by 150 in FIG. 15A is an IC tray, and labels A, B, C, and D in each pocket of the IC tray 150 indicate different types and model numbers of the IC packages placed in the IC tray.
- FIG. 15B shows a processing procedure according to this embodiment.
- tray matrix information 152 of the IC package including the type and the model number (model number of IC package placed in each pocket on the tray, and the like) is received along with an inspection recipe 151 , tray pockets are grouped based on the tray matrix information 152 (S 1500 ), images of the pockets belonging to the same group are collected from images 153 of the IC packages on the tray pockets obtained (S 1501 ), and the defect detection process described with reference to FIG. 8 in the first embodiment is performed at the defect detection unit 7 - 2 . The same process is performed on each group. This enables a highly sensitive inspection even when multiple types of IC packages are placed on a single IC tray.
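The pocket-grouping step S 1500 can be sketched as a simple grouping of pocket positions by model number from the tray matrix information; the list-of-lists matrix layout and function name are illustrative assumptions.

```python
from collections import defaultdict

def group_pockets(tray_matrix):
    """Group tray pockets by the model-number label in the tray matrix
    information; pockets with the same label are inspected together."""
    groups = defaultdict(list)
    for row, line in enumerate(tray_matrix):
        for col, label in enumerate(line):
            groups[label].append((row, col))
    return dict(groups)
```

The images collected per group (S 1501 ) would then be fed to the same defect detection process of FIG. 8, one group at a time.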
- the above processing is also effective for the inspection of the IC package formed on a strip substrate.
- labels may be applied according to the type of the device placed therein or the pattern group of the obtained image, and the same processing is applied thereafter.
- Embodiments of the present invention are described above taking an example of the defect inspection using the ultrasonic inspection apparatus in a case where there are multiple types of devices formed on a wafer or an IC tray, but it is also effective for an inspection of a discrete IC package.
- the reference image is generated from a good sample with respect to each type in advance;
- the corresponding reference image is input according to the type of the inspection object; and the defect accuracy is calculated to make a determination.
- the present inspection method can be used by applying the same labels to all the regions.
- the present invention is applicable not only to images obtained by the ultrasonic inspection apparatus but also to non-destructive inspection images obtained by an x-ray defect inspection apparatus and images of appearance inspection.
Abstract
Description
- The present invention relates to an apparatus for inspecting a defect from an image of an inspection object obtained by using an ultrasonic wave, an x-ray, or the like, and specifically to an inspection method suitable for an inspection of an inspection body having a multi-layer structure and a non-destructive inspection apparatus using the same.
- As a non-destructive inspection method for inspecting a defect from an image of an inspection object, there are a method of using an ultrasonic image generated by irradiating the inspection object with an ultrasonic wave and detecting a reflected wave therefrom, and a method of using an x-ray image obtained by irradiating the inspection object with an x-ray and detecting an x-ray transmitted therethrough.
- In order to detect a defect present in an inspection object having a multi-layer structure using an ultrasonic wave, a reflection property due to difference in acoustic impedance is generally used. The ultrasonic wave propagates through a liquid or solid material and generates a reflected wave at an interface between materials having different acoustic impedances or at a cavity. Since a reflected wave from a defect is different from a reflected wave from a defect-free portion in its strength, it is possible to obtain an image that exposes the defect present in the inspection object by visualizing reflection intensities at inter-layer interfaces of the inspection object.
- Determination of the presence of a defect in the obtained reflection-intensity image is often performed visually by an inspector, which may lead to variation in the evaluation result depending on the experience of each inspector. Moreover, major inspection objects such as semiconductors and electronic devices are increasingly miniaturized, making it more difficult to visually distinguish a defect from a normal pattern. Furthermore, as multi-layer structures have become more popular to adapt to the multi-functionalization and miniaturization of mounted products, a WLP (Wafer Level Package) method of handling the product in the form of a wafer until the final packaging process is becoming mainstream in manufacturing. Thus, the ultrasonic inspection is required to detect a micron-order internal defect at a high speed and with high sensitivity by separating it from a complicated pattern in the form of the wafer. However, this corresponds to detecting only a few pixels showing the defect among several tens of millions of pixels constituting an internal image, which is nearly impossible to do visually.
- One conventional technique of automatically detecting a defect from an ultrasonic inspection image is a method described in Japanese Patent Laid-open No. 2007-101320 (Patent Document 1). This includes a function of sequentially generating and displaying ultrasonic inspection images, thereby extracting a candidate defect based on contiguity of a luminance distribution in each image. A defect and a noise can be distinguished by the length of the continuous repetition of the candidate defect. Furthermore, there is another method described in Japanese Patent Laid-open No. 2012-253193 (Patent Document 2). In this method, a presence of a void in a TSV (Through Silicon Via) in a three-dimensional integration structure is estimated based on ultrasonic scanning.
- In a case where the inspection object has a complicated pattern, as well as a multi-layer structure, as in a semiconductor or an electronic device, the method described in Japanese Patent Laid-open No. 2007-101320 can distinguish a defect having a certain length from a noise generated at random times, but cannot distinguish a fine defect from a normal pattern. With the method described in Japanese Patent Laid-open No. 2012-253193, the pattern of the inspection object is limited to the TSV; moreover, in order to avoid the effect of structures that may reduce the resolution of the TSV in the vertical direction (bump electrodes or wiring layers), the presence of the void in an active TSV is presumed by forming a TEG (Test Element Group) region including only an etch stop layer and the TSV and inspecting the presence of the void in that TEG region, so the method cannot inspect a whole wafer surface including a mixture of various patterns.
- It is therefore an object of the present invention to provide an inspection method and an inspection apparatus capable of detecting an internal fault with a high sensitivity by separating it from a normal pattern in an ultrasonic inspection performed on an inspection object including a fine and multi-layer structure such as a semiconductor wafer and a MEMS wafer.
- To address the above problem, the present invention provides a defect inspection method of detecting a defect including the steps of: obtaining an image of an inspection object by imaging the inspection object having a pattern formed thereon; generating a reference image that does not include a defect from the obtained image of the inspection object; generating a multi-value mask for masking a non-defective pixel from the obtained image of the inspection object; calculating a defect accuracy by matching the brightness of the image of the inspection object and the reference image; and comparing the calculated defect accuracy with the generated multi-value mask.
- To address the above problem, the present invention also provides a defect inspection apparatus including: an image acquisition unit obtaining an image of an inspection object by imaging the inspection object having a pattern thereon; a reference image generation unit generating a reference image that does not include a defect from the image of the inspection object obtained by the image acquisition unit and generating a multi-value mask for masking a non-defective pixel from the obtained images of the inspection object; a feature amount computing unit calculating a defect accuracy by matching the brightness of the image of the inspection object obtained by the image acquisition unit and the reference image generated by the reference image generation unit; and a defect detection processing unit detecting the defect by comparing the defect accuracy calculated by the feature amount computing unit with the multi-value mask generated by the reference image generation unit.
- Moreover, to address the above problem, the present invention further provides an ultrasonic inspection apparatus including: a detection unit including an ultrasonic probe emitting an ultrasonic wave and a flaw detector detecting a reflected echo generated from an inspection object by the ultrasonic wave emitted from the ultrasonic probe; an A/D conversion unit A/D converting a signal output from the flaw detector having detected the reflected echo in the detection unit; and an image processing unit detecting the reflected echo from the flaw detector converted into a digital signal by the A/D conversion unit, processing the output signal, generating a sectional image in a plane parallel with a surface of the inspection object inside the inspection object, processing the generated internal sectional image, and thereby inspecting an internal defect of the inspection object, wherein the image processing unit includes: a sectional image generation unit detecting the reflected echo generated from the flaw detector, processing the output signal, and generating the sectional image of the inside of the inspection object; a reference image generation unit generating a reference image that does not include a defect from the sectional image of the inside of the inspection object generated by the sectional image generation unit and generating a multi-value mask for masking a non-defective pixel from the obtained internal image of the inspection object; a feature amount computing unit calculating a defect accuracy by matching the brightness of the image of the inspection object obtained by the image acquisition unit and the reference image generated by the reference image generation unit; a defect detection processing unit detecting the defect by comparing the defect accuracy calculated by the feature amount computing unit with the multi-value mask generated by the reference image generation unit; and an output unit outputting the internal defect detected by the defect detection processing unit.
- The present invention makes it possible to detect and output a fine defect near a normal pattern on an internal image of the inspection object including a mixture of aperiodic and complicated patterns.
- Moreover, the present invention also makes it possible to detect the defect inside the inspection object by processing the sectional image of the inside of the inspection object detected using an ultrasonic wave.
- These features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings.
-
FIG. 1 is an exemplary flow chart of a process showing a concept of a method for inspecting an internal defect of a wafer carrying various devices thereon according to a first embodiment of the present invention; -
FIG. 2 is a block diagram showing a concept of an ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 3 is a block diagram showing a configuration of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 4 is a perspective view of a wafer having a multi-layer structure used as an inspection object in the first embodiment of the present invention; -
FIG. 5A is a sectional view of the multi-layer wafer showing a relation between the multi-layer wafer and an ultrasonic probe used as the inspection object in the first embodiment of the present invention; -
FIG. 5B is a graph showing a reflected echo signal from the multi-layer wafer detected by using the ultrasonic probe used as the inspection object in the first embodiment of the present invention; -
FIG. 6A is a plan view of the multi-layer wafer used as the inspection object in the first embodiment of the present invention; -
FIG. 6B is an image of the multi-layer wafer used as the inspection object in the first embodiment of the present invention; -
FIG. 7 is a plan view of the wafer with a label applied to each chip of the multi-layer wafer used as the inspection object in the first embodiment of the present invention; -
FIG. 8 is a block diagram showing a configuration of a defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 9A is a block diagram showing a configuration of a reference image generation unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 9B is a process flow chart of the reference image generation unit in the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 10 shows an image and a graph showing a procedure of generating a multi-value mask by the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 11 is a flow chart showing a defect detection process by the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 12A is a plan view of the wafer labeled with respect to each pattern group according to the first embodiment of the present invention; -
FIG. 12B is a plan view of chips on the wafer showing an example in which information of the defect detected with respect to each group is integrated and output by a defect information output unit in the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 12C is a plan view of the wafer showing another example in which information of the defect detected with respect to each group is integrated and output by a defect information output unit in the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention; -
FIG. 13A is a plan view of the wafer labeled with respect to each pattern group according to the first embodiment of the present invention; -
FIG. 13B is a flow chart showing a process by the defect detection unit of the ultrasonic inspection apparatus according to the first embodiment of the present invention but different from what is described with reference to FIG. 12B; -
FIG. 14 shows a perspective view of the wafer and an image of chips showing an example of grouping on the multi-layer wafer used as the inspection object according to the first embodiment of the present invention; -
FIG. 15A is a plan view of an IC tray used as an inspection object according to a second embodiment of the present invention; and -
FIG. 15B is a flow chart showing a process for the IC tray used as the inspection object according to the second embodiment of the present invention. - The present invention relates to a defect inspection method making it possible to separate signals of a normal pattern from those of a defect on an inspection object including an aperiodic pattern structure and thereby detect a fine defect, and an apparatus for the same. That is, the present invention is configured to segment an image into regions each consisting of the same pattern group, group the regions, and detect a defect in a partial image of the same group, even if the image obtained from the inspection object includes an aperiodic pattern. The present invention is effective for an appearance inspection, a non-destructive inspection, and the like performed on such an inspection object having a complicated pattern structure.
- Moreover, the present invention is configured to detect a defect in an internal image of the inspection object by segmenting an image into regions each consisting of the same pattern group, grouping the regions, and integrating features of the segmented internal images belonging to the group. Grouping is performed based on labels applied to segmented regions by a user in advance, or based on design data or an exposure recipe used when patterning each layer. Moreover, for detection of the defect, a reference segmented internal image is formed by integrating the features of the segmented internal images belonging to the same group, and the features are compared between the reference segmented internal image and each segmented internal image to calculate a defect accuracy. Furthermore, with respect to each pixel having a defect accuracy, a multi-value mask is generated from the segmented internal image, masking is performed on the pixel having the defect accuracy using the multi-value mask, and the remaining pixels are determined to be defective. By performing this on each group, the non-destructive inspection can be performed on the entire region of the inspection object covering a wide range.
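The grouped inspection summarized above — integrate aligned same-group images into a defect-free reference, derive a per-pixel mask from their variation, and flag pixels deviating beyond the mask — can be illustrated with a minimal sketch. The function name, the median integration, and the simplified mask (a spread term only, without the positional-shift term described later) are assumptions for illustration, not the patented implementation:

```python
import numpy as np

def inspect_group(regions, k=1.0, n=10.0):
    """Toy version of the grouped inspection flow: aligned regions of one
    pattern group are integrated into a defect-free reference (median per
    pixel), a per-pixel multi-value mask is built from the luminance spread,
    and each region is compared against the reference.

    regions : float array of shape (N, H, W), aligned same-group images
    k, n    : illustrative mask parameters (tuned per data in practice)
    Returns a boolean array (N, H, W); True marks a detected defect pixel.
    """
    reference = np.median(regions, axis=0)   # defect-free reference image
    spread = regions.std(axis=0, ddof=1)     # pixel-wise variation across images
    mask = k * spread + n                    # simplified multi-value mask M(x, y)
    accuracy = np.abs(regions - reference)   # deviation from normal appearance
    return accuracy >= mask                  # pixels surviving the mask are defects
```

Because the mask tolerates the variation that the group itself exhibits, only deviations exceeding the per-pixel tolerance are reported.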
- Hereinbelow, embodiments of the present invention will be described with reference to drawings.
- Hereinbelow, a case where the defect inspection method according to the present invention is applied to an ultrasonic inspection apparatus is explained.
- An implementation of the inspection method according to the present invention and the apparatus thereof is described with reference to
FIGS. 1 to 14. First, an implementation of the ultrasonic inspection apparatus using a substrate having a multi-layer structure and a complicated pattern, such as a semiconductor wafer or a MEMS (Micro Electro Mechanical System) wafer, as the inspection object is described. - As a property of an ultrasonic wave, it propagates through the inspection object and, if there is a boundary at which a material property (acoustic impedance) changes, it is partially reflected. Since a large part of the ultrasonic wave is reflected where there is a cavity, a defect such as a void or a stripping can be detected with high sensitivity from the reflection intensity, especially at a bonding surface of a wafer including multiple layers bonded together. Hereinbelow, a defect on the bonding surface of the multi-layer wafer is to be detected.
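The reflection behavior described above follows the standard acoustic relation R = (Z2 − Z1)/(Z2 + Z1) for the amplitude reflection coefficient at an impedance boundary. A short sketch with approximate impedance values (illustrative figures, not taken from the patent) shows why a void reflects almost the entire wave while a sound bond reflects almost nothing:

```python
def reflection_coefficient(z1, z2):
    """Amplitude reflection coefficient at a boundary between media with
    acoustic impedances z1 and z2 (wave travelling from medium 1 into 2)."""
    return (z2 - z1) / (z2 + z1)

# Approximate impedances in MRayl, for illustration only.
Z_SILICON = 19.6
Z_AIR = 0.0004

# A well-bonded silicon-silicon interface reflects essentially nothing,
# while a void (silicon-air cavity) reflects nearly the whole wave.
r_bond = reflection_coefficient(Z_SILICON, Z_SILICON)
r_void = reflection_coefficient(Z_SILICON, Z_AIR)
```

This is why the reflection intensity at the bonding surface is such a sensitive indicator of voids and stripping.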
-
FIG. 2 is a conceptual diagram showing the implementation of the ultrasonic inspection apparatus according to the present invention. The ultrasonic inspection apparatus according to the present invention includes a detection unit 1, an A/D convertor 6, an image processing unit 7, and a total control unit 8. - The
detection unit 1 includes an ultrasonic probe 2 and a flaw detector 3. The flaw detector 3 drives the ultrasonic probe 2 by applying a pulse signal to it. The ultrasonic probe 2 driven by the flaw detector 3 generates an ultrasonic wave and emits it toward the inspection object (sample 5). When the emitted ultrasonic wave enters the sample 5 having the multi-layer structure, a reflected echo 4 is generated from the surface of the sample 5 or from the bonding surface of the wafer. The reflected echo 4 is then received by the ultrasonic probe 2, processed by the flaw detector 3 as needed, and converted into a reflection intensity signal. - The reflection intensity signal is then converted into digital waveform data by the A/
D convertor 6 and input to the image processing unit 7. The image processing unit 7 includes an image generation unit 7-1, a defect detection unit 7-2, and a data output unit 7-3. Signal conversion to be described later is performed by the image generation unit 7-1 on the waveform data input from the A/D convertor 6 to the image processing unit 7, thereby generating a sectional image of a specific bonding surface of the sample 5 from the digital waveform data. The defect detection unit 7-2 performs a process to be described later based on the sectional image of the bonding surface generated by the image generation unit 7-1 to detect the defect. The data output unit 7-3 generates data to be output as an inspection result, such as information about each individual defect detected by the defect detection unit 7-2 and an image for observation of the section, and outputs the data to the total control unit 8. - Shown in
FIG. 3 is a schematic diagram of an exemplary configuration of a specific ultrasonic inspection apparatus 100 implementing the configuration shown in FIG. 2. In FIG. 3, denoted by 10 is a coordinate system having three orthogonal axes X, Y, and Z. -
Reference numeral 1 in FIG. 3 corresponds to the detection unit 1 described with reference to FIG. 2. Denoted by 11 in the detection unit 1 is a scanner table, 12 is a tank arranged on the scanner table 11, and 13 is a scanner arranged so as to bridge over the tank 12 on the scanner table 11 and movable in the X, Y, and Z directions. The scanner table 11 is a base placed substantially horizontal. The tank 12 contains water 14 filled to the height indicated by a dotted line, and the sample 5 is placed on the bottom (in the water) of the tank 12. The sample 5 is the semiconductor wafer including the multi-layer structure and the like, as described above. The water 14 is a medium required for effectively propagating the ultrasonic wave emitted by the ultrasonic probe 2 into the sample 5. Denoted by 16 is a mechanical controller, which drives the scanner 13 in the X, Y, and Z directions. - For the
sample 5, the ultrasonic probe 2 emits the ultrasonic wave from an ultrasonic output unit at its lower edge and receives the reflected echo returned from the sample 5. The ultrasonic probe 2 is attached to a holder 15 and movable in the X, Y, and Z directions by the scanner 13 driven by the mechanical controller 16. Thus, the ultrasonic probe 2 can receive the reflected echo at a plurality of measurement points of the sample 5 set in advance while travelling in the X and Y directions, obtain a two-dimensional image of a bonding surface within a measurement range (X-Y plane), and thus inspect the defect. The ultrasonic probe 2 is connected via a cable 22 to the flaw detector 3, which converts the reflected echo into a reflection intensity signal. - The
ultrasonic inspection apparatus 100 further includes the A/D convertor 6 that converts the reflection intensity signal output from the flaw detector 3 of the detection unit 1 into a digital waveform, the image processing unit 7 that processes the image signal having been A/D converted by the A/D convertor 6, the total control unit 8 that controls the detection unit 1, the A/D convertor 6, and the image processing unit 7, and the mechanical controller 16. - The
image processing unit 7 processes the image signal having been A/D converted by the A/D convertor 6 and detects an internal defect of the sample 5. The image processing unit 7 includes the image generation unit 7-1, the defect detection unit 7-2, the data output unit 7-3, and a parameter setting unit 7-4. - The image generation unit 7-1 generates an image from the digital data obtained by A/D converting the reflected echoes returned from the sample surface, each bonding surface, and the like within the measurement range of the
sample 5 set in advance, together with the position information of the ultrasonic probe obtained by the mechanical controller 16. The defect detection unit 7-2 processes the image generated by the image generation unit 7-1 and thereby visualizes and detects the internal defect. The data output unit 7-3 outputs the inspection result of the internal defect visualized and detected by the defect detection unit 7-2. The parameter setting unit 7-4 receives a parameter such as a measurement condition input from the outside, and sets the parameter to the defect detection unit 7-2 and the data output unit 7-3. In the image processing unit 7, for example, the parameter setting unit 7-4 is connected to a storage unit 18 that stores therein a database. - The
total control unit 8 includes a CPU that performs various controls and receives parameters and the like from the user. The total control unit 8 is appropriately connected to a user interface unit (GUI unit) 17, which includes a display means for displaying information such as an image of the defect detected by the image processing unit 7, the number of defects, and the coordinates and dimensions of each individual defect, together with an input means, and to the storage unit 18 storing therein the feature amount, image, and the like of the defect detected by the image processing unit 7. The mechanical controller 16 drives the scanner 13 based on a control instruction from the total control unit 8. It should be noted that the image processing unit 7, the flaw detector 3, and the like are also driven by instructions from the total control unit 8. -
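The conversion performed by the image generation unit 7-1 — cutting a time gate out of each digitized echo waveform and mapping the gated peak to a gray value — can be sketched as follows. The function names and the linear 0-to-255 mapping are assumptions for illustration; the gate corresponds to the visualization gate described later with FIG. 5B:

```python
import numpy as np

def sectional_pixel_value(waveform, gate_start, gate_end, full_scale=1.0):
    """Cut the time gate out of one reflected-echo waveform and convert the
    peak reflection intensity inside the gate to a 256-tone gray value.

    waveform   : 1-D array of reflection intensities sampled over time
    gate_start : first sample index of the gate (depth of the bonding surface)
    gate_end   : one past the last sample index of the gate
    full_scale : reflection intensity that maps to gray value 255 (assumed)
    """
    gated = waveform[gate_start:gate_end]            # desired time domain
    peak = np.max(np.abs(gated))                     # peak value in the gate
    return int(np.clip(peak / full_scale * 255, 0, 255))

def build_sectional_image(waveforms, gate_start, gate_end, full_scale=1.0):
    """Assemble the sectional image of one bonding surface from the echo
    waveforms recorded at every (x, y) scanning position."""
    ny, nx, _ = waveforms.shape
    image = np.zeros((ny, nx), dtype=np.uint8)
    for yy in range(ny):
        for xx in range(nx):
            image[yy, xx] = sectional_pixel_value(
                waveforms[yy, xx], gate_start, gate_end, full_scale)
    return image
```

Setting a different gate selects a different bonding surface, which is how one sectional image per layer is obtained.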
FIG. 4 shows a configuration of an inspection object 400 as an example of the sample 5. The inspection object 400 shown in FIG. 4 schematically represents the appearance of a wafer including the multi-layer structure, which is the main inspection object. The inspection object 400 is a laminated wafer formed by laminating and bonding wafers 41 to 45 of different types such as MEMS, CPU, memory, CMOS, and the like. The number of laminated wafers is not limited to five and may be any number larger than one. The ultrasonic inspection apparatus 100 according to the present invention is used to inspect whether the wafers 41 to 45 in the inspection object 400 are properly bonded together over the whole lamination surface (bonding surface) without forming any depleted region such as a void or a stripping. -
FIG. 5A is an example schematically showing a vertical structure of the inspection object 400 having the multi-layer structure shown in FIG. 4. When an ultrasonic wave 50 emitted from the ultrasonic probe 2 enters a surface 401 of the inspection object 400, the ultrasonic wave 50 propagates through the inspection object 400 and is reflected from the inspection object surface 401 and bonding surfaces 402 to 405, and the ultrasonic probe 2 receives them as a single reflected echo. - A
graph 51 in FIG. 5B shows an exemplary reflected echo from the inspection object received by the ultrasonic probe 2, with its abscissa representing time and its ordinate representing reflection intensity. Time also indicates the depth within the inspection object 400. In the graph 51, by applying a visualization gate 52 (hereinbelow, simply referred to as "gate 52") to a time domain that may include the reflected echo from the bonding surface to be observed, the desired time domain is cut out and a peak value in the gate 52 is detected. - The image generation unit 7-1 of the
image processing unit 7 detects the peak value at each scanning position from the reflected echo obtained while scanning the measurement range (X-Y plane) by the scanner 13 and converts the peak value into a gray value (for example, 0 to 255 in the case of generating a 256-tone image), thereby generating the sectional image of the bonding surface (an image of a section, i.e., a plane parallel to the wafer surface, at a given depth from the wafer surface) from the gray value information at each scanning position. - Now, when the inspection object has the multi-layer structure like the
inspection object 400 and has a plurality of bonding surfaces (such as 402 to 405) to be inspected, it is possible to set the gate 52 to the reflected echo in the time domain corresponding to each bonding surface and generate the sectional image of each bonding surface. - Shown in
FIGS. 6A and 6B are exemplary sectional images of the bonding surface generated. FIG. 6A schematically shows a top view of a laminated wafer 60 that is the inspection object. The laminated wafer 60 is eventually diced along the straight lines shown in FIG. 6A to become finished products. Hereinbelow, "chip" refers to the diced product. Denoted by 62 in (a) of FIG. 6B is an exemplary sectional image of the bonding surface obtained from a region 61 delimited by a broken line and including three chips on the laminated wafer 60. Denoted by 63, 64, and 65 in (b) of FIG. 6B are partial sectional images made by segmenting the sectional image 62 in (a) of FIG. 6B into three regions corresponding to each chip. Since the partial sectional images 63 and 65 in (b) of FIG. 6B correspond to chips carrying the same devices, the pattern configurations included in the obtained partial sectional images (hereinbelow, referred to as "pattern group") are also the same, while the left half of the partial sectional image 64 in (b) of FIG. 6B is constituted by two patterns, indicating that its pattern group is different from that of the partial sectional images 63 and 65.
sectional images sectional image 64 belong to a group B), and the defect detection process is performed with respect to each group. -
FIG. 1 is a conceptual diagram of this case. Denoted by 101 is the appearance of a wafer carrying a mixture of various devices thereon as an example of the inspection object. The inspection object (wafer) 101 includes chips formed thereon in a grid shape, and the different hatch patterns indicate different types of the devices constituting the chips. In other words, inspection images constituted by the same pattern group are basically obtained from regions of the same hatch pattern. - In the defect inspection according to the invention, the
detection unit 1 obtains a surface image or an internal sectional image of the inspection object 101 (S11), and the image processing unit 7 first extracts partial images constituted by the same pattern group from the obtained surface or internal sectional image of the inspection object 101 (S12). The partial images corresponding to the regions of the inspection object 101 are extracted and aligned as shown by 102 (S13). Because the extracted partial images 103 to 108 have the same pattern group, image alignment here means performing a position correction so that regions of the same pattern are present at the same coordinate values in each image. - Features are then calculated at each pixel of each image and integrated between images, as denoted by 109 and 110 (S14). This step is performed on all the pixels in the partial images to generate a reference partial image 111 (S15) and a multi-value mask 112 (S16). Integral comparison (S17) with the generated reference partial image 111 and the
multi-value mask 112 is then performed on each of the partial images 103 to 108 to detect a defect 113. Finally, the detected defects 113 are combined at the wafer level (S18) and the result is displayed (S19). The same process is performed on the partial images constituted by the other pattern groups (images corresponding to the striped, dotted, or checkerboard hatch patterns on the wafer 101).
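Step S12's extraction of same-pattern-group partial images amounts to collecting chip positions per group label (the labels are received from the user, as described next). A minimal sketch with a hypothetical function name:

```python
def group_chips_by_label(labels):
    """Group chip coordinates by the pattern-group label applied to each chip.

    labels : 2-D list where labels[row][col] is the label of that chip
             (for example 'A' to 'D' as in the grouping of FIG. 7)
    Returns a dict mapping each label to the list of (row, col) positions,
    so that the partial images of one group can be processed together.
    """
    groups = {}
    for r, row in enumerate(labels):
        for c, label in enumerate(row):
            groups.setdefault(label, []).append((r, c))
    return groups
```

Each resulting list then feeds the reference-image generation and comparison steps (S13 to S17) for that group.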
FIG. 7 shows an example thereof. Denoted by 60 in (a) ofFIG. 7 is a layout of chips formed on thewafer 101 used as the inspection object. This is displayed on a screen by theuser interface unit 17 shown inFIG. 3 , and the parameter setting unit 7-4 receives the labels applied to the individual chip on the screen by the user. In this process, theinspection object 101 is grouped based on the labels applied by the user. - Denoted by 701 in (b) of
FIG. 7 is an example of the result, which is formed by segmenting the wafer 101 used as the inspection object into partial images in units of chips and grouping the partial images into four categories A to D based on the labels applied by the user. Even without a user setting, automatic grouping is possible using the exposure recipe. The exposure recipe includes exposure position information indicating where each circuit pattern is printed on the substrate, the exposure order, and the like, from which the information about the pattern formed at each position can be obtained. - Next, a configuration of the process performed by the defect detection unit 7-2 of the
image processing unit 7 is described. FIG. 8 shows an example thereof. The defect detection process is performed using the partial images constituted by the same pattern group. An inspection recipe 801 constituted by various parameter values used for the processing and an image 802 of the whole wafer surface are input. The defect detection unit 7-2 generally includes a partial image group generation unit 81, a reference image generation unit 82, a defect detection processing unit 83, and a defect information output unit 84. First, when the whole wafer surface image 802 is input to the defect detection unit 7-2, a plurality of partial images to which the partial image group generation unit 81 has applied the same label (for example, 103 to 108 in FIG. 1) are input to the reference image generation unit 82. The reference image generation unit 82 generates a reference partial image 804 and a multi-value mask 805. The reference partial image 804 means the normal image constituted by the same pattern group as that of the input partial images. - Shown in
FIGS. 9A and 9B are an example of a method of generating the reference partial image. Denoted by 90 a, 91 a, 92 a, . . . in FIG. 9B are the partial images of the same label cut out of the inspection object 101. These partial images include the same pattern group (denoted herein by three different hatch patterns 911 to 913). Defects 921 to 923 (indicated by white color) may possibly be included. There may also be a positional shift of the pattern due to a slight difference in the position at which the image is obtained during scanning (sampling error), indicated by the difference in positions of the hatch patterns 911 to 913 with respect to the black background. Thus, correction of the position of each image, namely inter-image position correction, is performed so as to align the coordinates of the hatch patterns 911 to 913 against the black background (S901). - The position correction between the partial images at Step S901 is performed using a general matching method: one partial image is specified; a shift amount is calculated that minimizes the sum of squared luminance differences between the specified image and the partial image to be corrected while the latter is shifted with respect to the former, or that maximizes the normalized cross-correlation coefficient; and the partial image is shifted by the calculated shift amount. Denoted by 90 b, 91 b, 92 b, . . . in
FIG. 9B are the partial images after the position correction. - The features of the pixels in the
partial images 90 b, 91 b, 92 b, . . . are then calculated (S902). Exemplary features of each pixel include the contrast of Equation 1, the average luminance of Equation 2, and the luminance variance of Equation 3, each computed in a small neighborhood of the pixel:
[Equation 1] -
F1(x, y)=max{f(x, y), f(x+1, y), f(x, y+1), f(x+1, y+1)}−min{f(x, y), f(x+1, y), f(x, y+1), f(x+1, y+1)} (Equation 1)
[Equation 2] -
F2(x, y)=Σf(x+i, y+j)/M (Equation 2)
-
[Equation 3] -
F3(x, y)=[Σ{f(x+i, y+j)}²−{Σf(x+i, y+j)}²/M]/(M−1) (Equation 3)
- where f(x, y) is the luminance value of the coordinate (x, y) in the partial image.
- Next, as described above, the feature of each pixel (x, y) calculated for each partial image is integrated between the partial images (S903) to generate the reference
partial image 804. One example of this processing method includes: collecting features Fi(x, y) of the corresponding coordinate (x, y) between partial images (i is the number designated to the partial image), and thereby statistically determining the reference feature value S(x, y) of the feature of each pixel as represented byEquation 4. The luminance value of the partial image equal to the reference feature value is determined as the luminance value of the reference partial image. In this manner, the referencepartial image 804 exclusive of influences from a defect is generated. -
[Equation 4] -
S(x, y)=Median{F1(x, y), F2(x, y), F3(x, y), . . . } (Equation 4) - Median: Function outputting a median value (median) of the feature of each partial image
- S(x, y): Reference feature value
- F* (x, y): Feature value of the
partial images - It is noted that, as represented by
Equation 5, the statistical processing may be performed by calculating an average of the feature at the corresponding coordinate between images and using the luminance value of the partial image having its feature closest to the average as the luminance value of the reference partial image. -
[Equation 5] -
S(x, y)=Σ{Fi(x, y)}/N (Equation 5)
N: partial image - As shown in
FIG. 8 , the referenceimage generation unit 82 generates, in addition to the reference partial image, themulti-value mask 805 for eliminating (masking) a non-defective pixel between images. One example of the generation procedure is shown inFIG. 10 . The multi-value mask according to this embodiment is set by calculating multiple values (0 to 255) with respect to each pixel in the image. For thepartial images FIG. 9B , the luminance value f(x, y) of the corresponding pixel is integrated, and the dispersion value of the luminance values is calculated as the feature according toEquation 6. - In
FIG. 10 , agraph 1001 shows a distribution of luminance values of a coordinate indicated by a white square 1011 in thepartial images graph 1002 shows the distribution of the luminance values of the coordinate indicated by a black square 1012 in thepartial images - Another feature is also calculated from the same pixel.
Reference numeral 1003 shows a pattern near the coordinate indicated by theblack square 1012. There is alongitudinal pattern 1004 with high luminance. Acurve 1021 in thegraph 1020 shows a luminance profile of a location indicated by an arrow 1005 (→ ←) on thelongitudinal pattern 1004 in thepattern 1003. Acurve 1022 shows a luminance profile when thelongitudinal pattern 1004 in thepattern 1003 is shifted by an amount α. Thus, Δ in thegraph 1020 indicates the luminance difference caused by the positional shift by the amount α. The Δ is regarded as the second feature of the pixel indicated by theblack square 1012. The luminance difference Δ is calculated for all the pixels within the partial images in the same manner. Then, based on the values of the two features σ and Δ calculated from all the pixels within the partial images, a multi-value mask value M is calculated according toEquation 7. Anaspect 1031 in the three-dimensional graph 1030 corresponds to the value M of the multi-value mask calculated from Δ and σ. -
[Equation 6] -
σ(x, y)=[Σ{fi(x, y)}²−{Σfi(x, y)}²/N]/(N−1) (Equation 6)
N: partial image -
[Equation 7] -
M(x, y)=k×σ(x, y)+m×Δ(x, y)+n (Equation 7) - Since σ and Δ are calculated from the features of each pixel, the value M of the multi-value mask is calculated separately with respect to each pixel according to σ and Δ. This may cause a difference in the pattern luminance values between partial images, despite the same pattern group, due to a fabrication tolerance or a sampling error at the time of image acquisition, and the difference is reflected on the mask.
- The parameters α (described in
FIG. 10 ), k, m, and n are set in advance, and the distribution of the multi-value mask M indicated by theaspect 1031 in the three-dimensional graph 1030 can be adjusted by adjusting these parameters. In addition, although the example was given in which the multi-value mask M was calculated based on σ and Δ calculated by integrating the features of each pixel between the partial images, any feature indicative of the property of the pixel can be used, and the way of integrating the feature may also be changed accordingly. Furthermore, the number of the features to be integrated is not limited to two but the multi-value mask M can be calculated from any number more than one of the integration features. Although the value of n was described as a fixed value, it can be set with respect to each pixel in the partial image. - Although the above description was given taking an example of generating the reference partial image exclusive of any defect from partial images, the reference image may also be generated by cutting out partial images constituted by the same pattern group from a good sample guaranteed to be free of defect.
- Hereinbelow, the defect
detection processing unit 83, which detects a defect from the partial images 103 to 108 using the reference partial image 804 and the multi-value mask 805 shown in FIG. 8, is described. -
FIG. 11 shows an example of the process performed by the defect detection processing unit 83. The reference partial image 804 and the multi-value mask 805 output from the reference image generation unit 82, together with a partial image group 803 of the inspection object (partial images in the same group), are input; these images have already been subjected to the inter-image position correction described with reference to FIG. 9. -
- One example of the method thereof described herein includes the step of correcting the brightness of the partial image to match that of the reference
partial image 804 based on least squares approximation. Assuming that there is a linear relation represented by Equation 8 between the pixels f(x, y) of each image in the partial image group 803 and the corresponding pixels g(x, y) of the reference partial image 804, a and b are calculated so that Equation 9 is minimized, and they are used as the correction coefficients "offset" and "gain". The coefficients are calculated from the corresponding pixels of each image in the partial image group 803 and the reference partial image 804, and the brightness correction of Equation 10 is then applied to all the pixel values f(x, y) of the partial image to be corrected.
[Equation 8] -
g(x,y)=a+b·f(x,y) (Equation 8) -
[Equation 9] -
Σ{g(x, y)−(a+b·f(x, y))}² (Equation 9)
[Equation 10] -
f′(x,y)=gain·f(x,y)+offset (Equation 10) - The defect accuracy is then calculated for each pixel in a partial image 1110 (S1102). An exemplary defect accuracy is a value indicating how far a pixel deviates from its normal appearance, namely the degree of deviation from the luminance value of the reference
partial image 804, calculated according to Equation 11. -
[Equation 11] -
d(x, y)=f′(x, y)−g(x, y) (Equation 11) - The masking process is performed on the defect accuracy calculated according to
Equation 11 using the multi-value mask 805 for each pixel, and the remaining pixels are detected as defective (S1103). - The masking process detects a defect when the defect accuracy exceeds the mask value, as represented by
Equation 12. -
[Equation 12] -
P(x, y): defect (if d(x, y)≥M(x, y))
P(x, y): normal (if d(x, y)<M(x, y)) (Equation 12) - where M(x, y)=k×σ(x, y)+m×Δ(x, y)+n(x, y)
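The S1101 to S1103 pipeline of Equations 8 to 12 can be sketched as below. This is a minimal sketch assuming grayscale NumPy arrays, with the least-squares fit done by np.polyfit and the multi-value mask M supplied externally; the uniform mask and the injected defect are illustrative:

```python
import numpy as np

def match_brightness(f, g):
    """S1101, Equations 8-10: fit g ~ a + b*f by least squares over all
    pixels, then return f corrected to the brightness of the reference g."""
    b, a = np.polyfit(f.ravel(), g.ravel(), 1)   # gain = b, offset = a
    return b * f + a                             # Equation 10: f' = gain*f + offset

def detect(f, g, M):
    """S1102-S1103: defect accuracy d = f' - g (Equation 11), thresholded
    per pixel against the multi-value mask M (Equation 12, bright side)."""
    d = match_brightness(f, g) - g
    return d >= M

g = np.arange(100, dtype=float).reshape(10, 10)  # reference partial image
f = (g - 5.0) / 2.0                              # same scene, different gain/offset
f[5, 0] += 30.0                                  # inject one bright defect pixel
M = np.full_like(g, 20.0)                        # uniform mask for the sketch
defect_map = detect(f, g, M)                     # True only at the injected pixel
```

Fitting over all pixels tolerates a single outlier here because the images are large relative to the defect; the defect's residual stays well above the mask while normal pixels settle near zero.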
- It is noted that, although the example of detecting the defect by masking the pixels brighter than the luminance value of the reference
partial image 804 is described above, the same applies to pixels darker than the luminance value of the reference partial image 804. As already described, the multi-value mask 805 accounts for the influence of the fabrication tolerance and of the sampling error between images at the time of image acquisition. Thus, the multi-value mask 805 can mask pixels whose defect accuracy, calculated according to Equation 11, merely reflects noise from the fabrication tolerance or the sampling error. - Finally, one or more defect features of each defective pixel are calculated to determine whether it is truly defective (S1104). Examples of such features include the area, maximum length, luminance value, and edge intensity of the defect.
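The defect features of S1104 can be sketched as below; the specific features chosen here (pixel-count area, larger bounding-box side as a stand-in for maximum length, peak luminance) are illustrative, not the patent's exact definitions:

```python
import numpy as np

def defect_features(defect_map, image):
    """Illustrative S1104 features for the detected defective pixels:
    area (pixel count), a maximum-length proxy (larger bounding-box
    side), and peak luminance inside the defect."""
    ys, xs = np.nonzero(defect_map)
    if ys.size == 0:
        return {"area": 0, "max_length": 0, "peak_luminance": 0.0}
    return {
        "area": int(ys.size),
        "max_length": int(max(np.ptp(ys), np.ptp(xs)) + 1),
        "peak_luminance": float(image[defect_map].max()),
    }

dm = np.zeros((5, 5), dtype=bool)
dm[1, 1:4] = True                    # a 3-pixel horizontal defect
img = np.where(dm, 200.0, 100.0)
feats = defect_features(dm, img)
```

Such features feed the final good/defective judgment per chip described below.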
- The process steps S1101 to S1104 by the defect
detection processing unit 83 described above are performed on the partial images constituted by the same pattern group after grouping, and the same processing is repeated for each group. - As described above, the information about the defect detected by the per-group process is then rearranged into a chip array on the inspection object by the defect
information output unit 84. Its concept is shown in FIGS. 12A and 12B. - A
wafer 120 shown in FIG. 12A is an inspection object segmented into regions constituted by the same pattern group and labeled. Based on this, it is assumed that the defect detection process is performed on each of the groups A to D to detect a defect 1202 a in a region 1202 of the group A, a defect 1201 a in a region 1201 of the group B, a defect 1203 a in a region 1203 of the group C, and a defect 1204 a in a region 1204 of the group D, as shown in FIG. 12B . -
information output unit 84 in FIG. 8 rearranges the output result from the segmented partial images based on region arrangement information of the inspection object (wafer) 120. That is, it maps the detected results 1201 a to 1204 a at the positions of regions 1201 to 1204 in FIG. 12B , generates a defect distribution image 121 on the wafer, and outputs the defect distribution image 121. The defects 1202 a and 1203 a, though detected in separate per-group processes, are thus output in a single result. At the same time, the coordinates indicating the defect position in each partial image are converted into the coordinate system of the inspection object 101, and the separately calculated defect features (area, maximum length, etc.) are also integrated. The defect information after this conversion and integration is output to the data output unit 7-3 and displayed by a display means such as a display unit via the user interface unit (GUI unit) 17. It is also possible to simultaneously determine whether each chip is good or defective based on the defect features and to display the result. For example, the number of defects, the maximum defective area, and the ratio of defective pixels in the chip are measured, and a chip exceeding a judgment condition input as part of the inspection recipe is output and displayed as a faulty chip. - Although
FIG. 12B shows an example of mapping the detected result and outputting the defect distribution image on the wafer as denoted by 121, it is also possible to display defective chips in a color different from that of defect-free chips on the wafer, as shown in FIG. 12C . - For inspecting the wafer, the defect detection process also offers a plurality of detection methods other than using the luminance difference from the reference image as the defect accuracy, as described above. Its concept is shown in
FIGS. 13A and 13B .FIG. 13A shows anwafer 130 used as the inspection object segmented into regions constituted by the same pattern group and labeled. In this example, the group A includes seven regions, the group B includes nine regions, the group C includes three regions, and the group D includes two regions. - The defect detection process according to this embodiment can change the method of detecting the defect depending on the number of the regions having the same label. For example, as described above, the reference partial image with the influence by the defect removed is statistically generated by integrating the features of each partial image. However, as the number of the partial images decrease, reliability of the statistical processing decreases. Therefore when the number of the regions is smaller than a certain number (for example, less than four regions), the statistical processing is not performed, but comparison between actual subjects, comparison with a model, comparison with a fixed threshold, and the like may be performed. An exemplary processing in the case of three partial images like the group C is as follows.
- Denoted by 131, 132, 133 in
FIG. 13B are partial images generated by cutting out the regions corresponding to the label C in the inspection object (wafer) 130 and performing position correction and brightness matching, where the partial images (hereinbelow referred to simply as images) 132 and 133 include defects. A difference image 131 a takes the difference between the images 131 and 132, a difference image 132 a takes the difference between the images 132 and 133, and a difference image 133 a takes the difference between the images 133 and 131. A difference image 131 b is the minimum value between the difference image 133 a and the difference image 131 a, a difference image 132 b is the minimum value between the difference image 131 a and the difference image 132 a, and a difference image 133 b is the minimum value between the difference image 132 a and the difference image 133 a; the difference image 131 b is the defect accuracy of the image 131, the difference image 132 b is the defect accuracy of the image 132, and the difference image 133 b is the defect accuracy of the image 133. The defects in the images 132 and 133 thus remain in the defect accuracies 132 b and 133 b, respectively, and are detected from them. - As another example of processing, when there are two or fewer partial images, as in the group D, it is also possible to detect the defect by performing a process similar to that shown in
FIG. 11 with a reference partial image extracted from a good sample as an input. As yet another example, a binary mask designed in advance to fully mask the non-inspection area can be used, the luminance value itself in the unmasked area treated as the defect accuracy, and the defect detected against a given threshold. - As described above, this embodiment is characterized in that the whole region of the inspection object is grouped by the pattern group constituting each region and the defect is detected with respect to each group. This enables highly accurate defect detection even on a wafer not constituted by regular pattern groups. Furthermore, it is also effective when the inspection object is a multi-layer bonded wafer, especially one whose layers have irregular pattern groups.
-
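The three-image comparison of FIG. 13B can be sketched as below. Using absolute differences is an assumption; the pairing of the difference images and the per-image minimum follow the description, so a defect present in only one image survives in that image's defect accuracy and cancels in the others:

```python
import numpy as np

def three_image_accuracy(i1, i2, i3):
    """Pairwise difference images (131a, 132a, 133a in the text), then the
    defect accuracy of each image as the per-pixel minimum of the two
    differences involving it (131b, 132b, 133b)."""
    d12 = np.abs(i1 - i2)   # 131a: between images 1 and 2
    d23 = np.abs(i2 - i3)   # 132a: between images 2 and 3
    d31 = np.abs(i3 - i1)   # 133a: between images 3 and 1
    return (np.minimum(d31, d12),   # 131b: accuracy of image 1
            np.minimum(d12, d23),   # 132b: accuracy of image 2
            np.minimum(d23, d31))   # 133b: accuracy of image 3

base = np.zeros((3, 3))
img2 = base.copy()
img2[1, 1] = 50.0                    # defect only in the second image
a1, a2, a3 = three_image_accuracy(base, img2, base)
# The defect survives only in a2; a1 and a3 stay zero.
```

Because each accuracy is the minimum of two differences, a deviation must appear against both partner images to survive, which is why a defect unique to one image cannot leak into the accuracies of the others.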
Reference numerals 141 to 143 in FIG. 14 schematically show the arrays of each layer in a three-layer bonded wafer used as the inspection object. Each layer of the wafer is constituted by chips having a plurality of different pattern groups (indicated by different hatch patterns). - When the chips on the wafer are viewed in the depth direction, the combinations of the pattern groups (group A146 and group B147) differ.
Lines 144, 145 in FIG. 14 indicate the chips superimposed in the depth direction. On the first layer of the wafer 141, the same patterns are formed on a chip 1441 on the line 144 and a chip 1451 on the line 145. On the second layer of the wafer 142, however, the patterns formed on a chip 1442 on the line 144 and a chip 1452 on the line 145 are different, and on the third layer of the wafer 143 the patterns formed on a chip 1443 on the line 144 and a chip 1453 on the line 145 are also different. In such a case, it is possible to use the grouping information of any one of the wafers 141 to 143 depending on where the bonding surface to be inspected is located, and it is also possible to generate combined grouping information of these wafers to be used commonly for all the bonding surfaces.
FIG. 14 show the label information of each chip when the chips on thelines 144 and 145 are superimposed in the depth direction. The label information may be newly grouped depending on the difference of the label combination designating the region having the chip combination with the same pattern as the chip pattern on the line 144 formed thereon as a label A and the region having the chip combination with the same pattern as the chip pattern on theline 145 as a label B, and it is stored as the label information uniquely determined for the bonded wafer. The label information is automatically set according to the combination pattern in the depth direction based on the label information of each layer of the wafer. - According to this embodiment, even though the image obtained from the inspection object includes aperiodic patterns, the image is segmented and grouped into regions having the same pattern group and a defection is detected within the partial images belonging to the same group. Thus, even when the image obtained from the inspection object includes such aperiodic patterns, it is possible to segment and group such an image into the regions having the same pattern group and detect the defect within the partial images belonging to the same group.
- The implementation of the inspection method according to the present invention and of the apparatus thereof is described above taking as the inspection object a substrate having a multi-layer structure and a complicated pattern, such as a semiconductor wafer or a MEMS (Micro Electro Mechanical Systems) wafer; the method is also applicable to the inspection of IC packages mounted on an IC tray or the like.
- One example is shown in
FIGS. 15A and 15B . Denoted by 150 inFIG. 15A is an IC tray, and labels A, B, C, and D in each pocket of theIC tray 150 indicate different types and model numbers of the IC packages placed in the IC tray.FIG. 15B shows a processing procedure according to this embodiment. With the inspection method of the present invention and the apparatus thereof,tray matrix information 152 of the IC package including the type and the model number (model number of IC package placed in each pocket on the tray, and the like) is received along with aninspection recipe 151, tray pockets are grouped based on the tray matrix information 152 (S1500), images of the pockets belonging to the same group are collected fromimages 153 of the IC packages on the tray pockets obtained (S1501), and the defect detection process described with reference toFIG. 8 in the first embodiment is performed at the defect detection unit 7-2. The same process is performed on each group. This enables a highly sensitive inspection even when multiple types of IC packages are placed on a single IC tray. - The above processing is also effective for the inspection of the IC package formed on a strip substrate. Instead of labeling each pocket on the IC tray, labels may be applied according to the type of the device placed therein or the pattern group of the obtained image, and the same processing is applied thereafter.
- Embodiments of the present invention are described above taking the example of defect inspection using the ultrasonic inspection apparatus in a case where multiple types of devices are formed on a wafer or placed on an IC tray, but the invention is also effective for the inspection of a discrete IC package. In this case, a reference image is generated in advance from a good sample for each type;
the corresponding reference image is input according to the type of the inspection object; and the defect accuracy is calculated to make a determination. When only one type of construction is formed on the wafer, or only one type of IC package is placed on the IC tray, and the obtained image of the inspection object is constituted by a regular pattern, the present inspection method can be used by applying the same label to all the regions.
- The present invention is applicable not only to images obtained by the ultrasonic inspection apparatus but also to non-destructive inspection images obtained by an X-ray defect inspection apparatus and to appearance inspection images.
- The present invention has been specifically described above based on its embodiments, and it is obvious that the present invention is not limited to the embodiments and various modifications can be made without departing from the spirit of the invention.
- The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiment is therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims, rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Claims (14)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/289,404 US10529068B2 (en) | 2015-10-08 | 2019-02-28 | Defect inspection method and apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015200089A JP6546826B2 (en) | 2015-10-08 | 2015-10-08 | Defect inspection method and apparatus therefor |
JP2015-200089 | 2015-10-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/289,404 Continuation US10529068B2 (en) | 2015-10-08 | 2019-02-28 | Defect inspection method and apparatus |
Publications (3)
Publication Number | Publication Date |
---|---|
US20180101944A1 US20180101944A1 (en) | 2018-04-12 |
US20190050978A9 true US20190050978A9 (en) | 2019-02-14 |
US10354372B2 US10354372B2 (en) | 2019-07-16 |
Family
ID=58405624
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/287,418 Active US10354372B2 (en) | 2015-10-08 | 2016-10-06 | Defect inspection method and apparatus |
US16/289,404 Active US10529068B2 (en) | 2015-10-08 | 2019-02-28 | Defect inspection method and apparatus |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/289,404 Active US10529068B2 (en) | 2015-10-08 | 2019-02-28 | Defect inspection method and apparatus |
Country Status (6)
Country | Link |
---|---|
US (2) | US10354372B2 (en) |
JP (1) | JP6546826B2 (en) |
KR (1) | KR101820332B1 (en) |
CN (1) | CN107024541B (en) |
DE (1) | DE102016012000A1 (en) |
TW (1) | TWI627405B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11030738B2 (en) | 2019-07-05 | 2021-06-08 | International Business Machines Corporation | Image defect identification |
US11295439B2 (en) | 2019-10-16 | 2022-04-05 | International Business Machines Corporation | Image recovery |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11062174B2 (en) * | 2017-02-28 | 2021-07-13 | Nec Solution Innovators, Ltd. | Mobile entity detection apparatus, mobile entity detection method, and computer-readable recording medium |
JP6588675B2 (en) * | 2017-03-10 | 2019-10-09 | 富士フイルム株式会社 | Image processing system, image processing apparatus, image processing method, and image processing program |
JP6854186B2 (en) * | 2017-05-09 | 2021-04-07 | 株式会社日立パワーソリューションズ | Ultrasound image device and ultrasonic image generation method |
CN107505395A (en) * | 2017-08-31 | 2017-12-22 | 北京金风慧能技术有限公司 | Inside workpiece damage detecting method and device |
US10607119B2 (en) * | 2017-09-06 | 2020-03-31 | Kla-Tencor Corp. | Unified neural network for defect detection and classification |
EP3748575A4 (en) * | 2018-01-29 | 2021-03-10 | NEC Corporation | Image processing device, image processing method, and recording medium |
JP7042149B2 (en) * | 2018-04-12 | 2022-03-25 | 株式会社日立パワーソリューションズ | Ultrasonic inspection equipment and ultrasonic inspection method |
CN108760885A (en) * | 2018-06-14 | 2018-11-06 | 德淮半导体有限公司 | Ultrasonic scanning method and ultrasonic scanning device |
CN109085237A (en) * | 2018-06-20 | 2018-12-25 | 德淮半导体有限公司 | A kind of ultrasonic scanning device and scan method |
KR101975816B1 (en) * | 2018-07-10 | 2019-08-28 | 주식회사 에이치비테크놀러지 | Apparatus and Method for Discriminating Defects in Auto Repair System |
CN112424826A (en) | 2018-07-13 | 2021-02-26 | Asml荷兰有限公司 | Pattern grouping method based on machine learning |
KR20200044252A (en) * | 2018-10-18 | 2020-04-29 | 삼성디스플레이 주식회사 | Display apparatus inspection system, inspection method of display apparatus and display apparatus using the same |
US10902620B1 (en) * | 2019-04-18 | 2021-01-26 | Applied Materials Israel Ltd. | Registration between an image of an object and a description |
US11379969B2 (en) | 2019-08-01 | 2022-07-05 | Kla Corporation | Method for process monitoring with optical inspections |
JP7257290B2 (en) * | 2019-08-27 | 2023-04-13 | 株式会社日立パワーソリューションズ | ULTRASOUND INSPECTION DEVICE AND ULTRASOUND INSPECTION METHOD |
EP4033241A4 (en) * | 2019-09-19 | 2022-10-19 | JFE Steel Corporation | Mobile inspection device, mobile inspection method, and method for manufacturing steel material |
JP7448331B2 (en) * | 2019-10-09 | 2024-03-12 | 株式会社京都製作所 | Determination device, sealing system, estimation model, generation device, determination method, sealing method, and generation method |
CN110988146A (en) * | 2019-11-01 | 2020-04-10 | 航天科工防御技术研究试验中心 | Packaged chip detection method |
CN112816557B (en) * | 2019-11-18 | 2022-02-18 | 中国商用飞机有限责任公司 | Defect detection method, device, equipment and storage medium |
CN111077223A (en) * | 2019-12-19 | 2020-04-28 | 西安增材制造国家研究院有限公司 | Additive manufacturing method with three-dimensional display, online detection and repair functions |
JP7317747B2 (en) * | 2020-02-28 | 2023-07-31 | 株式会社Ihiエアロスペース | Inspection device and inspection method |
US11686707B2 (en) * | 2020-03-30 | 2023-06-27 | Verifi Technologies, Llc | System and method for real-time visualization of defects in a material |
US11726065B2 (en) | 2020-03-30 | 2023-08-15 | Verifi Technologies, Llc | System and method for real-time visualization of defects in a material |
US11754529B2 (en) | 2020-03-30 | 2023-09-12 | Verifi Technologies, Llc | System and method for evaluating defects in a material |
US11860131B2 (en) | 2020-03-30 | 2024-01-02 | Verifi Technologies, Llc | System and method for portable ultrasonic testing |
US11650183B2 (en) | 2020-03-30 | 2023-05-16 | Verifi Technologies, Llc | System and method for real-time degree of cure evaluation in a material |
JP7428616B2 (en) * | 2020-08-27 | 2024-02-06 | 株式会社日立パワーソリューションズ | Ultrasonic inspection equipment and ultrasonic inspection method |
JP7476057B2 (en) * | 2020-09-11 | 2024-04-30 | キオクシア株式会社 | Defect Inspection Equipment |
JP7490533B2 (en) | 2020-10-30 | 2024-05-27 | 株式会社東芝 | Ultrasonic image processing device and ultrasonic image processing method |
CN112348803B (en) * | 2020-11-19 | 2024-03-29 | 西安维控自动化科技有限公司 | Ultrasonic edge detection method and system |
CN112345556B (en) * | 2020-11-23 | 2023-07-21 | 兰州大学 | Fault diagnosis system and method for integrated circuit |
TWI746320B (en) * | 2020-12-18 | 2021-11-11 | 財團法人工業技術研究院 | Method and system for generating and updating position distribution graph |
CN112581463B (en) * | 2020-12-25 | 2024-02-27 | 北京百度网讯科技有限公司 | Image defect detection method and device, electronic equipment, storage medium and product |
CN112862832B (en) * | 2020-12-31 | 2022-07-12 | 盛泰光电科技股份有限公司 | Dirt detection method based on concentric circle segmentation positioning |
DE112022000587T5 (en) * | 2021-02-09 | 2023-10-26 | Hitachi Power Solutions Co., Ltd. | ULTRASONIC TESTING APPARATUS, ULTRASONIC TESTING METHOD AND PROGRAM |
JP2022138014A (en) * | 2021-03-09 | 2022-09-22 | キオクシア株式会社 | Method for manufacturing semiconductor device, semiconductor manufacturing system, and semiconductor device |
CN113670828A (en) * | 2021-08-11 | 2021-11-19 | 中国电子科技集团公司第十四研究所 | Lamination quality detection method for multilayer printed board |
JP7290780B1 (en) | 2022-09-01 | 2023-06-13 | 株式会社エクサウィザーズ | Information processing method, computer program and information processing device |
CN115479949B (en) * | 2022-09-14 | 2023-04-07 | 交铁检验认证实验室(成都)有限公司 | Bridge safety monitoring and early warning method and system based on big data |
CN116091505B (en) * | 2023-04-11 | 2023-06-30 | 青岛芯康半导体科技有限公司 | Automatic defect detection and classification method and system for sapphire substrate |
CN117611587B (en) * | 2024-01-23 | 2024-06-04 | 赣州泰鑫磁性材料有限公司 | Rare earth alloy material detection system and method based on artificial intelligence |
Family Cites Families (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58122456A (en) * | 1982-01-14 | 1983-07-21 | Hitachi Ltd | Ultrasonic microscope |
JPH0763691A (en) * | 1993-08-24 | 1995-03-10 | Toshiba Corp | Method and apparatus for inspection of pattern defect |
JP2999679B2 (en) * | 1994-11-30 | 2000-01-17 | 大日本スクリーン製造株式会社 | Pattern defect inspection equipment |
JPH0933599A (en) * | 1995-05-15 | 1997-02-07 | Hitachi Ltd | Pattern inspection method and apparatus |
JPH10104168A (en) * | 1996-09-26 | 1998-04-24 | Toshiba Corp | Graphic data developing device based on design data |
US6606909B2 (en) * | 2001-08-16 | 2003-08-19 | Lockheed Martin Corporation | Method and apparatus to conduct ultrasonic flaw detection for multi-layered structure |
US6900888B2 (en) * | 2001-09-13 | 2005-05-31 | Hitachi High-Technologies Corporation | Method and apparatus for inspecting a pattern formed on a substrate |
JP3647416B2 (en) * | 2002-01-18 | 2005-05-11 | Necエレクトロニクス株式会社 | Pattern inspection apparatus and method |
JP4275345B2 (en) * | 2002-01-30 | 2009-06-10 | 株式会社日立製作所 | Pattern inspection method and pattern inspection apparatus |
JP4230880B2 (en) * | 2003-10-17 | 2009-02-25 | 株式会社東芝 | Defect inspection method |
JP2005158780A (en) | 2003-11-20 | 2005-06-16 | Hitachi Ltd | Method and device for inspecting defect of pattern |
JP2005189655A (en) * | 2003-12-26 | 2005-07-14 | Nec Electronics Corp | Mask inspection method |
US20090136118A1 (en) * | 2005-04-11 | 2009-05-28 | Advantest Corporation | Electronic Device Handling Apparatus |
JP4728762B2 (en) * | 2005-10-03 | 2011-07-20 | 株式会社東芝 | Ultrasonic flaw detection image processing device |
JP2007149837A (en) * | 2005-11-25 | 2007-06-14 | Tokyo Seimitsu Co Ltd | Device, system, and method for inspecting image defect |
US8103087B2 (en) * | 2006-01-20 | 2012-01-24 | Hitachi High-Technologies Corporation | Fault inspection method |
CN101140619A (en) * | 2006-09-05 | 2008-03-12 | 大日本网目版制造株式会社 | Image processing device, data processing device and parameter adjusting method |
JP4862031B2 (en) * | 2008-10-20 | 2012-01-25 | 株式会社ニューフレアテクノロジー | Mask defect review method and mask defect review apparatus |
JP5275017B2 (en) * | 2008-12-25 | 2013-08-28 | 株式会社日立ハイテクノロジーズ | Defect inspection method and apparatus |
JP5075850B2 (en) | 2009-01-30 | 2012-11-21 | 株式会社日立エンジニアリング・アンド・サービス | Ultrasonic inspection apparatus and ultrasonic inspection method |
JP2011047724A (en) * | 2009-08-26 | 2011-03-10 | Hitachi High-Technologies Corp | Apparatus and method for inspecting defect |
US8787663B2 (en) | 2010-03-01 | 2014-07-22 | Primesense Ltd. | Tracking body parts by combined color image and depth processing |
JP4970569B2 (en) | 2010-06-17 | 2012-07-11 | 株式会社東芝 | Pattern inspection apparatus and pattern inspection method |
CN102053093A (en) * | 2010-11-08 | 2011-05-11 | 北京大学深圳研究生院 | Method for detecting surface defects of chip cut from wafer surface |
JP5710385B2 (en) | 2011-06-02 | 2015-04-30 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | Presence of voids in through silicon vias (TSV) based on ultrasonic scanning |
DE102011079382B4 (en) * | 2011-07-19 | 2020-11-12 | Carl Zeiss Smt Gmbh | Method and device for analyzing and eliminating a defect in an EUV mask |
US8502146B2 (en) * | 2011-10-03 | 2013-08-06 | Kla-Tencor Corporation | Methods and apparatus for classification of defects using surface height attributes |
JP5640027B2 (en) * | 2012-02-17 | 2014-12-10 | 株式会社日立ハイテクノロジーズ | Overlay measurement method, measurement apparatus, scanning electron microscope, and GUI |
JP2013213681A (en) * | 2012-03-30 | 2013-10-17 | Asahi Glass Co Ltd | Assembly substrate inspection apparatus, inspection method, and manufacturing method |
JP6241120B2 (en) * | 2012-09-14 | 2017-12-06 | 株式会社リコー | Image inspection apparatus, image inspection method, and control program for image inspection apparatus |
TW201419853A (en) * | 2012-11-09 | 2014-05-16 | Ind Tech Res Inst | Image processor and image dead pixel detection method thereof |
KR20140066584A (en) | 2012-11-23 | 2014-06-02 | 삼성메디슨 주식회사 | Ultrasound system and method for providing guide line of needle |
JP6049101B2 (en) | 2013-01-17 | 2016-12-21 | 株式会社日立ハイテクノロジーズ | Inspection device |
JP6361140B2 (en) * | 2013-03-15 | 2018-07-25 | 株式会社リコー | Image inspection apparatus, image inspection system, and image inspection method |
JP6368081B2 (en) | 2013-11-06 | 2018-08-01 | 株式会社ニューフレアテクノロジー | Measuring device |
JP6512965B2 (en) * | 2015-07-01 | 2019-05-15 | キヤノン株式会社 | Image processing apparatus and image processing method |
-
2015
- 2015-10-08 JP JP2015200089A patent/JP6546826B2/en active Active
-
2016
- 2016-09-27 CN CN201610854918.1A patent/CN107024541B/en active Active
- 2016-10-04 KR KR1020160127470A patent/KR101820332B1/en active IP Right Grant
- 2016-10-06 US US15/287,418 patent/US10354372B2/en active Active
- 2016-10-06 TW TW105132346A patent/TWI627405B/en active
- 2016-10-06 DE DE102016012000.2A patent/DE102016012000A1/en active Pending
-
2019
- 2019-02-28 US US16/289,404 patent/US10529068B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
TW201713946A (en) | 2017-04-16 |
JP6546826B2 (en) | 2019-07-17 |
US20190197680A1 (en) | 2019-06-27 |
CN107024541B (en) | 2019-12-20 |
DE102016012000A1 (en) | 2017-04-13 |
US10354372B2 (en) | 2019-07-16 |
KR101820332B1 (en) | 2018-01-19 |
US10529068B2 (en) | 2020-01-07 |
US20180101944A1 (en) | 2018-04-12 |
JP2017072501A (en) | 2017-04-13 |
KR20170042232A (en) | 2017-04-18 |
TWI627405B (en) | 2018-06-21 |
CN107024541A (en) | 2017-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10529068B2 (en) | Defect inspection method and apparatus | |
JP6608292B2 (en) | Ultrasonic inspection method and apparatus | |
US11144778B2 (en) | Descriptor guided fast marching method for analyzing images and systems using the same | |
TWI735862B (en) | Ultrasonic inspection device and ultrasonic inspection method | |
JP6310814B2 (en) | Image processing method and ultrasonic inspection method and apparatus using the same | |
CN107580710A (en) | For the system and method for the inspection sensitivity for strengthening the instruments of inspection | |
US20060222232A1 (en) | Appearance inspection apparatus and appearance inspection method | |
JP7257290B2 (en) | ULTRASOUND INSPECTION DEVICE AND ULTRASOUND INSPECTION METHOD | |
CN101236914B (en) | Wafer appearance detection device | |
JP2010091361A (en) | Method and device for inspecting image | |
EP2063259A1 (en) | Method for inspecting mounting status of electronic component | |
JP2006317408A (en) | Warpage checker | |
WO2024116934A1 (en) | Defect inspection system and defect inspection method | |
JP3695993B2 (en) | Semiconductor device inspection apparatus and inspection method | |
JP7508384B2 (en) | Ultrasonic inspection device, ultrasonic inspection method, and program | |
TWI824581B (en) | Ultrasonic inspection device and ultrasonic inspection method | |
CN111106025B (en) | Edge defect inspection method | |
JP2008078204A (en) | Testing method of semiconductor device | |
TW202235871A (en) | Ultrasonic wave inspection device, ultrasonic inspection method, and program | |
JP4248319B2 (en) | Inspection condition evaluation program and inspection apparatus | |
JP2003077972A (en) | Manufacturing method of semiconductor device | |
JP2008064553A (en) | Pattern inspection device and pattern inspecting method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HITACHI POWER SOLUTIONS CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKAI, KAORU;KIKUCHI, OSAMU;SAHARA, KENJI;SIGNING DATES FROM 20160410 TO 20160930;REEL/FRAME:040275/0126 |
|
FEPP | Fee payment procedure |
Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |