CN114905169A - Observation device and observation method - Google Patents


Info

Publication number
CN114905169A
Authority
CN
China
Prior art keywords
crack
image
imaging
region
internal
Prior art date
Legal status
Pending
Application number
CN202210098627.XA
Other languages
Chinese (zh)
Inventor
坂本刚志
佐野育
Current Assignee
Hamamatsu Photonics KK
Original Assignee
Hamamatsu Photonics KK
Priority date
Filing date
Publication date
Application filed by Hamamatsu Photonics KK
Publication of CN114905169A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K26/70 Auxiliary operations or equipment
    • B23K26/702 Auxiliary equipment
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K26/02 Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K26/03 Observing, e.g. monitoring, the workpiece
    • B23K26/032 Observing, e.g. monitoring, the workpiece using optical means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/59 Transmissivity
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K2101/00 Articles made by soldering, welding or cutting
    • B23K2101/36 Electric or electronic devices
    • B23K2101/40 Semiconductor devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/8858 Flaw counting
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/888 Marking defects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00 Features of devices classified in G01N21/00
    • G01N2201/10 Scanning
    • G01N2201/104 Mechano-optical scan, i.e. object and beam moving
    • G01N2201/1047 Mechano-optical scan, i.e. object and beam moving with rotating optics and moving stage

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Biochemistry (AREA)
  • Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Immunology (AREA)
  • General Health & Medical Sciences (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Microscopes, Condensers (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

The observation apparatus and observation method of the present invention can acquire information on the position of a modified region more accurately. The observation device includes: an imaging unit that images an object with transmitted light having transmissivity with respect to the object; and a control unit that controls at least the imaging unit. The object has a 1st surface and a 2nd surface opposite to the 1st surface, and a modified region aligned in an X direction along the 1st surface and the 2nd surface and cracks extending from the modified region are formed inside the object. The control unit controls the imaging unit to execute the following imaging process: the transmitted light is made incident into the object from the 1st surface, and a target crack among the cracks, extending in a Z direction intersecting the 1st surface and the 2nd surface and in a direction intersecting the X direction, is imaged with the transmitted light.

Description

Observation device and observation method
Technical Field
The present invention relates to an observation apparatus and an observation method.
Background
There is known a laser processing apparatus which irradiates a wafer, including a semiconductor substrate and a functional element layer formed on the front surface of the semiconductor substrate, with laser light from the back surface side of the semiconductor substrate in order to cut the wafer along a plurality of lines, thereby forming a plurality of rows of modified regions inside the semiconductor substrate along each of the lines. The laser processing apparatus described in patent document 1 (Japanese Patent Application Laid-Open No. 2017-64746) includes an infrared camera and can observe, from the back surface side of the semiconductor substrate, a modified region formed inside the semiconductor substrate, processing damage formed in the functional element layer, and the like.
Disclosure of Invention
As described above, when a modified region formed inside the semiconductor substrate is observed with the infrared camera, it may be unclear which portion of the modified region is being detected in the thickness direction of the semiconductor substrate. Therefore, in the above-described technical field, it is required to acquire more accurate information on the position of the modified region, such as the upper end position, the lower end position, and the width of the modified region in the thickness direction of the semiconductor substrate.
An object of the present invention is to provide an observation apparatus and an observation method capable of acquiring information on the position of a modified region more accurately.
The present inventors have made intensive studies to solve the above problem and obtained the following findings. That is, when a modified region is formed inside an object such as the above semiconductor substrate by laser processing, cracks extending in various directions from the modified region may also be formed. Among these cracks, a crack that extends along the Z direction, which intersects the laser incident surface of the object, and in a direction intersecting the X direction, which is the advancing direction of the laser processing, can be detected more accurately by transmitted light passing through the object than the modified region itself. Therefore, if information on the position at which such a crack is imaged can be obtained, information on the position of the modified region can be obtained more accurately based on that position. The present invention has been completed based on these findings.
That is, the observation device of the present invention includes: an imaging unit that images an object with transmitted light having transmissivity with respect to the object; and a control unit that controls at least the imaging unit. The object has a 1st surface and a 2nd surface opposite to the 1st surface, and a modified region aligned in an X direction along the 1st surface and the 2nd surface and cracks extending from the modified region are formed inside the object. The control unit controls the imaging unit to execute the following imaging process: the transmitted light is made incident into the object from the 1st surface, and a target crack among the cracks, extending in a Z direction intersecting the 1st surface and the 2nd surface and in a direction intersecting the X direction, is imaged with the transmitted light.
Further, the observation method of the present invention includes: a preparation step of preparing an object that has a 1st surface and a 2nd surface opposite to the 1st surface and in which a modified region aligned in an X direction along the 1st surface and the 2nd surface and cracks extending from the modified region are formed; and an imaging step of, after the preparation step, making transmitted light having transmissivity with respect to the object enter the object from the 1st surface and imaging a target crack with the transmitted light, the target crack being a crack among the cracks that extends in a Z direction intersecting the 1st surface and the 2nd surface and in a direction intersecting the X direction.
In the object to which these apparatuses and methods are applied, a modified region aligned in the X direction and cracks extending from the modified region are formed. In such an object, the target crack, which extends in the Z direction and intersects the X direction, can be imaged using the transmitted light passing through the object. As described above, such a target crack can be imaged (detected) more accurately in the Z direction than the modified region itself. Therefore, if information such as the amount of movement of the condenser lens at the time of imaging the target crack can be acquired, information on the position of the modified region can be acquired more accurately based on that amount of movement.
The observation apparatus of the present invention may include a moving unit that relatively moves a condenser lens, which condenses the transmitted light, with respect to the object. In the imaging process, the control unit controls the imaging unit and the moving unit to relatively move the condenser lens in the Z direction so that the condensing point of the transmitted light is located at a plurality of positions inside the object, and images the object at each position to acquire a plurality of internal images. After the imaging process, the control unit executes the following arithmetic processing: the crack position, which is the position of the target crack in the Z direction, is calculated based on the plurality of internal images and the amount of movement of the condenser lens in the Z direction at the time each internal image was captured. By calculating the crack position from the amount of movement of the condenser lens at the time of imaging the target crack in this way, information on the position of the modified region can be acquired more accurately.
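As an illustration of the arithmetic processing described above, the following Python sketch selects the internal image in which the target crack appears sharpest and converts the corresponding lens movement into a depth. It is not the patent's implementation: the Laplacian-variance sharpness metric stands in for the image determination described later, and the refractive-index scaling n_refr and the function name are assumptions introduced here.

```python
import numpy as np
import cv2  # OpenCV; used only for a Laplacian-based sharpness metric


def crack_position_from_stack(internal_images, movements_um, n_refr=3.5):
    """Pick the internal image in which the target crack is sharpest and
    convert the corresponding lens movement into a depth below the 1st surface.

    internal_images: list of grayscale images (numpy arrays), one per Z step
    movements_um:    lens movement in the Z direction for each image, measured
                     from the position where the focus lies on the 1st surface
    n_refr:          assumed refractive index used for a simple air-to-substrate
                     depth scaling (the real device also uses a correction ring,
                     so the exact mapping may differ)
    """
    sharpness = [cv2.Laplacian(img, cv2.CV_64F).var() for img in internal_images]
    best = int(np.argmax(sharpness))          # image with the clearest crack
    return n_refr * movements_um[best], best  # (crack position in um, index)
```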
In the observation device of the present invention, the control unit may determine the internal image, among the plurality of internal images, in which the image of the target crack is clear, and calculate the crack position based on the amount of movement of the condenser lens at the time the determined internal image was captured. By having the control unit determine the internal image in which the target crack appears clearly, the crack position can be calculated more accurately.
In the observation device of the present invention, the control unit may execute the following estimation processing after the arithmetic processing: at least one of the position of the 1st-surface-side end of the modified region in the Z direction, the position of the 2nd-surface-side end of the modified region in the Z direction, and the width of the modified region in the Z direction is estimated based on the formation conditions of the modified region and the crack position. The shape and size of the modified region can vary depending on its formation conditions, such as the processing conditions of the laser processing (for example, the wavelength, pulse width, pulse energy, and aberration correction amount of the laser light). Therefore, by using the formation conditions of the modified region together with the crack position in this way, information on the position of the modified region can be estimated more accurately.
In the observation apparatus of the present invention, the control unit may execute a 1st imaging process and a 2nd imaging process in the imaging process. In the 1st imaging process, the transmitted light is made incident on the object from the 1st surface, and the condenser lens is relatively moved in the Z direction so that the object is imaged at a plurality of positions while the condensing point of the transmitted light that is not reflected by the 2nd surface is moved from the 1st surface side toward the 2nd surface side, whereby a plurality of 1st internal images are acquired as the internal images. In the 2nd imaging process, the transmitted light is made incident on the object from the 1st surface, and the condenser lens is relatively moved in the Z direction so that the object is imaged at a plurality of positions while the condensing point of the transmitted light reflected by the 2nd surface is moved from the 2nd surface side toward the 1st surface side, whereby a plurality of 2nd internal images are acquired as the internal images. If the object is imaged both with transmitted light that enters from the 1st surface and is not reflected by the 2nd surface (direct observation) and with transmitted light that enters from the 1st surface and is reflected by the 2nd surface (back-surface reflection observation), and the internal images are acquired separately, information on the position of the modified region can be acquired more accurately using the crack positions obtained from the amounts of movement of the condenser lens at the time of imaging these internal images.
In the observation apparatus of the present invention, the control unit may execute a 1st arithmetic processing and a 2nd arithmetic processing in the arithmetic processing. In the 1st arithmetic processing, the 1st internal image in which the target crack is clear among the plurality of 1st internal images is determined, and a 1st crack position is calculated as the crack position based on the amount of movement of the condenser lens at the time the determined 1st internal image was captured. In the 2nd arithmetic processing, the 2nd internal image in which the target crack is clear among the plurality of 2nd internal images is determined, and a 2nd crack position is calculated as the crack position based on the amount of movement of the condenser lens at the time the determined 2nd internal image was captured. In the estimation processing, the control unit estimates the width of the modified region in the Z direction based on the formation conditions of the modified region and the interval between the 1st crack position and the 2nd crack position. In this case, as described above, information on the width of the modified region can be acquired more accurately based on the interval between the crack position obtained by direct observation and the crack position obtained by back-surface reflection observation.
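As a minimal sketch of the width estimation just described, the function below takes the interval between the two crack positions as the base value. The condition_offset_um term is a hypothetical calibration that would be derived from the formation conditions (pulse energy, pulse width, aberration correction, and the like); it is not specified in the patent.

```python
def estimate_modified_region_width(crack_pos_direct_um: float,
                                   crack_pos_backreflect_um: float,
                                   condition_offset_um: float = 0.0) -> float:
    """Rough estimate of the modified-region width in the Z direction.

    The base value is the interval between the crack position from direct
    observation and the one from back-surface reflection observation;
    condition_offset_um is a hypothetical, condition-dependent correction.
    """
    return abs(crack_pos_direct_um - crack_pos_backreflect_um) + condition_offset_um
```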
The observation device of the present invention may further include a display unit for displaying information, and the control unit may control the display unit to execute, after the arithmetic processing, a display process of displaying information on the crack position on the display unit. In this case, the user can grasp the information on the crack position through the display unit. The information on the crack position is at least one of the crack position itself and the various pieces of information, estimable based on the crack position, relating to the position of the modified region.
According to the present invention, it is possible to provide an observation apparatus and an observation method capable of acquiring information on the position of a modified region more accurately.
Drawings
Fig. 1 is a configuration diagram of a laser processing apparatus according to an embodiment.
Fig. 2 is a top view of a wafer according to one embodiment.
Fig. 3 is a cross-sectional view of a portion of the wafer shown in fig. 2.
Fig. 4 is a structural diagram of the laser irradiation unit shown in fig. 1.
Fig. 5 is a structural diagram of the inspection imaging unit shown in fig. 1.
Fig. 6 is a structural diagram of the alignment correction image pickup unit shown in fig. 1.
Fig. 7 is a cross-sectional view of a wafer for explaining the imaging principle of the inspection imaging unit shown in fig. 5, and images of each part obtained by the inspection imaging unit.
Fig. 8 is a cross-sectional view of a wafer for explaining the imaging principle of the inspection imaging unit shown in fig. 5, and images of each part obtained by the inspection imaging unit.
Fig. 9 is an SEM image of the modified region and the crack formed in the inside of the semiconductor substrate.
Fig. 10 is an SEM image of the modified region and the crack formed in the inside of the semiconductor substrate.
Fig. 11 is a schematic diagram for explaining the imaging principle of the inspection imaging unit shown in fig. 5.
Fig. 12 is a schematic diagram for explaining the imaging principle of the inspection imaging unit shown in fig. 5.
Fig. 13 is a diagram showing an object in which a modified region is formed.
Fig. 14 is a graph relating to the positions of the modified region and the crack in the Z direction.
Fig. 15 is a view obtained by plotting the detection result on a cross-sectional photograph of the object.
Fig. 16 is a flowchart showing an example of the observation method according to the present embodiment.
Fig. 17 is a diagram showing a step of the observation method shown in fig. 16.
Fig. 18 is a diagram showing a step of the observation method shown in fig. 16.
Fig. 19 shows a plurality of internal images captured at positions different from each other in the Z direction.
Fig. 20 is a diagram illustrating crack detection.
Fig. 21 is a diagram illustrating crack detection.
Fig. 22 is a diagram for explaining flaw detection.
Fig. 23 is a diagram for explaining flaw detection.
Fig. 24 is a diagram for explaining flaw detection.
Detailed Description
One embodiment will be described in detail below with reference to the drawings. In the description of the drawings, the same or corresponding portions are denoted by the same reference numerals, and redundant description thereof may be omitted. In the drawings, a rectangular coordinate system defined by an X axis, a Y axis, and a Z axis may be shown. For example, the X direction and the Y direction are a 1st horizontal direction and a 2nd horizontal direction intersecting (orthogonal to) each other, and the Z direction is a vertical direction intersecting (orthogonal to) the X direction and the Y direction.
As shown in fig. 1, the laser processing apparatus 1 includes a mounting table 2, a laser irradiation unit 3 (irradiation section), a plurality of imaging units 4, 5, and 6, a driving unit 7, a control section 8, and a display 150 (display section). The laser processing apparatus 1 is an apparatus for forming a modified region 12 in an object 11 by irradiating the object 11 with a laser beam L.
The mounting table 2 supports the object 11 by, for example, adsorbing a film attached to the object 11. The mounting table 2 is movable in the X direction and the Y direction, respectively, and is rotatable about an axis parallel to the Z direction as a center line.
The laser irradiation unit 3 condenses the laser light L, which has transmissivity with respect to the object 11, and irradiates the object 11 with it. When the laser light L is condensed inside the object 11 supported by the stage 2, the laser light L is absorbed particularly in a portion corresponding to the condensing point C of the laser light L, whereby the modified region 12 can be formed inside the object 11.
The modified region 12 is a region whose density, refractive index, mechanical strength, or other physical properties differ from those of the surrounding unmodified region. Examples of the modified region 12 include a melt-processed region, a crack region, a dielectric breakdown region, and a refractive index change region. The modified region 12 has a characteristic that cracks easily extend from it toward the incident side of the laser light L and the opposite side. This characteristic of the modified region 12 is utilized for cutting the object 11.
For example, when the stage 2 is moved in the X direction and the condensing point C is thereby moved relative to the object 11 in the X direction, a plurality of modified spots 12s are formed so as to be arranged in one row along the X direction. One modified spot 12s is formed by irradiation with one pulse of the laser light L. One row of the modified region 12 is a collection of a plurality of modified spots 12s arranged in one row. Adjacent modified spots 12s may be connected to each other or separated from each other depending on the relative moving speed of the condensing point C with respect to the object 11 and the repetition frequency of the laser light L.
The imaging unit 4 images the modified region 12 formed inside the object 11 and the tips of cracks extending from the modified region 12.
The imaging units 5 and 6 image the object 11 supported by the stage 2 with light transmitted through the object 11 under the control of the control unit 8. The images captured by the imaging units 5 and 6 are used, for example, for alignment of the irradiation position of the laser light L.
The drive unit 7 supports the laser irradiation unit 3 and the plurality of imaging units 4, 5, and 6. The driving unit 7 moves the laser irradiation unit 3 and the plurality of imaging units 4, 5, and 6 in the Z direction.
The controller 8 controls operations of the stage 2, the laser irradiation unit 3, the plurality of imaging units 4, 5, and 6, and the driving unit 7. The control unit 8 is configured as a computer device including a processor, a memory, a storage, a communication device, and the like. In the control unit 8, the processor executes software (program) read from the memory or the like, and controls reading and writing of data from and to the memory and communication by the communication device.
The display 150 has a function as an input unit for receiving input of information by a user and a function as a display unit for displaying information to the user.
[ Structure of object ]
The object 11 of the present embodiment is a wafer 20 as shown in fig. 2 and 3. The wafer 20 includes a semiconductor substrate 21 and a functional element layer 22. Although a mode in which the wafer 20 includes the functional element layer 22 is described in this embodiment, the wafer 20 may or may not include the functional element layer 22, and may be a carrier wafer. The semiconductor substrate 21 has a front surface 21a (2nd surface) and a back surface 21b (1st surface). The semiconductor substrate 21 is, for example, a silicon substrate. The functional element layer 22 is formed on the front surface 21a of the semiconductor substrate 21 and includes a plurality of functional elements 22a arranged two-dimensionally along the front surface 21a. The functional element 22a is, for example, a light receiving element such as a photodiode, a light emitting element such as a laser diode, or a circuit element such as a memory. The functional element 22a may be formed three-dimensionally by stacking a plurality of layers. The semiconductor substrate 21 is provided with a notch 21c indicating the crystal orientation, but an orientation flat may be provided instead of the notch 21c.
The wafer 20 is cut along the plurality of lines 15 into the individual functional elements 22a. The plurality of lines 15 each pass between the plurality of functional elements 22a when viewed in the thickness direction of the wafer 20. More specifically, each line 15 passes through the center (center in the width direction) of a grid line region 23 when viewed in the thickness direction of the wafer 20. The grid line regions 23 extend within the functional element layer 22 so as to pass between adjacent functional elements 22a. In the present embodiment, the plurality of functional elements 22a are arranged in a matrix along the front surface 21a, and the plurality of lines 15 are set in a lattice shape. The lines 15 are virtual lines, but may be actually drawn lines.
[ Structure of laser irradiation Unit ]
As shown in fig. 4, the laser irradiation unit 3 has a light source 31, a spatial light modulator 32, and a condenser lens 33. The light source 31 outputs the laser light L by, for example, a pulse oscillation method. The spatial light modulator 32 modulates the laser light L output from the light source 31, and is, for example, a reflective liquid crystal on silicon (LCOS) spatial light modulator (SLM). The condenser lens 33 condenses the laser light L modulated by the spatial light modulator 32. The condenser lens 33 may be a correction-ring lens.
In the present embodiment, the laser irradiation unit 3 irradiates the wafer 20 with the laser light L from the back surface 21b side of the semiconductor substrate 21 along each of the plurality of lines 15, thereby forming 2 rows of modified regions 12a, 12b inside the semiconductor substrate 21 along each of the plurality of lines 15. The modified region 12a is the modified region closest to the front surface 21a of the 2 rows of modified regions 12a, 12b. The modified region 12b is the modified region closest to the modified region 12a, and is the modified region closest to the back surface 21b.
The 2 rows of modified regions 12a, 12b are adjacent to each other in the thickness direction (Z direction) of the wafer 20, and are formed by relatively moving the 2 condensing points C1, C2 along the line 15 with respect to the semiconductor substrate 21. The laser light L is modulated by the spatial light modulator 32 so that, for example, the condensing point C2 is located behind the condensing point C1 in the traveling direction and on the incident side of the laser light L. The modified regions may be formed with a single focal point or multiple focal points, and in one pass or a plurality of passes.
The laser irradiation unit 3 irradiates the wafer 20 with the laser light L from the back surface 21b side of the semiconductor substrate 21 along each of the plurality of lines 15. For example, for a semiconductor substrate 21 that is a single-crystal silicon <100> substrate with a thickness of 400 μm, the 2 condensing points C1 and C2 are aligned at positions 54 μm and 128 μm from the front surface 21a, respectively, and the wafer 20 is irradiated with the laser light L from the back surface 21b side of the semiconductor substrate 21 along each of the plurality of lines 15. In this case, when conditions are selected such that the crack 14 spanning the 2 rows of modified regions 12a and 12b reaches the front surface 21a of the semiconductor substrate 21, for example, the wavelength of the laser light L is 1099 nm, the pulse width is 700 ns, and the repetition frequency is 120 kHz. Further, the output of the laser light L at the condensing point C1 is 2.7 W, the output at the condensing point C2 is 2.7 W, and the relative moving speed of the 2 condensing points C1, C2 with respect to the semiconductor substrate 21 is 800 mm/s. When the number of processing passes is 5, for example, ZH80 (a position 328 μm from the front surface 21a), ZH69 (a position 283 μm from the front surface 21a), ZH57 (a position 234 μm from the front surface 21a), ZH26 (a position 107 μm from the front surface 21a), and ZH12 (a position 49.2 μm from the front surface 21a) may be used as the processing positions of the wafer 20. In this case, for example, the laser light L may have a wavelength of 1080 nm, a pulse width of 400 ns, a repetition frequency of 100 kHz, and a moving speed of 490 mm/s.
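For reference, the example conditions quoted above can be collected as a plain configuration object, and a simple derived quantity such as the pulse pitch (feed speed divided by repetition frequency, which governs the spacing of the modified spots 12s) can be computed from it. The dictionary below is only an illustrative grouping of the stated values, not a format used by the apparatus.

```python
# Example processing conditions quoted above, gathered for reference only.
conditions_2rows = {
    "wavelength_nm": 1099,
    "pulse_width_ns": 700,
    "repetition_khz": 120,
    "power_c1_w": 2.7,
    "power_c2_w": 2.7,
    "feed_mm_per_s": 800,
    "focus_depth_um_from_front": (54, 128),  # condensing points C1, C2
}

# Pulse pitch = feed speed / repetition frequency (illustrative calculation)
pitch_um = conditions_2rows["feed_mm_per_s"] * 1e3 / (conditions_2rows["repetition_khz"] * 1e3)
print(f"pulse pitch ~ {pitch_um:.2f} um")  # ~6.67 um between adjacent modified spots
```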
Such 2 rows of modified regions 12a, 12b and the crack 14 are formed for the following purpose. That is, in a subsequent step, the back surface 21b of the semiconductor substrate 21 is ground to thin the semiconductor substrate 21 and expose the crack 14 at the back surface 21b, whereby the wafer 20 is cut along the plurality of lines 15 into a plurality of semiconductor devices.
[ Structure of imaging Unit for inspection ]
As shown in fig. 5, the imaging unit 4 (imaging section) includes a light source 41, a mirror 42, an objective lens 43 (condensing lens), and a light detection section 44, and images the wafer 20. The light source 41 outputs light I1 having transmissivity with respect to the semiconductor substrate 21. The light source 41 is composed of, for example, a halogen lamp and a filter, and outputs light I1 in the near-infrared region. The light I1 output from the light source 41 is reflected by the mirror 42, passes through the objective lens 43, and is irradiated onto the wafer 20 from the back surface 21b side of the semiconductor substrate 21. At this time, the stage 2 supports the wafer 20 in which the 2 rows of modified regions 12a and 12b have been formed as described above.
The objective lens 43 condenses the light (transmitted light) I1, which has transmissivity with respect to the semiconductor substrate 21, toward the semiconductor substrate 21, and passes the light I1 reflected by the front surface 21a of the semiconductor substrate 21, that is, the light I1 that has propagated through the semiconductor substrate 21. The numerical aperture (NA) of the objective lens 43 is, for example, 0.45 or more. The objective lens 43 has a correction ring 43a. The correction ring 43a corrects the aberration generated in the semiconductor substrate 21 for the light I1, for example, by adjusting the distances between the plurality of lenses constituting the objective lens 43. The means for correcting the aberration is not limited to the correction ring 43a, and may be another correcting means such as a spatial light modulator. The light detection unit 44 detects the light I1 that has passed through the objective lens 43 and the mirror 42. The light detection unit 44 is formed of, for example, an InGaAs camera, and detects light I1 in the near-infrared region. The means for detecting (imaging) the light I1 in the near-infrared region is not limited to the InGaAs camera, and may be another imaging means capable of transmission-type imaging, such as a transmission confocal microscope.
The imaging unit 4 can image the 2 rows of modified regions 12a, 12b and the tips of the cracks 14a, 14b, 14c, and 14d (details will be described later). The crack 14a extends from the modified region 12a toward the front surface 21a. The crack 14b extends from the modified region 12a toward the back surface 21b. The crack 14c extends from the modified region 12b toward the front surface 21a. The crack 14d extends from the modified region 12b toward the back surface 21b.
[ Structure of image pickup Unit for alignment correction ]
As shown in fig. 6, the imaging unit 5 has a light source 51, a mirror 52, a lens 53, and a light detection section 54. The light source 51 outputs light I2 having transmittance with respect to the semiconductor substrate 21. The light source 51 is composed of, for example, a halogen lamp and a filter, and outputs light I2 in the near infrared region. The light source 51 may also be shared with the light source 41 of the imaging unit 4. The light I2 output from the light source 51 is reflected by the mirror 52, passes through the lens 53, and is irradiated from the back surface 21b side of the semiconductor substrate 21 to the wafer 20.
The lens 53 passes the light I2 reflected by the front surface 21a of the semiconductor substrate 21. That is, the lens 53 passes the light I2 propagating through the semiconductor substrate 21. The numerical aperture of the lens 53 is 0.3 or less. That is, the numerical aperture of the objective lens 43 of the image pickup unit 4 is larger than the numerical aperture of the lens 53. The light detection unit 54 detects the light I2 passing through the lens 53 and the mirror 52. The light detection unit 54 is formed of, for example, an InGaAs camera, and detects light I2 in the near infrared region.
Under the control of the control unit 8, the imaging unit 5 images the functional element layer 22 by irradiating the wafer 20 with the light I2 from the back surface 21b side and detecting the light I2 returning from the front surface 21a (functional element layer 22). Similarly, under the control of the control unit 8, the imaging unit 5 acquires an image of the region including the modified regions 12a and 12b by irradiating the wafer 20 with the light I2 from the back surface 21b side and detecting the light I2 returning from the formation positions of the modified regions 12a and 12b in the semiconductor substrate 21. These images are used for alignment of the irradiation position of the laser light L. The imaging unit 6 has the same configuration as the imaging unit 5 except that its lens has a lower magnification than the lens 53 (for example, 6x for the imaging unit 5 and 1.5x for the imaging unit 6), and is used for alignment like the imaging unit 5.
[ imaging principle of imaging unit for inspection ]
Using the imaging unit 4 shown in fig. 5, as shown in fig. 7, the focal point F (the condensing point of the objective lens 43) is moved from the back surface 21b side toward the front surface 21a side with respect to a semiconductor substrate 21 in which the crack 14 spanning the 2 rows of modified regions 12a, 12b reaches the front surface 21a. In this case, when the focal point F is aligned, from the back surface 21b side, with the tip 14e of the crack 14 extending from the modified region 12b toward the back surface 21b, the tip 14e can be confirmed (right image in fig. 7). However, even when the focal point F is aligned, from the back surface 21b side, with the crack 14 itself or with the tip 14e of the crack 14 that reaches the front surface 21a, they cannot be confirmed (left image in fig. 7). When the focal point F is aligned with the front surface 21a of the semiconductor substrate 21 from the back surface 21b side, the functional element layer 22 can be confirmed.
Then, using the imaging unit 4 shown in fig. 5, as shown in fig. 8, the focal point F is moved from the back surface 21b side toward the front surface 21a side with respect to a semiconductor substrate 21 in which the crack 14 spanning the 2 rows of modified regions 12a, 12b does not reach the front surface 21a. In this case, even when the focal point F is aligned from the back surface 21b side with the tip 14e of the crack 14 extending from the modified region 12a toward the front surface 21a, the tip 14e cannot be confirmed (left image in fig. 8). However, when the focal point F is aligned from the back surface 21b side with a region on the opposite side of the front surface 21a from the back surface 21b (that is, a region on the functional element layer 22 side of the front surface 21a) so that the virtual focal point Fv, symmetrical to the focal point F with respect to the front surface 21a, is located at the tip 14e, the tip 14e can be confirmed (right image in fig. 8). The virtual focal point Fv is a point symmetrical to the focal point F with respect to the front surface 21a, taking the refractive index of the semiconductor substrate 21 into consideration.
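The relationship between the lens movement, the focal point F, and the virtual focal point Fv described above can be summarized in a small sketch. The paraxial scaling by an assumed refractive index and the function names are simplifications introduced here; the actual device corrects aberration with the correction ring 43a, so the exact mapping may differ.

```python
N_SI = 3.5  # approximate refractive index of silicon in the near-infrared (assumption)


def focus_depth_from_movement(dz_lens_um: float, n: float = N_SI) -> float:
    """Depth of the focal point F below the back surface 21b, in um.

    dz_lens_um is the movement of the objective lens toward the substrate,
    measured from the position where the focus lies on the back surface 21b.
    A simple paraxial scaling (depth ~= n * dz) is assumed.
    """
    return n * dz_lens_um


def virtual_focus_depth(focus_depth_um: float, thickness_um: float) -> float:
    """Depth of the virtual focal point Fv below the back surface 21b.

    Fv is the mirror image of F about the front surface 21a; this expression
    is valid when the (would-be) focal point F lies beyond the front surface.
    """
    return 2.0 * thickness_um - focus_depth_um
```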
The reason the crack 14 itself cannot be confirmed as described above is presumably that the width of the crack 14 is smaller than the wavelength of the light I1 used as illumination light. Fig. 9 and 10 are SEM (scanning electron microscope) images of the modified region 12 and the crack 14 formed inside the semiconductor substrate 21, which is a silicon substrate. Fig. 9(b) is an enlarged image of the region A1 shown in fig. 9(a), fig. 10(a) is an enlarged image of the region A2 shown in fig. 9(b), and fig. 10(b) is an enlarged image of the region A3 shown in fig. 10(a). As shown, the width of the crack 14 is about 120 nm, which is smaller than the wavelength (for example, 1.1 to 1.2 μm) of the light I1 in the near-infrared region.
The imaging principle assumed from the above is as follows. As shown in fig. 11(a), if the focal point F is located in the air, the light I1 does not return, and a dark image is obtained (right image in fig. 11(a)). As shown in fig. 11(b), if the focal point F is located inside the semiconductor substrate 21, the light I1 reflected by the front surface 21a returns, and a bright, clear image is obtained (right image in fig. 11(b)). As shown in fig. 11(c), if the focal point F is aligned with the modified region 12 from the back surface 21b side, the modified region 12 absorbs, scatters, or otherwise attenuates part of the light I1 that would return after being reflected by the front surface 21a, so that an image in which the modified region 12 appears dark on a bright, clear background is obtained (right image in fig. 11(c)).
As shown in fig. 12(a) and (b), if the focal point F is aligned from the back surface 21b side with the tip 14e of the crack 14, light is confined in the vicinity of the tip 14e, for example, due to optical peculiarities generated near the tip 14e (stress concentration, distortion, discontinuity of atomic density, and the like). Part of the light I1 that would return after being reflected by the front surface 21a is therefore scattered, reflected, interfered with, or absorbed, so that an image in which the tip 14e appears dark on a bright background is obtained (right images in fig. 12(a) and (b)). As shown in fig. 12(c), if the focal point F is aligned from the back surface 21b side with a portion other than the vicinity of the tip 14e of the crack 14, at least part of the light I1 reflected by the front surface 21a returns, and a bright, clear image is obtained (right image in fig. 12(c)).
[ embodiment for internal Observation ]
Fig. 13 is a diagram showing an object in which a modified region is formed. Fig. 13(a) is a photograph of a cross section of the cut object with the modified region exposed. Fig. 13(b) shows an example of an image of the object obtained by imaging with light transmitted through the object, and fig. 13(c) shows another example of such an image. As shown in fig. 13(a), the modified region 12 formed inside the object (here, the semiconductor substrate 21) by condensing the laser light L includes: a defect (void) region 12m located on the front surface 21a side, i.e., the side opposite to the incident surface of the laser light L on the semiconductor substrate 21; and a defect upper region 12n located, with respect to the defect region 12m, on the back surface 21b side, i.e., the incident-surface side of the laser light L.
When the semiconductor substrate 21 in which the modified region 12 is formed is imaged with light I1 having transmissivity with respect to the semiconductor substrate 21, an image of a crack 14k extending in a direction intersecting the Z direction and the X direction (at an angle with respect to the X direction) may be observed, as shown in fig. 13(b) and (c). When viewed from the Z direction, the crack 14k is substantially parallel to the Y direction in the example of fig. 13(b) and slightly inclined with respect to the Y direction in the example of fig. 13(c). When the semiconductor substrate 21 is imaged at a plurality of positions while moving the condensing point of the light I1 in the Z direction, the image of the crack 14k is detected clearly only within a limited range in the Z direction, narrower than that of the modified region 12.
Fig. 14 is a graph relating to the positions of the modified region and the crack in the Z direction. In fig. 14, the lower end of the defect, the upper end of the defect, the lower end of the defect upper region, and the upper end of the defect upper region are plotted as actually measured values obtained by observing a cross section. The lower end refers to the end on the front surface 21a side, and the upper end refers to the end on the back surface 21b side. For example, the lower end of the defect upper region refers to the end of the defect upper region 12n on the front surface 21a side.
The plotted values for direct observation and back-surface reflection observation in the graph of fig. 14 are values calculated based on the amount of movement of the objective lens 43 in the Z direction (hereinafter sometimes simply referred to as the "movement amount") at the time the internal image containing a clear image of the crack 14k, among the images captured with the light I1, was captured; as an example, the clear image was selected by AI-based image determination. Direct observation refers to the case where the light I1 is made incident from the back surface 21b and the condensing point of the light I1 is aligned with the crack 14k directly, without being reflected by the front surface 21a (in the above example, the focal point F is aligned with the crack 14k from the back surface 21b side). Back-surface reflection observation refers to the case where the light I1 is made incident from the back surface 21b and the condensing point of the light I1 reflected by the front surface 21a is aligned with the crack 14k (in the above example, the focal point F is aligned, from the back surface 21b side, with a region on the opposite side of the front surface 21a from the back surface 21b, so that the virtual focal point Fv, symmetrical to the focal point F with respect to the front surface 21a, is aligned with the crack 14k).
As shown in fig. 14, in the cases C1 to C4, in which the formation position of the modified region 12 in the Z direction differs among four levels, the crack 14k was detected between the lower end and the upper end of the defect upper region in direct observation. In back-surface reflection observation, the crack 14k was detected at substantially the same position as the lower end of the defect upper region in case C1, and between the lower end of the defect upper region and the upper end of the defect in cases C2 to C4. The width of the modified region 12 in the Z direction is the distance between the lower end of the defect and the upper end of the defect upper region. In this way, the crack 14k can be detected more accurately in the Z direction than the modified region 12 itself.
Therefore, by acquiring the movement amount for the internal image in which the crack 14k appears, information on the position of the modified region 12 can be acquired more accurately. In fig. 14, the vertical axis represents the distance from the surface on the opposite side of the incident surface of the light I1, that is, the front surface 21a of the semiconductor substrate 21. Fig. 15 shows the detection result for case C1 plotted on a cross-sectional photograph.
In the present embodiment, based on the above findings, the crack 14k is detected by internal observation, and information on the position of the modified region 12 is acquired. The observation method of the present embodiment will now be described. In this observation method, the crack 14k is the target crack to be detected.
Fig. 16 is a flowchart showing an example of the observation method according to the present embodiment. As shown in fig. 16, laser processing is first performed to prepare an object in which a modified region is formed (step S11: preparation step). However, the laser processing step is not essential as a step of the observation method; for example, an object in which the modified region 12 has been formed using another laser processing apparatus (or by the laser processing apparatus 1 at another time) may be prepared.
In step S11, as shown in fig. 17, an object including the semiconductor substrate 21 is prepared. The semiconductor substrate 21 includes a back surface (1st surface) 21b and a front surface (2nd surface) 21a opposite to the back surface 21b. In the semiconductor substrate 21, a line 15 extending in the X direction along the back surface 21b and the front surface 21a is set. The semiconductor substrate 21 is supported by the stage 2 with the back surface 21b facing the laser irradiation unit 3 so that the back surface 21b becomes the incident surface of the laser light L. In this state, the control unit 8 controls the laser irradiation unit 3 and the drive unit 7 and/or the movement mechanism of the stage 2 to relatively move the semiconductor substrate 21 in the X direction, thereby relatively moving the condensing point C of the laser light L along the line 15 with respect to the semiconductor substrate 21.
At this time, the control unit 8 causes the spatial light modulator 32 to display a modulation pattern that divides the laser light L into a plurality of beams (here, 2 beams), laser lights L1 and L2. As a result, the condensing points C1 and C2 of the laser lights L1 and L2 are formed inside the semiconductor substrate 21 so as to be separated by a distance Dz in the Z direction and a distance Dx in the X direction. Consequently, a plurality of rows (here, 2 rows) of modified regions 12a, 12b are formed in the semiconductor substrate 21 along the line 15. The X direction is therefore the processing direction in which the condensing points C1 and C2 advance.
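A generic way to produce two condensing points with a phase-only spatial light modulator is to superpose paraxial tilt (lateral shift) and defocus (axial shift) phase terms for each spot and display the argument of the summed field. The sketch below follows that textbook "prisms and lenses" approach; all numerical values (SLM resolution, pixel pitch, focal length, Dx, Dz) are illustrative assumptions, and this is not the patent's actual modulation pattern.

```python
import numpy as np

WAVELENGTH = 1.099e-6    # m, matches the 1099 nm example wavelength
FOCAL_LEN = 4.0e-3       # m, effective focal length of the condenser lens (assumed)
PIXEL = 12.5e-6          # m, SLM pixel pitch (assumed)
NX, NY = 1272, 1024      # SLM resolution (assumed)


def spot_phase(x, y, dx, dz):
    """Paraxial phase (rad) steering the focus by dx laterally and dz axially."""
    tilt = 2 * np.pi * dx * x / (WAVELENGTH * FOCAL_LEN)
    defocus = np.pi * dz * (x ** 2 + y ** 2) / (WAVELENGTH * FOCAL_LEN ** 2)
    return tilt + defocus


yy, xx = np.mgrid[0:NY, 0:NX]
x = (xx - NX / 2) * PIXEL
y = (yy - NY / 2) * PIXEL

# Two condensing points C1 and C2 separated by Dx and Dz (example values).
Dx, Dz = 10e-6, 74e-6
field = np.exp(1j * spot_phase(x, y, 0.0, 0.0)) + np.exp(1j * spot_phase(x, y, Dx, Dz))
pattern = np.mod(np.angle(field), 2 * np.pi)  # phase pattern to display on the SLM
```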
In this way, by controlling the laser irradiation unit 3 (irradiation unit), the control unit 8 performs laser processing in which the semiconductor substrate 21 is irradiated with the laser light L along the X direction, i.e., the extending direction of the line 15, thereby forming, in the semiconductor substrate 21, a plurality of modified regions 12 arranged in the X direction and cracks (cracks 14, 14k) extending from the modified regions 12. In fig. 17 and the subsequent drawings, the functional element layer 22 formed on the front surface 21a of the semiconductor substrate 21 is omitted.
Next, internal observation is performed. That is, the semiconductor substrate 21 is moved to the observation position (step S12). More specifically, the control unit 8 controls the drive unit 7 and/or the movement mechanism of the stage 2 to relatively move the semiconductor substrate 21 to a position directly below the objective lens 43 of the imaging unit 4. When a semiconductor substrate 21 in which the modified region 12 has already been formed is prepared separately, the semiconductor substrate 21 may be placed at the observation position by the user, for example.
Next, as shown in fig. 18, the semiconductor substrate 21 is imaged with light (transmitted light) I1 having transmissivity with respect to the semiconductor substrate 21 (step S13: imaging step). In step S13, the control unit 8 executes the following imaging process by controlling the imaging unit 4 (imaging unit): the light I1 is made incident into the semiconductor substrate 21 from the back surface 21b, and the crack 14k extending from the modified region 12 in the direction intersecting the Z direction and the X direction, that is, the target crack, is imaged with the light I1. The Y direction is an example of a direction intersecting the X direction, which is the processing advance direction, and the Z direction, which intersects the back surface 21b and the front surface 21a.
More specifically, in step S13, the control unit 8 controls the drive unit 7 (moving unit) and the imaging unit 4 to move the imaging unit 4 in the Z direction, thereby aligning the condensing point of the light I1 with a plurality of positions inside the semiconductor substrate 21 and imaging the semiconductor substrate 21 at each position, so that a plurality of internal images ID are acquired. In the present embodiment, the objective lens 43 moves integrally with the imaging unit 4; moving the imaging unit 4 therefore also moves the objective lens 43, and the amount of movement of the imaging unit 4 is equal to the amount of movement of the objective lens 43.
At this time, the control unit 8 controls the drive unit 7 to move the imaging unit 4 in the Z direction, and images the semiconductor substrate 21 a plurality of times while moving the condensing point (focal point F, virtual focal point Fv) of the light I1 in the Z direction. The range over which the light I1 is focused can be the full thickness of the semiconductor substrate 21, but here a partial range RA can be selected, the partial range RA including the positions in the Z direction with which the condensing points C1 and C2 of the laser lights L1 and L2 were aligned to form the modified regions 12a and 12b in the laser processing of step S11. The interval of movement of the imaging unit 4 in the Z direction when imaging a plurality of times, that is, the imaging interval for the semiconductor substrate 21, is arbitrary, but is preferably set finer from the viewpoint of detecting the crack 14k more accurately. The imaging interval is, for example, 1 μm or less, and is 0.2 μm here.
Further, here, the control unit 8 controls the imaging unit 4 and the drive unit 7 to perform direct observation and back-surface reflection observation of the semiconductor substrate 21. More specifically, the control unit 8 first executes the following 1st imaging process: the light I1 is made incident on the semiconductor substrate 21 from the back surface 21b, the imaging unit 4 is moved in the Z direction, and the semiconductor substrate 21 is imaged at a plurality of positions in the Z direction while the condensing point (focal point F) of the light I1 that is not reflected by the front surface 21a is moved from the back surface 21b side toward the front surface 21a side, whereby a plurality of 1st internal images ID1 are acquired as internal images ID. This 1st imaging process is direct observation.
The control unit 8 then executes the following 2nd imaging process: the light I1 is made incident on the object from the back surface 21b, the imaging unit 4 is moved in the Z direction, and the semiconductor substrate 21 is imaged at a plurality of positions while the condensing point (virtual focal point Fv) of the light I1 reflected by the front surface 21a is moved from the front surface 21a side toward the back surface 21b side, whereby a plurality of 2nd internal images ID2 are acquired as internal images ID. Since this 2nd imaging process is observation from the side of the surface opposite to the incident surface of the light I1 (here, the front surface 21a of the semiconductor substrate 21), it is back-surface reflection observation.
In the next step, the imaging data on the internal images ID acquired by the imaging of step S13 are saved (step S14). As described above, in step S13, the control unit 8 controls the drive unit 7 to move the imaging unit 4 (i.e., the condensing point of the light I1) in the Z direction for imaging. Therefore, the control unit 8 can acquire the amount of movement of the imaging unit 4 at the time each internal image was captured. Here, information on the amount of movement can be associated with each internal image ID and saved as imaging data. The imaging data can be stored in any storage device accessible by the control unit 8, whether inside or outside the control unit 8 or the laser processing apparatus 1.
The amount of movement of the imaging unit 4 (objective lens 43) can be, for example, the amount of movement of the imaging unit 4 in the Z direction from the position where the condensing point of the light I1 is aligned with the back surface 21b of the semiconductor substrate 21 to the position where the condensing point of the light I1 is aligned with a desired position inside the semiconductor substrate 21.
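Steps S13 and S14 can be pictured with the following sketch, which steps a hypothetical Z stage in 0.2 μm increments over the partial range RA and stores each image together with its movement amount. The stage_z and camera objects and their methods are assumptions introduced for illustration, not APIs of the laser processing apparatus 1.

```python
STEP_UM = 0.2  # imaging interval in the Z direction (example value given above)


def acquire_internal_images(stage_z, camera, range_um):
    """Capture internal images over the partial range RA and record, for each
    image, the movement amount of the imaging unit from the position where the
    condensing point of the light I1 lies on the back surface 21b."""
    stack = []
    n_steps = int(range_um / STEP_UM) + 1
    for i in range(n_steps):
        movement_um = i * STEP_UM
        stage_z.move_to(movement_um)        # hypothetical API
        image = camera.grab()               # hypothetical API
        stack.append((movement_um, image))  # imaging data saved in step S14
    return stack
```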
Next, the control unit 8 reads the imaging data from the predetermined storage device (step S15). Then, the control unit 8 determines the formation state of the crack 14k (step S16). Here, the control unit 8, for example, automatically determines by image recognition (AI determination) the internal image ID in which the image of the crack 14k is relatively sharp among the plurality of internal images ID. An example of an algorithm for detecting a crack or a modified region by AI determination will now be described.
Fig. 20 and 21 are diagrams illustrating crack detection. Fig. 20 illustrates the internal observation result (an internal image of the semiconductor substrate 21). The control unit 8 first detects the straight-line group 140 in the internal image of the semiconductor substrate 21 shown in fig. 20 (a). For the detection of the straight-line group 140, an algorithm such as the Hough transform or LSD (Line Segment Detector) can be used. The Hough transform detects straight lines by considering the straight lines passing through each point in the image and giving higher weight to lines that pass through many feature points. LSD estimates regions that are likely to be line segments from the gradient magnitude and angle of the luminance values in the image, and detects straight lines by approximating each such region as a rectangle.
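A minimal sketch of this line-group detection step is shown below, using the probabilistic Hough transform from OpenCV on an 8-bit grayscale internal image. The edge-detection and Hough parameter values are illustrative assumptions, not values specified in this document.

```python
import cv2
import numpy as np

def detect_line_group(internal_image: np.ndarray) -> np.ndarray:
    """Detect the straight-line group 140 in an internal image with the
    probabilistic Hough transform (an LSD detector could be used instead).
    Returns line segments as (x1, y1, x2, y2) rows."""
    edges = cv2.Canny(internal_image, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                            minLineLength=20, maxLineGap=5)
    if lines is None:
        return np.empty((0, 4), dtype=int)
    return lines[:, 0, :]
```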
Next, as shown in fig. 21, the control unit 8 calculates the degree of similarity to a crack line for the straight-line group 140, and detects the crack 14 from the straight-line group 140. As shown in the upper diagram of fig. 21, a crack line is characterized in that the luminance value on the line differs greatly from the luminance values of the pixels immediately before and after it in the Y direction. Therefore, the control unit 8 compares the luminance value of every pixel on the detected straight-line group 140 with the values immediately before and after it in the Y direction, and takes the number of pixels whose difference is equal to or greater than a threshold value as the similarity score. The representative value of the image is then taken as the highest similarity score among the plurality of detected straight lines in the group 140. The higher the representative value, the higher the possibility that the crack 14 is present. The control unit 8 compares the representative values of the plurality of images, and determines an image having a relatively high score as a crack-image candidate.
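The scoring just described can be sketched as follows: for each detected segment, count the pixels whose luminance differs from their Y-direction neighbours by at least a threshold, and take the best segment score as the representative value of the image. The threshold of 20 grey levels and the function names are assumptions for illustration.

```python
import numpy as np

def crack_similarity_score(image: np.ndarray, segment, diff_threshold: int = 20) -> int:
    """Count the pixels on a line segment whose luminance differs from the
    pixels immediately before and after them in the Y direction by at least
    diff_threshold; this count is the similarity score to a crack line."""
    x1, y1, x2, y2 = segment
    n = int(max(abs(x2 - x1), abs(y2 - y1))) + 1
    xs = np.linspace(x1, x2, n).round().astype(int)
    ys = np.linspace(y1, y2, n).round().astype(int)
    score = 0
    for x, y in zip(xs, ys):
        if 0 < y < image.shape[0] - 1 and 0 <= x < image.shape[1]:
            on_line = int(image[y, x])
            if (abs(on_line - int(image[y - 1, x])) >= diff_threshold and
                    abs(on_line - int(image[y + 1, x])) >= diff_threshold):
                score += 1
    return score

def representative_value(image: np.ndarray, segments) -> int:
    """Representative value of one internal image: the highest crack-line
    similarity score among all detected line segments."""
    return max((crack_similarity_score(image, s) for s in segments), default=0)
```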
Fig. 22 to 24 are diagrams for explaining flaw detection. Fig. 22 illustrates the internal observation result (an internal image of the semiconductor substrate 21). The control unit 8 detects corners (edges) in the internal image of the semiconductor substrate 21 shown in fig. 22 (a) as key points, and detects the feature points 250 by determining the position, size, and direction of each key point. Known methods for detecting feature points in this manner include Eigen, Harris, FAST, SIFT, SURF, STAR, MSER, ORB, and AKAZE.
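As an illustrative sketch, the snippet below detects such corner-like key points with ORB, one of the detectors named above; the choice of ORB and the feature-count parameter are assumptions, and any of the other listed detectors could be substituted.

```python
import cv2
import numpy as np

def detect_feature_points(internal_image: np.ndarray):
    """Detect corner-like key points (feature points 250) in an internal
    image; each returned key point carries a position, size and orientation."""
    orb = cv2.ORB_create(nfeatures=500)
    return orb.detect(internal_image, None)
```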
Here, as shown in fig. 23, the flaws 280 are arranged at regular intervals in circular, rectangular, or similar shapes, and therefore exhibit strong corner-like features. The flaw 280 can thus be detected with high accuracy by counting the feature amounts of the feature points 250 in the image. As shown in fig. 24, by comparing the sums of the feature amounts of the images captured while shifting in the depth direction, a peak-shaped change corresponding to the crack displacement amount of each modified layer can be confirmed. The control unit 8 estimates the peak of this change as the position of the flaw 280. By counting the feature amounts in this way, not only the flaw position but also the pulse pitch can be estimated.
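A minimal sketch of this depth-profiling step is given below: a per-image feature amount is summed for each depth position and the peak of the resulting curve is taken as the flaw position. Using the ORB key-point responses as the feature amount is an assumption made only for illustration.

```python
import cv2
import numpy as np

def estimate_flaw_depth(images, positions_um):
    """Sum a per-image feature amount (here the ORB key-point responses) over
    internal images captured while shifting in the depth direction, and take
    the peak of the curve as the estimated position of the flaw 280."""
    orb = cv2.ORB_create(nfeatures=500)
    totals = [sum(kp.response for kp in orb.detect(img, None)) for img in images]
    peak_index = int(np.argmax(totals))
    return positions_um[peak_index], totals
```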
Although the AI determination has been described above for the crack 14 and the flaw 280 extending in the X direction, the same algorithm can be applied to the crack 14k extending in the direction intersecting the Z direction and the X direction: by comparing the representative values of the plurality of internal images ID, an internal image ID having a relatively high score can be determined as the one in which the image of the crack 14k is relatively sharp.
As an example, fig. 19 shows a plurality of internal images ID captured at different positions in the Z direction. In fig. 19, with the imaging position of the internal image IDd shown in (d) taken as the center, (c) is the internal image IDc at an imaging position shifted by 1 μm toward the back surface 21b, (b) is the internal image IDb at an imaging position shifted by 3 μm toward the back surface 21b, (a) is the internal image IDa at an imaging position shifted by 5 μm toward the back surface 21b, (e) is the internal image IDe at an imaging position shifted by 1 μm toward the front surface 21a, (f) is the internal image IDf at an imaging position shifted by 3 μm toward the front surface 21a, and (g) is the internal image IDg at an imaging position shifted by 5 μm toward the front surface 21a. The imaging positions here are values inside the semiconductor substrate 21.
In the example shown in fig. 19, the image of the crack 14k is sharpest in the internal image IDd, and the control unit 8 can determine that the internal image IDd has a relatively high score and that the image of the crack 14k in it is relatively sharp (that is, it can determine that the crack 14k is detected in the internal image IDd). The control unit 8 can acquire the movement amount at the time the internal image IDd was captured, and can therefore calculate the crack position of the crack 14k based on that movement amount.
In this way, the control unit 8 executes the following arithmetic processing: based on the plurality of internal images ID and the movement amount of the imaging unit 4 at the time each internal image ID was captured, the crack position, that is, the position in the Z direction of the target crack 14k extending in the direction intersecting the Z direction and the X direction, is calculated. More specifically, in the arithmetic processing, the control unit 8 determines the internal image ID in which the image of the crack 14k is sharp among the plurality of internal images ID, and calculates the crack position based on the movement amount at the time the determined internal image ID was captured. The crack position can be calculated, for example, by multiplying the movement amount by a predetermined correction coefficient. The correction coefficient can be obtained from, for example, the NA of the objective lens 43 and the refractive index of the semiconductor substrate 21.
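A minimal sketch of this conversion is shown below. Taking the substrate refractive index as the correction coefficient is a simplifying assumption (the value 3.6 for silicon in the near infrared is also an assumption); as noted above, the actual coefficient would be derived from both the NA of the objective lens 43 and the refractive index.

```python
def crack_position_um(movement_um: float, refractive_index: float = 3.6) -> float:
    """Convert the movement amount of the imaging unit 4 into the crack
    position (depth inside the semiconductor substrate 21) by multiplying
    by a correction coefficient; here the refractive index is used as a
    simplified stand-in for that coefficient."""
    return movement_um * refractive_index
```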
The control unit 8 can calculate the crack position of the crack 14k as described above for both the 1st internal images ID1 acquired by direct observation and the 2nd internal images ID2 acquired by back surface reflection observation. That is, the control unit 8 can calculate the crack position of the crack 14k located on the back surface 21b side from the 1st internal images ID1, and the crack position of the crack 14k located on the front surface 21a side from the 2nd internal images ID2.
That is, in this case, the control unit 8 executes a 1st arithmetic processing and a 2nd arithmetic processing. In the 1st arithmetic processing, the 1st internal image in which the crack 14k is sharp is determined among the plurality of 1st internal images ID1, and the 1st crack position Z1 is calculated as the crack position based on the movement amount of the imaging unit 4 at the time the determined 1st internal image was captured. In the 2nd arithmetic processing, the 2nd internal image in which the crack 14k is sharp is determined among the plurality of 2nd internal images ID2, and the 2nd crack position Z2 is calculated as the crack position based on the movement amount of the imaging unit 4 at the time the determined 2nd internal image was captured (see fig. 15 for an example of the 1st crack position Z1 and the 2nd crack position Z2). The distance between the 1st crack position Z1 on the back surface 21b side and the 2nd crack position Z2 on the front surface 21a side defines the width of the portion of the modified region 12 where the crack 14k is formed (the crack initiation portion).
Next, in step S16, the control unit 8 estimates the position of the modified region 12 and the like based on the acquired crack positions and the like. That is, here, the control unit 8 executes the following estimation processing: based on the formation conditions of the modified region 12 (here, the processing conditions of the laser processing) and the crack positions, at least one of the position in the Z direction of the end of the modified region 12 on the back surface 21b side (the upper end of the defect upper region), the position in the Z direction of the end of the modified region 12 on the front surface 21a side (the lower end of the defect region), and the width of the modified region 12 in the Z direction (the interval between the upper end of the defect upper region and the lower end of the defect region) is estimated.
Here, the control unit 8 calculates the 1st crack position Z1 of the crack 14k on the back surface 21b side (the upper crack) by direct observation, and calculates the 2nd crack position Z2 of the crack 14k on the front surface 21a side (the lower crack) by back surface reflection observation. The control unit 8 can therefore calculate the width of the crack initiation portion in the semiconductor substrate 21 as the distance between the 1st crack position Z1 of the upper crack and the 2nd crack position Z2 of the lower crack.
The control unit 8 can then calculate the width of the modified region 12 in the Z direction in the semiconductor substrate 21 by, for example, multiplying the calculated width of the crack initiation portion by a coefficient relating to the processing conditions of the laser processing. This coefficient is determined based on the various conditions that affect the formation of the modified region 12, such as the wavelength, aberration correction amount, pulse width, and pulse energy of the laser light L during laser processing. As an example, the coefficient is around 3.0.
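This width estimate can be sketched as follows; the default coefficient of 3.0 is the example value given above, and the function name is an assumption for illustration.

```python
def modified_region_width_um(z1_um: float, z2_um: float,
                             process_coefficient: float = 3.0) -> float:
    """Estimate the Z-direction width of the modified region 12 as the crack
    initiation portion width (the interval between the 1st crack position Z1
    and the 2nd crack position Z2) multiplied by a coefficient tied to the
    laser processing conditions."""
    return abs(z1_um - z2_um) * process_coefficient
```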
In this way, in the estimation processing, the control unit 8 can estimate the width of the modified region 12 in the Z direction based on the formation conditions of the modified region 12 (the processing conditions of the laser processing) and the interval between the 1st crack position Z1 and the 2nd crack position Z2.
Alternatively, the control unit 8 can calculate the position of the lower end of the modified region 12 on the front surface 21a side by subtracting an assumed modified region width, that is, an assumed width of the entire modified region 12, from the 1st crack position Z1 of the upper crack. The assumed modified region width can be determined based on the various conditions that affect the formation of the modified region 12, such as the wavelength, aberration correction amount, pulse width, and pulse energy of the laser light L during laser processing. The assumed modified region width is, for example, about 20 μm.
The control unit 8 can also calculate the position of the lower end of the modified region 12 on the front surface 21a side by subtracting an assumed defect region width, that is, an assumed width of the defect region 12m, from the 2nd crack position Z2 of the lower crack. The assumed defect region width can be determined based on the various conditions that affect the formation of the modified region 12, such as the wavelength, aberration correction amount, pulse width, and pulse energy of the laser light L during laser processing. The assumed defect region width is, for example, about 10 μm.
Further, the control unit 8 can calculate the position of the upper end of the modified region 12 on the back surface 21b side by adding an assumed defect upper region width, that is, an assumed width of the defect upper region 12n, to the 2nd crack position Z2 of the lower crack. The assumed defect upper region width can be determined based on the various conditions that affect the formation of the modified region 12, such as the wavelength, aberration correction amount, pulse width, and pulse energy of the laser light L during laser processing. The assumed defect upper region width is, for example, about 10 μm.
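The end-position estimates described above can be summarized in the minimal sketch below, following the relations stated in the text; the 20 μm and 10 μm defaults are its example values, and the sign convention (larger Z toward the back surface 21b) is an assumption made here for illustration.

```python
def estimate_region_ends_um(z1_um: float, z2_um: float,
                            assumed_modified_width_um: float = 20.0,
                            assumed_defect_width_um: float = 10.0,
                            assumed_defect_upper_width_um: float = 10.0):
    """Estimate end positions of the modified region 12 from the crack
    positions Z1 (upper crack) and Z2 (lower crack) and the assumed widths."""
    lower_end_from_upper_crack = z1_um - assumed_modified_width_um  # front surface 21a side end
    lower_end_from_lower_crack = z2_um - assumed_defect_width_um    # front surface 21a side end
    upper_end = z2_um + assumed_defect_upper_width_um               # back surface 21b side end
    return lower_end_from_upper_crack, lower_end_from_lower_crack, upper_end
```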
As described above, in step S16 the control unit 8 estimates and acquires various pieces of information about the position of the modified region 12. In the next steps, the control unit 8 outputs the information of the determination result of step S16 to an arbitrary storage device (step S17) and stores the information in that storage device (step S18). Thereafter, if necessary, the various pieces of information are displayed on the display 150 in a state where user input can be accepted (step S19), and the process is completed. Examples of the information displayed on the display 150 include the 1st crack position Z1, the 2nd crack position Z2, the width of the crack initiation portion, the positions of the ends of the modified region 12, and the width of the modified region 12 in the Z direction. In this way, in step S19, the control unit 8 controls the display 150 to execute a display process of displaying information on the crack positions on the display 150.
The observation method using the laser processing apparatus 1 is thereby completed. In the present embodiment, the observation method is performed by the imaging unit 4, the driving unit 7, and the control unit 8 of the laser processing apparatus 1. In other words, in the laser processing apparatus 1, the observation apparatus 1A is constituted by the imaging unit 4, which images the semiconductor substrate 21 with the light I1 that is transmissive to the semiconductor substrate 21, the driving unit 7, which moves the imaging unit 4 relative to the semiconductor substrate 21, and the control unit 8, which controls at least the imaging unit 4 and the driving unit 7 (see fig. 1).
As described above, in the semiconductor substrate 21 to be observed by the observation method according to the present embodiment and by the observation apparatus 1A that performs the observation method, the modified regions 12 aligned in the X direction and the cracks 14 and 14k extending from the modified regions 12 are formed. With such a semiconductor substrate 21, the crack 14k extending in the direction intersecting the Z direction and the X direction can be imaged using the light I1 transmitted through the semiconductor substrate 21. The crack 14k intersecting the Z direction and the X direction can be imaged (detected) more accurately in the Z direction than the modified region 12 itself. Therefore, for example, if information such as the movement amount of the imaging unit 4 at the time the crack 14k was imaged is acquired, information on the position of the modified region 12 can be acquired more accurately based on that movement amount.
The observation apparatus 1A of the present embodiment includes the driving unit 7 that moves the converging point of the light I1 relative to the semiconductor substrate 21. In the imaging processing, the control unit 8 controls the imaging unit 4 and the driving unit 7 to move the imaging unit 4 in the Z direction, so that the semiconductor substrate 21 is imaged with the converging point of the light I1 located at a plurality of positions inside the semiconductor substrate 21, whereby a plurality of internal images ID are acquired. After the imaging processing, the control unit 8 executes the following arithmetic processing: based on the plurality of internal images ID and the movement amount of the imaging unit 4 in the Z direction at the time each internal image ID was captured, the crack positions (the 1st crack position Z1 and the 2nd crack position Z2), that is, the positions of the crack 14k in the Z direction, are calculated. In this way, information on the position of the modified region 12 can be acquired more accurately based on the movement amount of the imaging unit 4 at the time the crack 14k was imaged.
In the observation apparatus 1A according to the present embodiment, in the arithmetic processing, the control unit 8 determines the internal image in which the image of the crack 14k is sharp among the plurality of internal images ID, and calculates the crack position of the crack 14k based on the movement amount of the imaging unit 4 at the time the determined internal image was captured. By determining an internal image in which the crack 14k appears sharply, the control unit 8 can calculate the position of the crack 14k more accurately.
In the observation apparatus 1A of the present embodiment, the control unit 8 executes the following estimation processing after the arithmetic processing: at least one of the position in the Z direction of the end of the modified region 12 on the back surface 21b side, the position in the Z direction of the end of the modified region 12 on the front surface 21a side, and the width of the modified region 12 in the Z direction is estimated based on the formation conditions of the modified region 12 and the crack position of the crack 14k. The shape and size of the modified region 12 may vary depending on its formation conditions, such as the processing conditions of the laser processing (for example, the wavelength, pulse width, pulse energy, and aberration correction amount of the laser light). Therefore, by using the formation conditions of the modified region 12, such as the processing conditions of the laser processing, together with the position of the crack 14k, information on the position of the modified region 12 can be acquired more accurately.
In the observation apparatus 1A of this embodiment, moreover, the control unit 8 executes a 1st imaging process and a 2nd imaging process within the imaging processing. In the 1st imaging process, the light I1 is made incident on the semiconductor substrate 21 from the back surface 21b and the imaging unit 4 is moved, whereby the semiconductor substrate 21 is imaged at a plurality of positions while the converging point of the light I1 that has not been reflected by the front surface 21a is moved from the back surface 21b side toward the front surface 21a side, and a plurality of 1st internal images ID1 are acquired as the internal images ID. In the 2nd imaging process, the light I1 is made incident on the semiconductor substrate 21 from the back surface 21b and the imaging unit 4 is moved, whereby the semiconductor substrate 21 is imaged at a plurality of positions while the converging point of the light I1 reflected by the front surface 21a is moved from the front surface 21a side toward the back surface 21b side, and a plurality of 2nd internal images ID2 are acquired as the internal images ID.
In this way, when the internal images are acquired both by imaging the semiconductor substrate 21 with the light I1 that is incident from the back surface 21b but is not reflected by the front surface 21a (direct observation) and by imaging it with the light I1 that is incident from the back surface 21b and is reflected by the front surface 21a (back surface reflection observation), information on the position of the modified region 12 can be acquired more accurately using the crack positions obtained from the movement amounts of the imaging unit 4 at the time those internal images were captured.
In the observation apparatus 1A of the present embodiment, in the arithmetic processing, the control unit 8 executes a 1st arithmetic processing and a 2nd arithmetic processing. In the 1st arithmetic processing, the 1st internal image in which the crack 14k is sharp is determined among the plurality of 1st internal images ID1, and the 1st crack position Z1 is calculated as the crack position based on the movement amount of the imaging unit 4 at the time the determined 1st internal image was captured. In the 2nd arithmetic processing, the 2nd internal image in which the crack 14k is sharp is determined among the plurality of 2nd internal images ID2, and the 2nd crack position Z2 is calculated as the crack position based on the movement amount of the imaging unit 4 at the time the determined 2nd internal image was captured. In the estimation processing, the control unit 8 can estimate the width of the modified region 12 in the Z direction based on the formation conditions of the modified region 12 and the interval between the 1st crack position Z1 and the 2nd crack position Z2. As described above, in this case, information on the width of the modified region 12 can be acquired more accurately based on the interval between the 1st crack position Z1 acquired by direct observation and the 2nd crack position Z2 acquired by back surface reflection observation.
Further, the observation apparatus 1A of the present embodiment includes the display 150 for displaying information. After the arithmetic processing, the control unit 8 may control the display 150 to execute a display process of displaying information on the crack position on the display 150. In this case, the user can grasp the information on the crack position through the display 150. The information on the crack position is at least one of the crack position itself and the various pieces of information on the position of the modified region 12 that can be estimated based on the crack position.
The embodiment described above is one embodiment of the present invention. Therefore, the present invention is not limited to the above embodiment and can be modified as desired.
For example, in the above-described embodiment, the driving unit 7 that moves the objective lens 43 together with the imaging unit 4 is exemplified as a means for moving the objective lens 43 relative to the semiconductor substrate 21 in the Z direction. However, for example, only the objective lens 43 may be moved in the Z direction by an actuator.
In the above embodiment, an example in which the control unit 8 automatically determines the image in step S16 has been described, but the control unit 8 may instead acquire the crack position of the crack 14k based on a determination result made by the user. In this case, the control unit 8 causes the display 150 to display the plurality of internal images ID, for example, and also causes the display 150 to display information prompting the user to determine (select) one internal image in which the image of the crack 14k is sharp from among the plurality of internal images ID. The control unit 8 can receive the input of the determination result through the display 150, and calculate the crack position of the crack 14k based on the movement amount associated with the internal image ID corresponding to the determination result. In this case, the display 150 is a display unit for displaying information and is also an input accepting unit that accepts input. This reduces the processing load on the control unit 8 for image recognition and the like.
Further, in the above embodiment, both direct observation and back surface reflection observation are performed in step S13 when observing one modified region 12, and the 1st internal images ID1 and the 2nd internal images ID2 are acquired as the internal images ID. However, in step S13, only one of direct observation and back surface reflection observation may be performed. In this case, only one of the 1st internal images ID1 and the 2nd internal images ID2 is obtained, and the position and width of the end of the modified region 12 can be estimated based on that one.

Claims (8)

1. An observation device, comprising:
an imaging unit for imaging an object by using transmitted light that is transmissive to the object; and
a control unit for controlling at least the imaging unit, wherein
the object has a 1st surface and a 2nd surface opposite to the 1st surface,
a modified region aligned in an X direction along the 1st surface and the 2nd surface and a crack extending from the modified region are formed in the object, and
the control unit controls the imaging unit to execute the following imaging processing: the transmitted light is made incident on the object from the 1st surface, and an image of a target crack is captured with the transmitted light, the target crack being a crack among the cracks that extends in a direction intersecting the X direction and a Z direction intersecting the 1st surface and the 2nd surface.
2. The observation device according to claim 1, wherein:
the observation device further comprises a moving unit for moving, relative to the object, a condenser lens that condenses the transmitted light on the object,
in the imaging processing, the control unit controls the imaging unit and the moving unit to relatively move the condenser lens in the Z direction so that the object is imaged with the converging point of the transmitted light located at a plurality of positions inside the object, whereby a plurality of internal images are acquired, and
the control unit executes the following arithmetic processing after the imaging processing: calculating a crack position, which is a position of the target crack in the Z direction, based on the plurality of internal images and a movement amount of the condenser lens in the Z direction when each of the internal images is captured.
3. The observation device according to claim 2, wherein:
in the arithmetic processing, the control unit determines the internal image in which the image of the target crack is sharp among the plurality of internal images, and calculates the crack position based on the movement amount of the condenser lens when the determined internal image is captured.
4. The observation device according to claim 3, wherein:
the control unit executes the following estimation processing after the arithmetic processing: estimating at least one of a position in the Z direction of the end of the modified region on the 1st surface side, a position in the Z direction of the end of the modified region on the 2nd surface side, and a width of the modified region in the Z direction, based on the formation condition of the modified region and the crack position.
5. The observation device according to any one of claims 2 to 4, wherein:
in the imaging processing, the control unit executes a 1st imaging process and a 2nd imaging process, wherein
in the 1st imaging process, the transmitted light is made incident on the object from the 1st surface and the condenser lens is relatively moved in the Z direction, whereby the object is imaged at a plurality of positions while the converging point of the transmitted light that has not been reflected by the 2nd surface is moved from the 1st surface side toward the 2nd surface side, and a plurality of 1st internal images are acquired as the internal images, and
in the 2nd imaging process, the transmitted light is made incident on the object from the 1st surface and the condenser lens is relatively moved in the Z direction, whereby the object is imaged at a plurality of positions while the converging point of the transmitted light reflected by the 2nd surface is moved from the 2nd surface side toward the 1st surface side, and a plurality of 2nd internal images are acquired as the internal images.
6. The observation device according to claim 5, wherein:
in the arithmetic processing, the control unit executes a 1st arithmetic processing and a 2nd arithmetic processing, wherein
in the 1st arithmetic processing, the 1st internal image in which the target crack is sharp is determined from among the plurality of 1st internal images, and a 1st crack position is calculated as the crack position based on the movement amount of the condenser lens when the determined 1st internal image is captured,
in the 2nd arithmetic processing, the 2nd internal image in which the target crack is sharp is determined from among the plurality of 2nd internal images, and a 2nd crack position is calculated as the crack position based on the movement amount of the condenser lens when the determined 2nd internal image is captured, and
the control unit estimates the width of the modified region in the Z direction based on the formation condition of the modified region and the interval between the 1st crack position and the 2nd crack position.
7. The observation device according to any one of claims 2 to 6, wherein:
the observation device further comprises a display unit for displaying information, and
the control unit controls the display unit after the arithmetic processing to execute a display process of displaying information on the crack position on the display unit.
8. An observation method, comprising:
a preparation step of preparing an object having a 1st surface and a 2nd surface opposite to the 1st surface, in which a modified region aligned in an X direction along the 1st surface and the 2nd surface and a crack extending from the modified region are formed; and
an imaging step of, after the preparation step, making transmitted light that is transmissive to the object incident on the object from the 1st surface, and imaging, with the transmitted light, a target crack, which is one of the cracks and extends in a direction intersecting the X direction and a Z direction intersecting the 1st surface and the 2nd surface.
CN202210098627.XA 2021-01-29 2022-01-27 Observation device and observation method Pending CN114905169A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021013542A JP2022117060A (en) 2021-01-29 2021-01-29 Observation device and method for observation
JP2021-013542 2021-01-29

Publications (1)

Publication Number Publication Date
CN114905169A true CN114905169A (en) 2022-08-16

Family

ID=82749949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210098627.XA Pending CN114905169A (en) 2021-01-29 2022-01-27 Observation device and observation method

Country Status (4)

Country Link
JP (1) JP2022117060A (en)
KR (1) KR20220110083A (en)
CN (1) CN114905169A (en)
TW (1) TW202235194A (en)

Also Published As

Publication number Publication date
KR20220110083A (en) 2022-08-05
JP2022117060A (en) 2022-08-10
TW202235194A (en) 2022-09-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination