WO2020102612A1 - Multi-sensor tiled camera with flexible electronics for wafer inspection - Google Patents


Info

Publication number
WO2020102612A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
disposed
flex board
camera system
folded flex
Application number
PCT/US2019/061579
Other languages
French (fr)
Inventor
Pablo POMBO
Kurt Lehman
Original Assignee
Kla Corporation
Priority claimed from US 16/379,900 (US10724964B1)
Application filed by Kla Corporation
Priority to CN201980073213.8A (CN113039633B)
Priority to KR1020217017725A (KR102518210B1)
Publication of WO2020102612A1
Priority to IL282680A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9501Semiconductor wafers
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L22/00Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
    • H01L22/30Structural arrangements specially adapted for testing or measuring during manufacture or treatment, or specially adapted for reliability measurements
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L22/00Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
    • H01L22/10Measuring as part of the manufacturing process
    • H01L22/12Measuring as part of the manufacturing process for structural parameters, e.g. thickness, line width, refractive index, temperature, warp, bond strength, defects, optical inspection, electrical measurement of structural dimensions, metallurgic measurement of diffusions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere

Definitions

  • CMP: chemical-mechanical polishing
  • TDI: time delay integration
  • LGA: land grid array
  • BBP: broadband plasma
  • DF: dark field
  • BF: bright field
  • PMT: photo-multiplier tube
  • CCD: charge coupled device
  • SEM: scanning electron microscope
  • MFC: Microsoft Foundation Classes
  • SSE: Streaming SIMD Extension

Abstract

Sensor units can be disposed in a support member. Each of the sensor units can include a folded flex board having a plurality of laminations and an aperture, and a sensor disposed in the folded flex board such that the sensor is positioned over the aperture. The system can be used in broadband plasma inspection tools for semiconductor wafers.

Description

MULTI-SENSOR TILED CAMERA WITH FLEXIBLE ELECTRONICS FOR WAFER
INSPECTION
FIELD OF THE DISCLOSURE
[0001] This disclosure relates to wafer inspection systems.

BACKGROUND OF THE DISCLOSURE
[0002] Evolution of the semiconductor manufacturing industry is placing greater demands on yield management and, in particular, on metrology and inspection systems. Critical dimensions continue to shrink, yet the industry needs to decrease the time for achieving high-yield, high-value production. Minimizing the total time from detecting a yield problem to fixing it determines the return-on-investment for a semiconductor manufacturer.
[0003] Fabricating semiconductor devices, such as logic and memory devices, typically includes processing a semiconductor wafer using a large number of fabrication processes to form various features and multiple levels of the semiconductor devices. For example, lithography is a semiconductor fabrication process that involves transferring a pattern from a reticle to a photoresist arranged on a semiconductor wafer. Additional examples of semiconductor fabrication processes include, but are not limited to, chemical-mechanical polishing (CMP), etch, deposition, and ion implantation. Multiple semiconductor devices may be fabricated in an arrangement on a single semiconductor wafer and then separated into individual semiconductor devices.
[0004] Inspection processes are used at various steps during semiconductor manufacturing to detect defects on wafers to promote higher yield in the manufacturing process and, thus, higher profits. Inspection has always been an important part of fabricating semiconductor devices such as integrated circuits (ICs). However, as the dimensions of semiconductor devices decrease, inspection becomes even more important to the successful manufacture of acceptable semiconductor devices because smaller defects can cause the devices to fail. For instance, as the dimensions of semiconductor devices decrease, detection of defects of decreasing size has become necessary because even relatively small defects may cause unwanted aberrations in the semiconductor devices.

[0005] As design rules shrink, however, semiconductor manufacturing processes may be operating closer to the limitation on the performance capability of the processes. In addition, smaller defects can have an impact on the electrical parameters of the device as the design rules shrink, which drives more sensitive inspections. As design rules shrink, the population of potentially yield-relevant defects detected by inspection grows dramatically, and the population of nuisance defects detected by inspection also increases dramatically. Therefore, more defects may be detected on the wafers, and correcting the processes to eliminate all of the defects may be difficult and expensive. Determining which of the defects actually have an effect on the electrical parameters of the devices and the yield may allow process control methods to be focused on those defects while largely ignoring others. Furthermore, at smaller design rules, process-induced failures, in some cases, tend to be systematic. That is, process-induced failures tend to occur at predetermined design patterns often repeated many times within the design. Elimination of spatially-systematic, electrically-relevant defects can have an impact on yield.
[0006] Each new generation of wafer inspection tools brings higher sensitivity and higher throughput. Increasing the throughput requires faster cameras, but also brighter light sources that can place more photons on a wafer in less time. Even if brighter light sources can be produced, such brighter light sources can have reliability or cost drawbacks. The cameras can help with this by increasing the number of time delay integration (TDI) stages (e.g., pixels used to integrate light for a given image) in a sensor. When the maximum number of TDI stages is reached, multiple sensors can be tiled to better utilize the available field of view.
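The effect of adding TDI stages can be illustrated with a short numerical sketch. The following Python snippet is not part of the disclosure; the stage count, photon flux, and read noise are arbitrary assumptions chosen only to show why more stages reduce the brightness required of the light source.

```python
# Illustrative sketch of time delay integration (TDI). Charge from the same
# object line is accumulated across N stages as the image moves in sync with
# the readout, so the signal grows ~N while read noise is added only once.
# The numbers below are arbitrary assumptions, not values from the disclosure.
import math

def snr(photons_per_stage: float, stages: int, read_noise: float) -> float:
    signal = photons_per_stage * stages  # charge accumulated over all stages
    shot_noise_sq = signal               # Poisson variance equals the mean
    return signal / math.sqrt(shot_noise_sq + read_noise ** 2)

print(f"1 stage:    SNR = {snr(50.0, 1, 10.0):.1f}")    # ~4.1
print(f"128 stages: SNR = {snr(50.0, 128, 10.0):.1f}")  # ~79.4
# More stages integrate more light per line, so a dimmer (cheaper, more
# reliable) source can reach the same SNR at the same line rate.
```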
[0007] Previously, a single carrier electronics device was used to support multiple sensors. The common carrier approach does not allow for flexible integration of each sensor. Furthermore, alignment is done without live feedback because the camera cannot operate during alignment, which makes operation challenging. There also is a size limit as to how many sensors can be supported using a common carrier.
[0008] A detached sensor solution providing pig-tails or flexible media also was previously used. This cabling solution allows independent mechanical manipulation of each sensor, but does not provide a high-speed solution that can support high-throughput camera requirements (e.g., more than 3 million lines per second). The number of connectors that need to be used can result in a larger-volume implementation and reliability problems with the connectors.
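A back-of-envelope calculation shows why such line rates strain cabled solutions. The line rate below comes from the requirement just quoted and the 1536-pixel line width from the sensor described later in this disclosure; the 12-bit sample depth is an assumption for illustration.

```python
# Rough per-sensor data rate at the throughput quoted above. Bit depth is an
# assumed value; only the line rate and pixel count appear in this disclosure.
lines_per_second = 3_000_000   # "more than 3 million lines per second"
pixels_per_line = 1536         # sensor line width discussed later
bits_per_pixel = 12            # assumption for illustration

raw_gbps = lines_per_second * pixels_per_line * bits_per_pixel / 1e9
print(f"raw pixel rate: {raw_gbps:.1f} Gbps per sensor")  # about 55.3 Gbps
# Several multi-gigabit serial lanes per tile are needed, which is difficult
# to carry over pigtail cabling and discrete connectors but feasible in a
# rigid-flex substrate.
```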
[0009] Therefore, an improved camera for inspection of semiconductor wafers is needed.
BRIEF SUMMARY OF THE DISCLOSURE

[0010] A camera system is provided in a first embodiment. The camera system comprises a support member and a plurality of sensor units disposed in the support member. Each of the sensor units includes a folded flex board having a plurality of laminations, a sensor disposed in the folded flex board, a high-density digitizer disposed in the folded flex board, and a field programmable gate array disposed in the folded flex board. The folded flex board defines an aperture. The sensor is disposed in the folded flex board such that the sensor is positioned over the aperture.
[0011] The system can include, for example, three of the sensor units or six of the sensor units.
[0012] The sensor may be a time delay integration sensor.
[0013] The sensor can image 1536 pixels.

[0014] Two of the sensor units can be spaced apart by a distance from 5 mm to 7 mm.
[0015] A land grid array for the sensor can have a pitch of 1 mm.
[0016] The camera system can further include a processor in electronic communication with the sensor units. The processor can be configured to stitch images from the sensors.
[0017] A broadband plasma inspection tool can include the camera system of the first embodiment.
[0018] A method is provided in a second embodiment. The method comprises imaging a wafer with a plurality of sensor units disposed in a support member. Each of the sensor units includes a folded flex board having a plurality of laminations, a sensor disposed in the folded flex board, a high-density digitizer disposed in the folded flex board, and a field programmable gate array disposed in the folded flex board. The folded flex board defines an aperture. The sensor is disposed in the folded flex board such that the sensor is positioned over the aperture.
[0019] The method can further include stitching images from the sensors using a processor in electronic communication with the sensor units.

[0020] For example, three or six of the sensor units can be disposed in the support member.
DESCRIPTION OF THE DRAWINGS
[0021] For a fuller understanding of the nature and objects of the disclosure, reference should be made to the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates a perspective view of an embodiment of a camera system in accordance with the present disclosure;
FIG. 2 illustrates an embodiment of a flex board in accordance with the present disclosure;
FIG. 3 illustrates an embodiment of the flex board of FIG. 2 when folded;
FIG. 4 is an optical representation of a field of view;
FIG. 5 is a block diagram of a system that includes the camera system of FIG. 1; and
FIG. 6 is a flowchart of an embodiment of a method in accordance with the present disclosure.
DETAILED DESCRIPTION OF THE DISCLOSURE
[0022] Although claimed subject matter will be described in terms of certain embodiments, other embodiments, including embodiments that do not provide all of the benefits and features set forth herein, are also within the scope of this disclosure. Various structural, logical, process step, and electronic changes may be made without departing from the scope of the disclosure.
Accordingly, the scope of the disclosure is defined only by reference to the appended claims.
[0023] Embodiments disclosed herein are directed to a camera architecture that supports multi-sensor integration in a single camera system with flexible electronics to allow optimal field of view utilization. This can maximize use of the light produced in the tool. Many degrees of freedom are required for each sensor to support mechanical integration and individual alignment for optimal optical performance. Flexible electronics can provide mechanical independence for the individual sensors in tiled cameras while providing a high-quality substrate for high-speed electronics, chips, and interconnects. In the modular approach disclosed herein, every tile is considered a standalone camera. All electronic circuits required to operate each individual sensor are included in a tile block. The architecture decouples the opto-mechanical aspects of the design from the electronics required to operate the sensor. This allows full camera operation during the individual alignment process.
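The "every tile is a standalone camera" arrangement can be summarized in software terms. The following Python sketch is illustrative only and is not code from the disclosure; all class and method names are assumptions.

```python
# Minimal sketch of the modular tile concept: each tile bundles a sensor and
# the electronics needed to operate it, so tiles can be aligned and operated
# independently. All class and method names are illustrative assumptions.
from dataclasses import dataclass, field
import random

@dataclass
class SensorTile:
    tile_id: int
    pixels: int = 1536  # line width discussed in this disclosure

    def acquire_line(self) -> list[float]:
        # Stand-in for TDI readout, on-tile digitization, and calibration.
        return [random.random() for _ in range(self.pixels)]

@dataclass
class TiledCamera:
    # The support member holds standalone tiles; nothing is shared between
    # them except mechanical mounting, so any tile can stream live images
    # while another tile is being mechanically aligned.
    tiles: list[SensorTile] = field(
        default_factory=lambda: [SensorTile(i) for i in range(3)]
    )

    def acquire(self) -> list[list[float]]:
        return [t.acquire_line() for t in self.tiles]

camera = TiledCamera()
print(len(camera.acquire()))  # 3 independent line images, one per tile
```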
[0024] FIG. 1 illustrates a perspective view of an embodiment of a camera system 100. FIG. 2 illustrates an embodiment of a flex board 103. FIG. 3 illustrates an embodiment of the flex board 103 of FIG. 2 when folded. As seen in FIG. 1, the camera system 100 includes a plurality of sensor units 101 in a support member 102. Each sensor unit 101 includes a folded flex board 103 with laminations 104, such as that illustrated in FIG. 2. The laminations 104 may extend to or into rigid sections of the folded flex board 103. The laminations 104 may be internal layers of the flex board 103 that are laminated with rigid cores from the rigid sections of the flex board 103. For example, the laminations 104 can fold along the dotted line in FIG. 2 or elsewhere on the laminations 104. The flex board 103 also defines an aperture 105.
[0025] As seen in FIG. 1, a sensor 106 is disposed in the folded flex board 103 such that the sensor 106 is positioned over the aperture 105. The sensor 106 can be held to the aperture 105 using a mechanical interposer between the sensor package and rigid sections of the folded flex board 103 (e.g., land grid array (LGA) interfaces). In an example, the mechanical interposer can include approximately 2000 spring-based contacts to bridge across both LGA patterns. The assembly may be held in compression by screws from a front plate and a backing or pressure plate on an opposite side.
[0026] The sensor 106 can be a TDI sensor or another type of sensor. An LGA for the sensor 106 can have a pitch of 1 mm or other values.
[0027] In an instance, the sensor 106 can image 1536 pixels. The sensor 106 can be configured to image other ranges of pixels, and this is merely one example.

[0028] The sensor units 101 each also can include a high-density digitizer 107 and a field programmable gate array 108 disposed in the folded flex board 103. In an instance, the high-density digitizer 107 is part of the sensor 106. Thus, all the electronics to operate the sensor 106 can be included in the folded flex board 103. All sensor control (e.g., timing, gate driving) and processing (e.g., image calibration) electronics may be contained in or around the aperture 105 so the operation of the individual sensor 106 is contained in a single sensor unit 101.
[0029] Rigid flex technology with a high layer count can be used for the folded flex board 103, which can enable the compact form. In an instance, the flex cores of the folded flex board 103 are made of PYRALUX AP from DuPont, which is a flexible circuit material that includes an all-polyimide, double-sided, copper-clad laminate, and the rigid cores are standard 370HR board material. Other materials for the flex cores or rigid cores are possible. All cores can be laminated together, and the parts that only contain flex cores may be flexible. In an instance, the flex cores are not glued together, in what is commonly known as an “open book” design, to allow for a small bending radius and better flexibility. The number of layers, the number of vias, and the size of the vias can increase the complexity of fabrication for the folded flex board 103.
[0030] As seen in FIG. 1, three sensor units 101 can be included in the camera system 100. In another instance, the camera system 100 can include six of the sensor units 101. Other numbers of sensor units 101 are possible. For example, four or five sensor units 101 can be used.
[0031] While illustrated as adjacent, the sensor units 101 can be spaced apart by a distance from 5 mm to 7 mm.
[0032] FIG. 4 is an optical representation of a field of view. Previously, two cameras were used. This is shown by the two vertical rectangles 109. The spacing of sensor units was constrained by the second camera (i.e., the second vertical rectangle 109) because all parts of the field of view in FIG. 4 need to be imaged. Now, six sensors 106 can be used. For example, two camera systems with three sensors 106 each can be used, but other numbers of sensors 106 are possible if there is no scanning gap. The sensors 106 disclosed herein have an overlap (e.g., an image is detected by two or more sensors) to stitch the images together. This provides better light utilization because, with six sensors 106, more than twice the amount of light is imaged in the outer circle of FIG. 4 during the swath.

[0033] Tiles (e.g., sensors) can be laid out precisely to optimize image performance. This layout can create challenging electromechanical requirements for the tile (e.g., individual sensor) package design and the tile carrier board (e.g., flex board). This design minimizes the light wasted (e.g., not collected) and has no gaps in the swathing of the TDI sensor. There may be overlap between tiles to put the image back together.
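The role of the tile overlap can be illustrated with a small stitching sketch. The disclosure states only that adjacent sensors overlap so that images can be stitched together; the 32-pixel overlap width, correlation-based registration, and linear blending below are illustrative assumptions.

```python
# Sketch of stitching two overlapping tile images: use the shared overlap
# region to estimate the residual offset between tiles, then blend across
# the seam. Overlap width and registration method are assumptions.
import numpy as np

def estimate_offset(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    # 1-D cross-correlation over candidate shifts along the scan direction.
    best_shift, best_score = 0, -np.inf
    strip_l = left[:, -32:]  # right edge of the left tile
    for s in range(-max_shift, max_shift + 1):
        strip_r = np.roll(right[:, :32], s, axis=0)  # left edge of right tile
        score = np.sum(strip_l * strip_r)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

def stitch(left: np.ndarray, right: np.ndarray, overlap: int = 32) -> np.ndarray:
    shift = estimate_offset(left, right)
    right = np.roll(right, -shift, axis=0)
    # Linear blend across the overlap to hide the seam.
    w = np.linspace(1.0, 0.0, overlap)
    seam = left[:, -overlap:] * w + right[:, :overlap] * (1.0 - w)
    return np.hstack([left[:, :-overlap], seam, right[:, overlap:]])

a = np.random.rand(100, 128)
b = np.hstack([a[:, -32:], np.random.rand(100, 96)])  # shares a 32-px overlap
print(stitch(a, b).shape)  # (100, 224)
```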
[0034] There may only be approximately 6 mm of space between sensor packages. Smaller sensors, such as 1024-pixel sensors, may be used. For example, a 512-pixel sensor may be used and may allow more sensors to be included. These smaller sensors may be configured to have a package or routing space overhead to fit into the camera system 100.

[0035] A larger sensor may be challenging to implement because the larger the sensor, the larger the space between sensor packages. Sensor yield also decreases as the sensor increases in size, which impacts utilization of the wafer used to produce the sensors.
[0036] The modular implementation disclosed herein allows individual mechanical manipulation while using the camera to get live images. A sensor can be connected to a motherboard by a single connector in a flex circuit. This allows image collection during tile alignment. The embodiments disclosed herein can provide high reliability because multiple connectors are avoided when using a rigid-flex circuit board approach. Multiple connectors generally provide a point of failure.
[0037] Embodiments disclosed herein also provide a high-quality substrate in the form of the flex board 103 to sustain high-speed operation. Connector or signal breaks in the flex board 103 between the various rigid sections may be reduced or eliminated.
[0038] 10 Gbps links can operate in this substrate. The tile carrier board stack-up can combine flex cores with laser-drilled micro-vias that allow high-quality designs for 10 Gbps links. Micro-vias can enable 10G routing without via stubs.

[0039] By eliminating connectors, the embodiments disclosed herein are compact. When a cable connector solution is used, there can be a problem reducing the number of pins (e.g., more than 2000 connections per sensor) without the connectors occupying extensive space.

[0040] FIG. 5 is a block diagram of a system that includes the camera system of FIG. 1. The system 200 includes optical based subsystem 201. In general, the optical based subsystem 201 is configured for generating optical based output for a specimen 202 by directing light to (or scanning light over) and detecting light from the specimen 202. In one embodiment, the specimen 202 includes a wafer. The wafer may include any wafer known in the art. In another embodiment, the specimen includes a reticle. The reticle may include any reticle known in the art.
[0041] In the embodiment of the system 200 shown in FIG. 5, optical based subsystem 201 includes an illumination subsystem configured to direct light to specimen 202. The illumination subsystem includes at least one light source. For example, as shown in FIG. 5, the illumination subsystem includes light source 203. In one embodiment, the illumination subsystem is configured to direct the light to the specimen 202 at one or more angles of incidence, which may include one or more oblique angles and/or one or more normal angles. For example, as shown in FIG. 5, light from light source 203 is directed through optical element 204 and then lens 205 to specimen 202 at an oblique angle of incidence. The oblique angle of incidence may include any suitable oblique angle of incidence, which may vary depending on, for instance, characteristics of the specimen 202.
[0042] The optical based subsystem 201 may be configured to direct the light to the specimen 202 at different angles of incidence at different times. For example, the optical based subsystem 201 may be configured to alter one or more characteristics of one or more elements of the illumination subsystem such that the light can be directed to the specimen 202 at an angle of incidence that is different than that shown in FIG. 5. In one such example, the optical based subsystem 201 may be configured to move light source 203, optical element 204, and lens 205 such that the light is directed to the specimen 202 at a different oblique angle of incidence or a normal (or near normal) angle of incidence.
[0043] In some instances, the optical based subsystem 201 may be configured to direct light to the specimen 202 at more than one angle of incidence at the same time. For example, the illumination subsystem may include more than one illumination channel, one of the illumination channels may include light source 203, optical element 204, and lens 205 as shown in FIG. 5 and another of the illumination channels (not shown) may include similar elements, which may be configured differently or the same, or may include at least a light source and possibly one or more other components such as those described further herein. If such light is directed to the specimen at the same time as the other light, one or more characteristics (e.g., wavelength, polarization, etc.) of the light directed to the specimen 202 at different angles of incidence may be different such that light resulting from illumination of the specimen 202 at the different angles of incidence can be discriminated from each other at the detector(s).
[0044] In another instance, the illumination subsystem may include only one light source (e.g., light source 203 shown in FIG. 5) and light from the light source may be separated into different optical paths (e.g., based on wavelength, polarization, etc.) by one or more optical elements (not shown) of the illumination subsystem. Light in each of the different optical paths may then be directed to the specimen 202. Multiple illumination channels may be configured to direct light to the specimen 202 at the same time or at different times (e.g., when different illumination channels are used to sequentially illuminate the specimen). In another instance, the same illumination channel may be configured to direct light to the specimen 202 with different characteristics at different times. For example, in some instances, optical element 204 may be configured as a spectral filter, and the properties of the spectral filter can be changed in a variety of different ways (e.g., by swapping out the spectral filter) such that different wavelengths of light can be directed to the specimen 202 at different times. The illumination subsystem may have any other suitable configuration known in the art for directing the light having different or the same characteristics to the specimen 202 at different or the same angles of incidence sequentially or simultaneously.

[0045] In one embodiment, light source 203 may include a broadband plasma (BBP) source, and the system 200 is a BBP inspection tool. In this manner, the light generated by the light source 203 and directed to the specimen 202 may include broadband light. However, the light source may include any other suitable light source such as a laser. The laser may include any suitable laser known in the art and may be configured to generate light at any suitable wavelength or wavelengths known in the art. In addition, the laser may be configured to generate light that is monochromatic or nearly monochromatic. In this manner, the laser may be a narrowband laser. The light source 203 may also include a polychromatic light source that generates light at multiple discrete wavelengths or wavebands.

[0046] Light from optical element 204 may be focused onto specimen 202 by lens 205. Although lens 205 is shown in FIG. 5 as a single refractive optical element, it is to be understood that, in practice, lens 205 may include a number of refractive and/or reflective optical elements that in combination focus the light from the optical element to the specimen. The illumination subsystem shown in FIG. 5 and described herein may include any other suitable optical elements (not shown). Examples of such optical elements include, but are not limited to, polarizing component(s), spectral filter(s), spatial filter(s), reflective optical element(s), apodizer(s), beam splitter(s) (such as beam splitter 213), aperture(s), and the like, which may include any such suitable optical elements known in the art. In addition, the optical based subsystem 201 may be configured to alter one or more of the elements of the illumination subsystem based on the type of illumination to be used for generating the optical based output.
[0047] The optical based subsystem 201 may also include a scanning subsystem configured to cause the light to be scanned over the specimen 202. For example, the optical based subsystem 201 may include stage 206 on which specimen 202 is disposed during optical based output generation. The scanning subsystem may include any suitable mechanical and/or robotic assembly (that includes stage 206) that can be configured to move the specimen 202 such that the light can be scanned over the specimen 202. In addition, or alternatively, the optical based subsystem 201 may be configured such that one or more optical elements of the optical based subsystem 201 perform some scanning of the light over the specimen 202. The light may be scanned over the specimen 202 in any suitable fashion such as in a serpentine-like path or in a spiral path.
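As a minimal sketch of the serpentine-like path noted above (the disclosure does not prescribe a scanning algorithm; the grid size and pitch are illustrative assumptions), stage positions for a row-by-row sweep might be generated as follows:

def serpentine_path(num_rows: int, num_cols: int, pitch_mm: float):
    """Yield (x, y) stage positions sweeping the specimen row by row,
    reversing direction on alternate rows (a serpentine-like order)."""
    for row in range(num_rows):
        cols = range(num_cols) if row % 2 == 0 else reversed(range(num_cols))
        for col in cols:
            yield (col * pitch_mm, row * pitch_mm)

# Example: a coarse 4 x 5 raster with an assumed 10 mm pitch.
for x, y in serpentine_path(num_rows=4, num_cols=5, pitch_mm=10.0):
    pass  # a real system would move stage 206 to (x, y) and acquire here

Reversing direction on alternate rows avoids a flyback move at the end of each row, which is why such a path is often preferred over a simple raster when a mechanical stage carries the specimen.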
[0048] The optical based subsystem 201 further includes one or more detection channels. At least one of the one or more detection channels includes a detector configured to detect light from the specimen 202 due to illumination of the specimen 202 by the subsystem and to generate output responsive to the detected light. For example, the optical based subsystem 201 shown in FIG. 5 includes two detection channels, one formed by collector 207, element 208, and detector 209 and another formed by collector 210, element 211, and camera system 100. As shown in FIG. 5, the two detection channels are configured to collect and detect light at different angles of collection. In some instances, both detection channels are configured to detect scattered light, and the detection channels are configured to detect light that is scattered at different angles from the specimen 202. However, one or more of the detection channels may be configured to detect another type of light from the specimen 202 (e.g., reflected light).
[0049] As further shown in FIG. 5, both detection channels are shown positioned in the plane of the paper and the illumination subsystem is also shown positioned in the plane of the paper. Therefore, in this embodiment, both detection channels are positioned in (e.g., centered in) the plane of incidence. However, one or more of the detection channels may be positioned out of the plane of incidence. For example, the detection channel formed by collector 210, element 211, and camera system 100 may be configured to collect and detect light that is scattered out of the plane of incidence. Therefore, such a detection channel may be commonly referred to as a “side” channel, and such a side channel may be centered in a plane that is substantially perpendicular to the plane of incidence.
[0050] Although FIG. 5 shows an embodiment of the optical based subsystem 201 that includes two detection channels, the optical based subsystem 201 may include a different number of detection channels (e.g., only one detection channel or two or more detection channels). In one such instance, the detection channel formed by collector 210, element 211, and camera system 100 may form one side channel as described above, and the optical based subsystem 201 may include an additional detection channel (not shown) formed as another side channel that is positioned on the opposite side of the plane of incidence. Therefore, the optical based subsystem 201 may include the detection channel that includes collector 207, element 208, and detector 209 and that is centered in the plane of incidence and configured to collect and detect light at scattering angle(s) that are at or close to normal to the specimen 202 surface. This detection channel may therefore be commonly referred to as a “top” channel, and the optical based subsystem 201 may also include two or more side channels configured as described above. As such, the optical based subsystem 201 may include at least three channels (i.e., one top channel and two side channels), and each of the at least three channels has its own collector, each of which is configured to collect light at different scattering angles than each of the other collectors.
[0051] As described further above, each of the detection channels included in the optical based subsystem 201 may be configured to detect scattered light. Therefore, the optical based subsystem 201 shown in FIG. 5 may be configured for dark field (DF) output generation for specimens 202. However, the optical based subsystem 201 may also or alternatively include detection channel(s) that are configured for bright field (BF) output generation for specimens 202. In other words, the optical based subsystem 201 may include at least one detection channel that is configured to detect light specularly reflected from the specimen 202. Therefore, the optical based subsystems 201 described herein may be configured for only DF, only BF, or both DF and BF imaging. Although each of the collectors is shown in FIG. 5 as a single refractive optical element, it is to be understood that each of the collectors may include one or more refractive optical element(s) and/or one or more reflective optical element(s).
[0052] The one or more detection channels may include any suitable detectors known in the art. For example, the detectors may include photo-multiplier tubes (PMTs), charge coupled devices (CCDs), TDI cameras (such as those included in the camera system 100 of FIG. 1), and any other suitable detectors known in the art. The detectors may also include non-imaging detectors or imaging detectors. In this manner, if the detectors are non-imaging detectors, each of the detectors may be configured to detect certain characteristics of the scattered light such as intensity but may not be configured to detect such characteristics as a function of position within the imaging plane.
As such, the output that is generated by each of the detectors included in each of the detection channels of the optical based subsystem may be signals or data, but not image signals or image data. In such instances, a processor such as processor 214 may be configured to generate images of the specimen 202 from the non-imaging output of the detectors. However, in other instances, the detectors may be configured as imaging detectors that are configured to generate imaging signals or image data. Therefore, the optical based subsystem may be configured to generate optical images or other optical based output described herein in a number of ways.
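The following is a minimal sketch, under assumed grid dimensions, of how a processor such as processor 214 might rasterize non-imaging detector output into an image; each sample pairs a scan position with the intensity detected there, and the actual processing performed by processor 214 is in no way limited to this illustration:

import numpy as np

def image_from_samples(samples, num_rows: int, num_cols: int) -> np.ndarray:
    """Rasterize non-imaging detector output: each sample is a
    (row_index, col_index, intensity) tuple from the scan."""
    image = np.zeros((num_rows, num_cols), dtype=np.float64)
    for row, col, intensity in samples:
        image[row, col] = intensity
    return image

# Usage: pair the scan order with the detector readings, then rasterize.
readings = [(0, 0, 0.12), (0, 1, 0.98), (1, 1, 0.55), (1, 0, 0.07)]
img = image_from_samples(readings, num_rows=2, num_cols=2)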
[0053] It is noted that FIG. 5 is provided herein to generally illustrate a configuration of an optical based subsystem 201 that may be included in the system embodiments described herein or that may generate optical based output that is used by the system embodiments described herein. The optical based subsystem 201 configuration described herein may be altered to optimize the performance of the optical based subsystem 201 as is normally performed when designing a commercial output acquisition system. In addition, the systems described herein may be
implemented using an existing system (e.g., by adding functionality described herein to an existing system). For some such systems, the methods described herein may be provided as optional functionality of the system (e.g., in addition to other functionality of the system). Alternatively, the system described herein may be designed as a completely new system.
[0054] The processor 214 may be coupled to the components of the system 200 in any suitable manner (e.g., via one or more transmission media, which may include wired and/or wireless transmission media) such that the processor 214 can receive output. The processor 214 may be configured to perform a number of functions using the output. The system 200 can receive instructions or other information from the processor 214. The processor 214 and/or the electronic data storage unit 215 optionally may be in electronic communication with a wafer inspection tool, a wafer metrology tool, or a wafer review tool (not illustrated) to receive additional information or send instructions. For example, the processor 214 and/or the electronic data storage unit 215 can be in electronic communication with a scanning electron microscope (SEM).
[0055] The processor 214, other system(s), or other subsystem(s) described herein may be part of various systems, including a personal computer system, image computer, mainframe computer system, workstation, network appliance, internet appliance, or other device. The subsystem(s) or system(s) may also include any suitable processor known in the art, such as a parallel processor. In addition, the subsystem(s) or system(s) may include a platform with high-speed processing and software, either as a standalone or a networked tool.
[0056] The processor 214 and electronic data storage unit 215 may be disposed in or otherwise part of the system 200 or another device. In an example, the processor 214 and electronic data storage unit 215 may be part of a standalone control unit or in a centralized quality control unit. Multiple processors 214 or electronic data storage units 215 may be used.
[0057] The processor 214 may be implemented in practice by any combination of hardware, software, and firmware. Also, its functions as described herein may be performed by one unit, or divided up among different components, each of which may be implemented in turn by any combination of hardware, software, and firmware. Program code or instructions for the processor
214 to implement various methods and functions may be stored in readable storage media, such as a memory in the electronic data storage unit 215 or other memory.

[0058] If the system 200 includes more than one processor 214, then the different subsystems may be coupled to each other such that images, data, information, instructions, etc. can be sent between the subsystems. For example, one subsystem may be coupled to additional subsystem(s) by any suitable transmission media, which may include any suitable wired and/or wireless transmission media known in the art. Two or more of such subsystems may also be effectively coupled by a shared computer-readable storage medium (not shown).
[0059] The processor 214 may be configured to perform a number of functions using the output of the system 200 or other output. For instance, the processor 214 may be configured to send the output to an electronic data storage unit 215 or another storage medium. The processor 214 may be further configured as described herein.
[0060] The processor 214 may be configured according to any of the embodiments described herein. The processor 214 also may be configured to perform other functions or additional steps using the output of the system 200 or using images or data from other sources.
[0061] Various steps, functions, and/or operations of system 200 and the methods disclosed herein are carried out by one or more of the following: electronic circuits, logic gates, multiplexers, programmable logic devices, ASICs, analog or digital controls/switches, microcontrollers, or computing systems. Program instructions implementing methods such as those described herein may be transmitted over or stored on a carrier medium. The carrier medium may include a storage medium such as a read-only memory, a random access memory, a magnetic or optical disk, a non-volatile memory, a solid state memory, a magnetic tape, and the like. A carrier medium may include a transmission medium such as a wire, cable, or wireless transmission link. For instance, the various steps described throughout the present disclosure may be carried out by a single processor 214 or, alternatively, multiple processors 214. Moreover, different sub-systems of the system 200 may include one or more computing or logic systems. Therefore, the above description should not be interpreted as a limitation on the present disclosure but merely an illustration.
[0062] In an instance, the processor 214 is in communication with the system 200. The processor 214 can be configured to stitch images from the sensors in the camera system 100.

[0063] An additional embodiment relates to a non-transitory computer-readable medium storing program instructions executable on a controller for performing a computer-implemented method for determining a height of an illuminated region on a surface of a specimen 202, as disclosed herein. In particular, as shown in FIG. 5, electronic data storage unit 215 or other storage medium may contain a non-transitory computer-readable medium that includes program instructions executable on the processor 214. The computer-implemented method may include any step(s) of any method(s) described herein, including method 300.
[0064] Program instructions implementing methods such as those described herein may be stored on computer-readable medium, such as in the electronic data storage unit 215 or other storage medium. The computer-readable medium may be a storage medium such as a magnetic or optical disk, a magnetic tape, or any other suitable non-transitory computer-readable medium known in the art.
[0065] The program instructions may be implemented in any of various ways, including procedure-based techniques, component-based techniques, and/or object-oriented techniques, among others. For example, the program instructions may be implemented using ActiveX controls, C++ objects, JavaBeans, Microsoft Foundation Classes (MFC), Streaming SIMD Extension (SSE), or other technologies or methodologies, as desired.
[0066] FIG. 6 is a flowchart of an embodiment of a method 300. At 301, a wafer is imaged with sensor units. Each of the sensor units includes a folded flex board having laminations and defining an aperture; a sensor disposed in the folded flex board such that the sensor is positioned over the aperture; a high-density digitizer disposed in the folded flex board; and a field programmable gate array disposed in the folded flex board. There may be, for example, three or six sensor units disposed in the support member.
[0067] At 302, images from the sensors are stitched together using a processor in electronic communication with the sensor units.
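A minimal sketch of step 302 is given below, assuming the sensor tiles are already aligned and simply abut at known column offsets; the disclosure does not limit the stitching to this approach, and a practical implementation would typically also handle overlap, registration, and calibration:

import numpy as np

def stitch_tiles(tiles, offsets_px):
    """Place each sensor tile (equal heights assumed) at its nominal column
    offset in a single composite image; no overlap blending is attempted."""
    height = tiles[0].shape[0]
    width = max(off + t.shape[1] for t, off in zip(tiles, offsets_px))
    composite = np.zeros((height, width), dtype=tiles[0].dtype)
    for tile, off in zip(tiles, offsets_px):
        composite[:, off:off + tile.shape[1]] = tile
    return composite

# Example with three 1536-pixel-wide tiles laid end to end; the tile width
# echoes the sensor width recited in claim 5, but any size could be used.
tiles = [np.full((128, 1536), float(i)) for i in range(3)]
mosaic = stitch_tiles(tiles, offsets_px=[0, 1536, 3072])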
[0068] Each of the steps of the method may be performed as described herein. The methods also may include any other step(s) that can be performed by the processor and/or computer subsystem(s) or system(s) described herein. The steps can be performed by one or more computer systems, which may be configured according to any of the embodiments described herein. In addition, the methods described above may be performed by any of the system embodiments described herein.
[0069] Although the present disclosure has been described with respect to one or more particular embodiments, it will be understood that other embodiments of the present disclosure may be made without departing from the scope of the present disclosure. Hence, the present disclosure is deemed limited only by the appended claims and the reasonable interpretation thereof.

Claims

What is claimed is:
1. A camera system comprising:
a support member; and
a plurality of sensor units disposed in the support member, wherein each of the sensor units includes:
a folded flex board having a plurality of laminations, wherein the folded flex board defines an aperture;
a sensor disposed in the folded flex board such that the sensor is positioned over the aperture;
a high-density digitizer disposed in the folded flex board; and
a field programmable gate array disposed in the folded flex board.
2. The camera system of claim 1, wherein the system includes three of the sensor units.
3. The camera system of claim 1, wherein the system includes six of the sensor units.
4. The camera system of claim 1, wherein the sensor is a time delay integration sensor.
5. The camera system of claim 1, wherein the sensor images 1536 pixels.
6. The camera system of claim 1, wherein two of the sensor units are spaced apart by a distance from 5 mm to 7 mm.
7. The camera system of claim 1, wherein a land grid array for the sensor has a pitch of 1 mm.
8. The camera system of claim 1, further comprising a processor in electronic communication with the sensor units, wherein the processor is configured to stitch images from the sensors.
9. A broadband plasma inspection tool that includes the camera system of claim 1.
10. A method comprising:
imaging a wafer with a plurality of sensor units disposed in a support member, wherein each of the sensor units includes:
a folded flex board having a plurality of laminations, wherein the folded flex board defines an aperture;
a sensor disposed in the folded flex board such that the sensor is positioned over the aperture;
a high-density digitizer disposed in the folded flex board; and
a field programmable gate array disposed in the folded flex board.
11. The method of claim 10, further comprising stitching images from the sensors using a processor in electronic communication with the sensor units.
12. The method of claim 10, wherein three of the sensor units are disposed in the support member.
13. The method of claim 10, wherein six of the sensor units are disposed in the support member.
PCT/US2019/061579 2018-11-15 2019-11-15 Multi-sensor tiled camera with flexible electronics for wafer inspection WO2020102612A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201980073213.8A CN113039633B (en) 2018-11-15 2019-11-15 Multi-sensor tiled camera with flexible electronics for wafer inspection
KR1020217017725A KR102518210B1 (en) 2018-11-15 2019-11-15 Multi-Sensor Tiled Camera with Flexible Electronics for Wafer Inspection
IL282680A IL282680A (en) 2018-11-15 2021-04-27 Multi-sensor tiled camera with flexible electronics for wafer inspection

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862767961P 2018-11-15 2018-11-15
US62/767,961 2018-11-15
US16/379,900 2019-04-10
US16/379,900 US10724964B1 (en) 2019-04-10 2019-04-10 Multi-sensor tiled camera with flexible electronics for wafer inspection

Publications (1)

Publication Number Publication Date
WO2020102612A1 true WO2020102612A1 (en) 2020-05-22

Family

ID=70730897

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/061579 WO2020102612A1 (en) 2018-11-15 2019-11-15 Multi-sensor tiled camera with flexible electronics for wafer inspection

Country Status (5)

Country Link
KR (1) KR102518210B1 (en)
CN (1) CN113039633B (en)
IL (1) IL282680A (en)
TW (1) TWI812813B (en)
WO (1) WO2020102612A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104835162A (en) * 2015-05-12 2015-08-12 李鹏飞 SoC_FPGA-based flexible intelligent machine vision detection system
CN204810392U (en) * 2015-06-17 2015-11-25 深圳市得意自动化科技有限公司 Industry camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011080310A1 (en) * 2009-12-31 2011-07-07 Mapper Lithography Ip B.V. Integrated sensor system
US20130194445A1 (en) * 2012-02-01 2013-08-01 Kla-Tencor Corporation Integrated Multi-Channel Analog Front End And Digitizer For High Speed Imaging Applications
WO2014093341A1 (en) * 2012-12-10 2014-06-19 Kla-Tencor Corporation Method and apparatus for high speed acquisition of moving images using pulsed illumination
KR20150032373A (en) * 2013-09-16 2015-03-26 삼성전자주식회사 Stack type image sensor and fabrication method thereof
US20180059033A1 (en) * 2016-08-29 2018-03-01 Kla-Tencor Corporation Apparatus for High-Speed Imaging Sensor Data Transfer

Also Published As

Publication number Publication date
TWI812813B (en) 2023-08-21
KR20210077780A (en) 2021-06-25
IL282680A (en) 2021-06-30
TW202034419A (en) 2020-09-16
CN113039633A (en) 2021-06-25
CN113039633B (en) 2022-08-16
KR102518210B1 (en) 2023-04-04

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19885973; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20217017725; Country of ref document: KR; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 19885973; Country of ref document: EP; Kind code of ref document: A1)