CN112505035A - Method for detecting gas-liquid interface inside nozzle and substrate processing apparatus - Google Patents

Publication number: CN112505035A
Application number: CN202010645389.0A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 西本隆泰
Assignee (original and current): Screen Holdings Co., Ltd.
Legal status: Pending
Prior art keywords: image, gas, edge, processing, liquid

Classifications

    • G03F 7/162 — Photomechanical production of patterned surfaces; coating on a rotating support, e.g. using a whirler or a spinner
    • G01N 21/84 — Investigating materials by optical means; systems specially adapted for particular applications
    • B05B 7/0416 — Spraying apparatus with arrangements for mixing one gas and one liquid before discharge
    • G01B 11/022 — Optical measurement of length, width or thickness by means of TV-camera scanning
    • G01B 11/028 — Optical measurement of length, width or thickness by measuring the lateral position of a boundary of the object
    • G01F 23/2921 — Indicating or measuring liquid level using light, e.g. infrared or ultraviolet, for discrete levels
    • H01L 21/0273 — Making masks on semiconductor bodies for further photolithographic processing, characterised by the treatment of photoresist layers
    • H01L 21/6715 — Apparatus for applying a liquid, a resin, an ink or the like
    • G01N 2021/8405 — Application to two-phase or mixed materials, e.g. gas dissolved in liquids

Abstract

The invention provides a technique for detecting, with high precision, the gas-liquid interface between a processing liquid and a gas in the internal flow path of a nozzle. The present invention relates to a method for detecting a gas-liquid interface inside a nozzle and to a substrate processing apparatus. The detection method detects the gas-liquid interface between the processing liquid and the gas in the internal flow path of the nozzle, and includes an imaging step and a detection step. In the imaging step, a camera images an imaging area that includes the transparent nozzle, which ejects the treatment liquid from the ejection port, i.e., the front opening of the internal flow path, and thereby acquires an image. In the detection step, the gas-liquid interface in the internal flow path is detected based on the difference between two images in which the distribution of the processing liquid and the gas in the internal flow path differs.

Description

Method for detecting gas-liquid interface inside nozzle and substrate processing apparatus
Technical Field
The present invention relates to a method for detecting a gas-liquid interface inside a nozzle and a substrate processing apparatus.
Background
Patent document 1 describes a substrate processing apparatus that coats a resist onto a substrate. The substrate processing apparatus includes a substrate holding portion that holds the substrate, a discharge nozzle that discharges a processing liquid (resist) onto the substrate, and a camera that images the vicinity of the discharge nozzle as a resist detection region. In patent document 1, the resist detection region is photographed by the camera, and the resist ejection start time and end time are detected from the obtained images. Specifically, whether or not the discharge nozzle is discharging resist is determined from each image acquired by the camera, and the ejection start and end times are detected from the determination results. The resist ejection period is then obtained from the start and end times; when this period falls outside a specific allowable range, it is judged that a resist ejection failure has occurred, and the operator of the substrate processing apparatus is warned. The operator can therefore prevent a substrate that may have been processed defectively from flowing on to the subsequent process.
[ background Art document ]
[ patent document ]
[ patent document 1] Japanese patent laid-open No. 2003-273003
Disclosure of Invention
[ problems to be solved by the invention ]
Patent document 1 discloses a method for determining whether or not a nozzle is ejecting resist, but says nothing about the state of the processing liquid in the internal flow path of the nozzle.
For example, in normal process liquid discharge, the internal flow path of the nozzle is in a liquid-tight state filled with the process liquid. However, if bubbles are mixed in the internal flow path of the nozzle, the bubbles are discharged from the discharge port together with the processing liquid. At this time, the treatment liquid may be splashed from the discharge port of the nozzle.
When the nozzle stops discharging, a suck-back process may be performed, in which the internal flow path of the nozzle is put under negative pressure and a certain amount of the treatment liquid is drawn back from the discharge port. If the position of the leading end face of the liquid after the suck-back process (the suck-back position) is outside a specific reference range, a discharge failure may occur the next time the processing liquid is discharged.
In addition, there is also a case where the internal flow path of the nozzle is mixed with air bubbles due to failure of the suck-back process. In this case, there is a possibility that ejection failure occurs when the treatment liquid is ejected next time.
Therefore, it is desirable to monitor the state of the internal flow path of the nozzle; more specifically, to detect at least one of the suck-back position of the processing liquid and any bubbles. In other words, it is desirable to detect the interface between the processing liquid and the gas in the internal flow path of the nozzle.
Therefore, an object of the present invention is to provide a technique capable of accurately detecting a gas-liquid interface between a processing liquid and a gas in an internal flow path of a nozzle.
[ means for solving problems ]
The 1st aspect of the method for detecting a gas-liquid interface inside a nozzle is a method for detecting the gas-liquid interface between a processing liquid and a gas in an internal flow path of the nozzle, the method including: an imaging step of capturing, with a camera, an imaging area that includes the transparent nozzle, which ejects the treatment liquid from an ejection port, i.e., the front opening of the internal flow path, to acquire an image; and a detection step of detecting the gas-liquid interface in the internal flow path based on a difference between two images in which the distribution of the processing liquid and the gas in the internal flow path differs.
The 2nd aspect of the method for detecting a gas-liquid interface inside a nozzle is the method according to the 1st aspect, wherein the detecting step includes: an edge extraction step of performing edge extraction processing on the two images acquired by the camera to obtain a 1st edge image and a 2nd edge image, respectively; an edge expansion step of performing edge expansion (dilation) processing that thickens the edges of the 2nd edge image to obtain an edge-expanded image; and a step of acquiring a difference image indicating the difference between the 1st edge image and the edge-expanded image, and detecting the gas-liquid interface based on the edges in the difference image other than those originating from the edge-expanded image.
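The edge-extraction, edge-expansion, and difference steps of this aspect can be sketched in a few lines of NumPy. This is a hedged illustration, not the patent's implementation: the patent does not specify the edge filter, the threshold, or the structuring element, so the gradient-magnitude edge extractor, the threshold of 30 grey levels, and the square dilation element below are all assumptions made for the sketch.

```python
import numpy as np

def edge_image(img, thresh=30):
    """Extract a binary edge map from a grayscale image via the gradient
    magnitude (a stand-in for any edge filter, e.g. Sobel)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy) > thresh

def dilate(edges, r=1):
    """Binary dilation with a (2r+1)x(2r+1) square structuring element,
    implemented by OR-ing shifted copies of the edge map."""
    out = np.zeros_like(edges)
    h, w = edges.shape
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            shifted = np.zeros_like(edges)
            ys = slice(max(dy, 0), h + min(dy, 0))
            xs = slice(max(dx, 0), w + min(dx, 0))
            ys_src = slice(max(-dy, 0), h + min(-dy, 0))
            xs_src = slice(max(-dx, 0), w + min(-dx, 0))
            shifted[ys, xs] = edges[ys_src, xs_src]
            out |= shifted
    return out

def interface_edges(img1, img2, dilation_radius=2):
    """Edges present in img1 but absent (even after dilation) from img2.
    With img1 taken while the liquid is stationary and img2 while it flows,
    the surviving edges mark the gas-liquid interface; the nozzle walls,
    common to both images, cancel out."""
    e1 = edge_image(img1)
    e2_dilated = dilate(edge_image(img2), r=dilation_radius)
    return e1 & ~e2_dilated
```

With two synthetic shots that share the nozzle-wall edges but differ by a horizontal liquid front, only edges near the front survive the difference.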
A 3rd aspect of the method for detecting a gas-liquid interface inside a nozzle is the method according to the 2nd aspect, wherein in the capturing step, the camera sequentially acquires images during at least part of a processing period that runs from a state in which the processing liquid is stationary in the internal flow path of the nozzle, through its discharge, until the processing liquid is stationary again, and in the detecting step, the position of the gas-liquid interface is detected based on two of the images captured during that part of the processing period.
A 4th aspect of the method for detecting a gas-liquid interface inside a nozzle is the method according to the 3rd aspect, wherein in the capturing step, the camera acquires a 1st image while the processing liquid is stationary in the internal flow path and a 2nd image while the processing liquid flows toward the ejection port, and in the edge extracting step, the 1st edge image is obtained from the 1st image and the 2nd edge image from the 2nd image.
The 5th aspect of the method for detecting a gas-liquid interface inside a nozzle is the method according to the 4th aspect, further including: in the detecting step, determining that an abnormality has occurred when a plurality of gas-liquid interface positions are detected based on the 1st image and the 2nd image.
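Downstream of the edge pipeline, the 5th aspect's judgment reduces to counting distinct interfaces and checking the single interface against a reference range. The sketch below illustrates one way to do this; the clustering gap of 3 pixels and the reference range (18, 22) are invented for the example, as the patent specifies neither. It assumes the detection step has already reduced the difference image to the row indices of its interface-edge pixels.

```python
import numpy as np

def interface_rows(edge_rows, gap=3):
    """Group row indices of interface-edge pixels into clusters; each
    cluster of nearby rows is taken to be one gas-liquid interface,
    represented by its centre row."""
    rows = np.unique(np.asarray(edge_rows))
    if rows.size == 0:
        return []
    clusters, start, prev = [], rows[0], rows[0]
    for r in rows[1:]:
        if r - prev > gap:           # a jump starts a new cluster
            clusters.append((start + prev) / 2.0)
            start = r
        prev = r
    clusters.append((start + prev) / 2.0)
    return clusters

def check_suckback(edge_rows, ref_range=(18, 22)):
    """Return (suckback_row, abnormal).  More than one interface means a
    bubble is present in the flow path; a single interface outside the
    reference range means a suck-back failure."""
    ifaces = interface_rows(edge_rows)
    if len(ifaces) != 1:
        return None, True            # bubble (or nothing) detected
    pos = ifaces[0]
    return pos, not (ref_range[0] <= pos <= ref_range[1])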
A 6th aspect of the method for detecting a gas-liquid interface inside a nozzle is the method according to any one of the 3rd to 5th aspects, wherein in the capturing step, the camera sequentially acquires images during an ejection period in which the processing liquid is ejected from the ejection port of the nozzle, and in the edge extracting step, the 1st edge image is obtained from the current image acquired by the camera during the ejection period, and the 2nd edge image is obtained from the image acquired immediately before the current image during the ejection period.
The 7th aspect of the method for detecting a gas-liquid interface inside a nozzle is the method according to the 6th aspect, further including: in the detecting step, determining that an abnormality has occurred when a gas-liquid interface is detected based on the two images captured during the ejection period.
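A minimal sketch of the 7th aspect's in-ejection monitoring follows. Note the simplification: the patent forms the difference between edge images, with the previous frame's edges dilated first; here a plain per-pixel absolute difference between consecutive frames stands in for that pipeline, and the threshold of 40 grey levels is an assumed value.

```python
import numpy as np

def frame_diff_alarm(frames, thresh=40):
    """For each frame in the ejection period, compare it with the
    immediately preceding frame.  While the internal flow path stays
    liquid-tight, consecutive frames should be nearly identical, so any
    large per-pixel change is reported as a possible bubble/abnormality."""
    alarms = []
    prev = frames[0]
    for i, cur in enumerate(frames[1:], start=1):
        delta = np.abs(cur.astype(int) - prev.astype(int))
        alarms.append((i, bool((delta > thresh).any())))
        prev = cur
    return alarms
```

A bubble passing through one frame triggers the alarm both when it appears and when it disappears, since both transitions change the image relative to the preceding frame.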
An aspect of a substrate processing apparatus includes: a substrate holding portion that holds a substrate; a transparent nozzle that ejects the processing liquid toward the substrate held by the substrate holding portion from an ejection port, i.e., the front opening of its internal flow path; a camera that images an imaging area including the nozzle to acquire images; and an image processing unit that detects the gas-liquid interface between the processing liquid and the gas in the internal flow path based on a difference between two images in which the distribution of the processing liquid and the gas in the internal flow path differs.
[ Effect of the invention ]
According to the 1st aspect of the method for detecting a gas-liquid interface inside a nozzle and the aspect of the substrate processing apparatus, the gas-liquid interface inside the nozzle is detected from images captured by a camera, and can therefore be detected with high accuracy.
According to the 2nd aspect of the method, the difference between the 1st edge image and the edge-expanded image eliminates every edge of the 1st edge image except the one indicating the gas-liquid interface; only that edge remains in the difference image. Moreover, because the edges of the 2nd edge image are expanded before the subtraction, those other edges are still eliminated even if the nozzle position shifts slightly between the images.
On the other hand, the edges of the edge-expanded image themselves remain in the difference image, so the position of the gas-liquid interface is detected from the edges other than those originating from the edge-expanded image.
In this way, the gas-liquid interface in the 1st edge image can be detected while excluding both the influence of nozzle positional deviation between the images and the influence of the edge-expanded image. This enables the gas-liquid interface to be detected with high accuracy.
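The shift-tolerance argument above can be seen in a tiny one-dimensional NumPy example (the edge positions, shift amount, and dilation radius are illustrative, not from the patent): without dilation, a one-pixel shift of the nozzle walls leaves spurious wall edges in the difference, while dilating the reference edges first absorbs the shift.

```python
import numpy as np

def dilate_1d(mask, r):
    """1-D binary dilation: OR each pixel with its r neighbours on each side."""
    out = mask.copy()
    for s in range(1, r + 1):
        out[s:] |= mask[:-s]
        out[:-s] |= mask[s:]
    return out

# Nozzle-wall edges along one image row; the nozzle has shifted one pixel
# between the reference shot (e_ref) and the current shot (e_cur).
e_cur = np.zeros(20, dtype=bool)
e_cur[[5, 15]] = True
e_ref = np.zeros(20, dtype=bool)
e_ref[[6, 16]] = True

naive = e_cur & ~e_ref                  # wall edges survive: false positives
robust = e_cur & ~dilate_1d(e_ref, 2)   # dilation absorbs the shift
```

Here `naive` still contains the shifted wall edges, whereas `robust` is empty, which is why the 2nd aspect dilates the 2nd edge image before taking the difference.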
According to the 3rd aspect of the method, since the two images are both acquired by the camera within the processing period, even if a long-term change such as damage to the nozzle occurs, its influence is cancelled in the difference image. This enables the gas-liquid interface to be detected with still higher accuracy.
According to the 4th aspect of the method, the position of the gas-liquid interface in the 1st image is detected. Since the position of the leading end face of the stationary processing liquid is the suck-back position, the suck-back position can thereby be detected.
According to the 5th aspect of the method, an abnormality in the internal flow path of the nozzle can be detected before ejection.
According to the 6th aspect of the method, the gas-liquid interface in the current image is detected by difference processing between the 1st edge image obtained from the current image and the edge-expanded image obtained from the immediately preceding image. When a bubble forms in the internal flow path of the nozzle and moves into the camera's imaging area while the processing liquid is being ejected, the bubble first appears in the current image while the immediately preceding image does not yet contain it. Because the gas-liquid interface due to the bubble is detected as soon as the bubble enters the current image, the bubble can be detected quickly.
According to the 7th aspect of the method, an abnormality in the internal flow path of the nozzle can be detected during ejection.
Drawings
Fig. 1 is a view schematically showing an example of the structure of a substrate processing apparatus.
Fig. 2 is a diagram schematically showing an example of the configuration of the processing unit.
Fig. 3 is a flowchart showing an example of the operation of the processing unit.
Fig. 4 is a diagram schematically showing an example of image data.
Fig. 5 is a flowchart showing an example of the monitoring process.
Fig. 6 is a diagram schematically showing an example of an edge image.
Fig. 7 is a diagram schematically showing an example of an edge expansion image.
Fig. 8 is a flowchart showing an example of the detection process of the gas-liquid interface.
Fig. 9 is a diagram for explaining the difference processing.
Fig. 10 is a diagram schematically showing another example of image data.
Fig. 11 is a diagram schematically showing another example of image data.
Fig. 12 is a diagram for explaining the difference processing.
Fig. 13 is a flowchart showing an example of the difference processing.
Fig. 14 is a flowchart showing an example of the abnormality determination processing.
Fig. 15 is a diagram schematically showing another example of image data.
Detailed Description
Hereinafter, embodiments will be described with reference to the accompanying drawings. The drawings are schematic; for convenience of explanation, structures are omitted or simplified as appropriate. The sizes and positional relationships shown in the drawings are not necessarily accurate and may be changed as appropriate.
In the following description, the same components are denoted by the same reference numerals, and the same names and functions are given to the same components. Therefore, detailed descriptions of the same components may be omitted to avoid redundancy.
< schematic constitution of substrate processing apparatus >
Fig. 1 is a plan view schematically showing an example of the structure of a substrate processing apparatus. The substrate processing apparatus of fig. 1 is an apparatus that forms a resist film or the like on a substrate (e.g., a semiconductor wafer) W and develops the exposed substrate W.
In the example of fig. 1, the substrate processing apparatus includes a transfer/transport unit 110, a processing unit 120, an interface unit 130, and a control unit 140. The control unit 140 controls various configurations of the substrate processing apparatus.
The control unit 140 is an electronic circuit, and may include a data processing device and a storage medium. The data processing device may be an arithmetic processing device such as a CPU (Central Processing Unit). The storage medium may include a non-transitory storage medium (e.g., a ROM (Read Only Memory) or a hard disk) and a transitory storage medium (e.g., a RAM (Random Access Memory)). The non-transitory storage medium may store, for example, a program that defines the processing to be executed by the control unit 140. By the data processing device executing this program, the control unit 140 can execute the processing specified by the program. Of course, part or all of the processing executed by the control unit 140 may instead be implemented in hardware.
The transfer conveyor 110 and the interface unit 130 are provided adjacent to the processing unit 120, one on each side. Further, an exposure machine EXP, an external apparatus separate from the substrate processing apparatus, is provided adjacent to the interface unit 130.
The transfer conveyor 110 includes a plurality of (4 in the figure) cassette mounting tables 111 and ID (Identification) transport mechanisms TID. The cassette mounting tables 111 are arranged in 1 row, and 1 cassette C is mounted on each cassette mounting table 111.
The ID conveyance mechanism TID is provided at the side of the cassette mounting table 111 so as to be horizontally movable along the arrangement direction of the cassettes C, and can be stopped at a position facing each cassette C. The ID transport mechanism TID includes a holding arm, and transfers the substrate W to and from each cassette C and each processing unit 120. The ID conveyance mechanism TID takes out the substrate W from the cassette C and conveys the substrate W to the processing unit 120, and stores the substrate W received from the processing unit 120 in the cassette C.
The processing unit 120 processes the substrate W. In the example of fig. 1, the processing unit 120 is composed of modules (cells) 121 and 122. The module 121 includes a main conveyance mechanism T1, and the module 122 includes a main conveyance mechanism T2. A plurality of processing units are provided in each of the modules 121 and 122. Although only the modules 121 and 122 are shown in fig. 1, further modules may be stacked vertically in the processing unit 120; that is, a module identical to the module 121 may be stacked on the module 121, and a module identical to the module 122 on the module 122. In short, the processing unit 120 may have a multi-level structure. In the module 121 (and the modules above it), a resist film or the like is formed on the substrate W, and in the module 122 (and the modules above it), the substrate W is developed.
The modules 121 and 122 are arranged side by side in the horizontal direction and connected to each other, forming one substrate processing line that links the transfer conveyor 110 and the interface unit 130. The same applies at each level, and the substrate processing lines are stacked substantially in parallel in the vertical direction. In other words, the processing unit 120 is made up of substrate processing lines in a hierarchical structure.
The interface unit 130 is disposed between the processing unit 120 and the exposure machine EXP, and relays the substrate W therebetween.
Hereinafter, the components 121 and 122 will be described without description of the components of the upper stage for simplicity of description. The module 121 has a transfer space a1 formed therein for transferring the substrate W. The conveyance space a1 passes through the center of the block 121 and is formed in a belt shape parallel to the arrangement direction of the blocks 121 and 122. The processing units of the module 121 include a coating processing unit 123 for coating the substrate W with the processing liquid, and a heat treatment unit 124 for heat-treating the substrate W. The coating processing unit 123 is disposed on one side of the transfer space a1, and the heat treatment unit 124 is disposed on the other side.
The plurality of coating units 123 are arranged side by side so as to face the conveyance space a1. In the present embodiment, the plurality of coating units 123 are also arranged in parallel in the vertical direction. For example, a total of 4 coating units 123 are arranged in 2 rows and 2 stages. The coating processing unit 123 includes an antireflection film coating processing unit that performs a process of forming an antireflection film on the substrate W, and a resist film coating processing unit that performs a process of forming a resist film on the substrate W. For example, the lower 2 coating units 123 form an antireflection film on the substrate W, and the upper 2 coating units 123 form a resist film on the substrate W.
The plurality of heat treatment units 124 are arranged side by side so as to face the conveyance space a1, and in the present embodiment are also stacked in the vertical direction; for example, 3 heat treatment units 124 may be arranged laterally and 5 stacked vertically. Each heat treatment unit 124 includes a flat plate 125 on which the substrate W is placed. The heat treatment units 124 include a cooling unit that cools the substrate W, a heating and cooling unit that performs a heating treatment followed by a cooling treatment, and an adhesion treatment unit that performs a heat treatment in a vapor atmosphere of hexamethyldisilazane (HMDS) to improve the adhesion between the substrate W and the coating film. The heating and cooling unit includes 2 flat plates 125 and a local conveyance mechanism, not shown, that moves the substrate W between them. A plurality of each type of heat treatment unit is arranged at an appropriate position.
A mounting portion PASS1 is provided at the boundary between the transfer conveyor 110 and the module 121, and a mounting portion PASS2 at the boundary between the modules 121 and 122. The mounting portion PASS1 relays the substrate W between the transfer conveyor 110 and the module 121, and the mounting portion PASS2 between the modules 121 and 122. The mounting portions PASS1 and PASS2 include a plurality of support pins that support the substrate W in a horizontal posture, i.e., with the thickness direction of the substrate W along the vertical direction. The mounting portion PASS1 can hold, for example, 2 substrates W: it has a 2-stage configuration with 1 substrate W placed on each stage. One stage holds a substrate W being transferred from the transfer conveyor 110 to the module 121, and the other holds a substrate W being returned from the module 121 to the transfer conveyor 110. The mounting portion PASS2 likewise has a 2-stage configuration.
A main conveyance mechanism T1 is provided substantially at the center of the conveyance space a1. The main conveyance mechanism T1 transfers substrates W to and from the processing units of the module 121, the mounting portion PASS1, and the mounting portion PASS2, respectively. In the example of fig. 1, the main conveyance mechanism T1 includes 2 holding arms H1 and H2. Thus, the main conveyance mechanism T1 can take out a substrate W from a target portion (e.g., a processing unit of the module 121) using one holding arm H1 and hand over another substrate W to the target portion using the other holding arm H2.
The module 122 forms a transfer space a2 for transferring the substrate W. The conveyance space a2 is formed on an extension of the conveyance space a1.
The processing units of the module 122 include a coating processing unit 127 that coats the substrate with the processing liquid, a heat treatment unit 126 that performs heat treatment on the substrate W, and an edge exposure unit (not shown) that exposes the peripheral edge portion of the substrate W. The coating processing unit 127 is disposed on one side of the transfer space a2, and the heat processing unit 126 and the edge exposure unit are disposed on the other side. Here, the coating unit 127 is preferably disposed on the same side as the coating unit 123. The heat treatment unit 126 and the edge exposure unit are preferably arranged in the same row as the heat treatment unit 124.
In the present embodiment, the plurality of coating units 127 are also arranged in parallel in the vertical direction. For example, a total of 6 coating units 127 are arranged in 3 rows and 2 stages. The coating processing unit 127 includes a developing processing unit that develops the substrate W, and a coating processing unit for a resist coating film that performs a process of forming a resist coating film on the substrate W. For example, the lower 3 coating process units 127 form a resist coating film on the substrate W, and the upper 3 coating process units 127 develop the substrate W.
The heat treatment units 126 are arranged side by side in the lateral direction along the conveyance space A2 and are stacked in the vertical direction. The heat treatment units 126 include heating units that heat the substrate W and cooling units that cool the substrate W.
The edge exposure unit is a separate body disposed at a specific position. The edge exposure unit includes a rotation holding portion (not shown) that rotatably holds the substrate W, and a light irradiation portion (not shown) that exposes the peripheral edge of the substrate W held by the rotation holding portion.
A placement and buffer unit P-BF is provided at the boundary between the module 122 and the interface unit 130. The substrate W conveyed from the module 122 to the interface unit 130 is placed on the placement and buffer unit P-BF.
The main conveyance mechanism T2 is provided substantially at the center of the conveyance space A2 in a plan view. The main conveyance mechanism T2 is configured similarly to the main conveyance mechanism T1, and transfers the substrate W to and from the mounting portion PASS2, the coating processing unit 127, the heat treatment unit 126, the edge exposure unit, and the placement and buffer unit P-BF.
The interface unit 130 includes a cleaning processing block 131 and a loading/unloading block 132. A mounting portion PASS3 is provided at the boundary between the cleaning processing block 131 and the loading/unloading block 132. The mounting portion PASS3 has the same structure as the mounting portions PASS1 and PASS2. A mounting and cooling unit, not shown, is provided above or below the mounting portion PASS3. The mounting and cooling unit cools the substrate W to a temperature suitable for exposure.
The loading/unloading block 132 is provided with an IF conveyance mechanism TIF. The IF conveyance mechanism TIF conveys the substrate W from the mounting and cooling unit to the loading portion LPa of the exposure machine EXP, and conveys the substrate W from the unloading portion LPb of the exposure machine EXP to the mounting portion PASS3.
The cleaning processing block 131 includes two cleaning processing units 133a and 133b and two conveyance mechanisms T3a and T3b. The two cleaning processing units 133a and 133b are arranged so as to sandwich the pair of conveyance mechanisms T3a and T3b. The cleaning processing unit 133a cleans and dries the substrate W before exposure. A plurality of cleaning processing units 133a may be stacked in multiple stages. The conveyance mechanism T3a conveys the substrate W from the placement and buffer unit P-BF to the cleaning processing unit 133a, and conveys the cleaned substrate W from the cleaning processing unit 133a to the mounting and cooling unit.
The cleaning processing unit 133b cleans and dries the exposed substrate W. A plurality of cleaning processing units 133b may be stacked in multiple stages. The conveyance mechanism T3b conveys the substrate W from the mounting portion PASS3 to the cleaning processing unit 133b, and conveys the cleaned substrate W from the cleaning processing unit 133b to the placement and buffer unit P-BF.
In such a substrate processing system, the substrate W is processed as follows. The substrate W taken out of the cassette C is cooled by the cooling unit of the module 121. The cooled substrate W is coated by the coating processing unit for the antireflection film of the module 121, whereby an antireflection film is formed on the surface of the substrate W. The substrate W on which the antireflection film has been formed is heated by the heating and cooling unit and then cooled. The cooled substrate W is coated by the coating processing unit for a resist film, whereby a resist film is formed on the surface of the substrate W. The substrate W on which the resist film has been formed is again heated by the heating and cooling unit and then cooled. The substrate W is then coated by the coating processing unit for a resist cover film of the module 122, whereby a resist cover film is formed on the surface of the substrate W. The substrate W on which the resist cover film has been formed is heated and then cooled by the heating and cooling unit of the module 122.
The peripheral edge portion of the cooled substrate W is exposed by the edge exposure unit of the module 122. The substrate W whose peripheral portion has been exposed is cleaned and dried by the cleaning processing unit 133a. The cleaned substrate W is cooled by the mounting and cooling unit. The cooled substrate W is exposed by the external exposure machine EXP. The exposed substrate W is cleaned and dried by the cleaning processing unit 133b. The cleaned substrate W is subjected to a post-exposure bake by the heating and cooling unit of the module 122. The baked substrate W is cooled by the cooling unit of the module 122. The cooled substrate W is developed by the developing processing unit. The developed substrate W is heated by the heating and cooling unit and then cooled. The cooled substrate W is conveyed to the cassette C of the transfer conveyor 110. In the above manner, the substrate processing apparatus processes the substrate W.
< Coating Processing Unit >
Fig. 2 schematically shows an example of a partial configuration of the processing unit 1, which is an example of the coating processing unit 123 or the coating processing unit 127. As illustrated in fig. 2, the processing unit 1 includes a substrate holder 10, a discharge nozzle 12, a camera 35, an image processing unit 30, and a control unit 140.
The substrate holder 10 holds the substrate W in a substantially horizontal posture. The horizontal posture here means a state in which the thickness direction of the substrate W is along the vertical direction. The lower surface at the center of the substrate holder 10 is coupled to an electric motor, not shown, via a rotary shaft 11. By rotating the electric motor, the control unit 140 rotates the substrate holder 10, and with it the held substrate W, in a horizontal plane.
The discharge nozzle 12 is a hollow cylindrical body connected to a processing liquid supply source 15 via a supply pipe 18. The discharge nozzle 12 is transparent to the extent that its internal flow path 14 can be seen from the outside. A discharge port 13, which is the front-end opening of the internal flow path 14, is formed in the lower surface of the discharge nozzle 12, and the base-end opening of the internal flow path 14 is connected to the supply pipe 18.
A supply valve 16 and a suck-back valve 17 are provided in the supply pipe 18. When the supply valve 16 is opened, the processing liquid from the processing liquid supply source 15 is supplied to the discharge nozzle 12 through the supply pipe 18, and the discharge nozzle 12 supplies it onto the substrate W. The processing liquid is, for example, a resist liquid, a coating liquid for film formation, a developing liquid, or a cleaning liquid (also referred to as a rinse liquid). The coating process is performed by opening the supply valve 16 and discharging the processing liquid from the discharge port 13 of the discharge nozzle 12 onto the substrate W while the substrate holder 10 is rotated by the electric motor.
When the discharge of the processing liquid is stopped, the suck-back valve 17 draws the processing liquid in the internal flow path 14 of the discharge nozzle 12 away from the discharge port 13, separating the distal end surface of the processing liquid from the discharge port 13. This prevents the processing liquid from falling from the discharge port 13 of the discharge nozzle 12 under gravity (so-called dripping).
The discharge nozzle 12 is movable between a processing position and a standby position by a nozzle moving mechanism, not shown. The processing position is the position at which the discharge nozzle 12 discharges the processing liquid onto the substrate W; for example, a position facing the center of the substrate W held by the substrate holder 10 from vertically above. The standby position is, for example, a position not opposed to the substrate W in the vertical direction. The nozzle moving mechanism has, for example, a ball screw structure.
The camera 35 is a so-called two-dimensional camera including a CCD (Charge-Coupled Device). The camera 35 captures images at a specific time interval (frame rate), sequentially acquires image data, and transmits the image data to the image processing unit 30. The camera 35 is mounted at a specific position in the processing unit 1 via a support 37 so as to face the main surface of the substrate W and the discharge port 13 and to be able to capture an imaging area including the discharge port 13 of the discharge nozzle 12.
In the example of fig. 2, an illumination 36 is provided in the processing unit 1. The illumination 36 is a light source, for example a light-emitting diode, and is mounted at a specific position in the processing unit 1, for example near the camera 35, via a support 38 so as to irradiate the imaging area.
The control unit 140 controls the various components of the processing unit 1. Specifically, the control unit 140 controls the supply valve 16, the suck-back valve 17, the electric motor, and the nozzle moving mechanism. The control unit 140 can also control the camera 35; for example, the control unit 140 outputs a shooting instruction to the camera 35, and the camera 35 performs shooting in accordance with the instruction.
The control unit 140 is an electronic circuit and may include, for example, a data processing device 141 and a storage medium 142. The data processing device 141 may be an arithmetic processing device such as a CPU (Central Processing Unit). The storage medium 142 may include a non-transitory storage medium (e.g., a ROM (Read-Only Memory) or a hard disk) and a transitory storage medium (e.g., a RAM (Random Access Memory)). The non-transitory storage medium may store, for example, a program that defines the processing executed by the control unit 140; by executing this program in the data processing device 141, the control unit 140 can execute the processing specified by the program. Of course, part or all of the processing executed by the control unit 140 may be executed by hardware.
The image processing unit 30 receives the image data digitized by the camera 35 from the camera 35 and performs image processing on it. In the example of fig. 2, the image processing unit 30 includes a processor 31, a 1st processing memory 32, and a 2nd processing memory 33. The image data digitized by the camera 35 or processed by the processor 31 is stored in the 1st processing memory 32 and the 2nd processing memory 33.
The processor 31 is an electronic circuit. The processor 31 performs image processing on the image data stored in the 1st processing memory 32 and the 2nd processing memory 33 to detect the gas-liquid interface between the processing liquid and the gas in the internal flow path 14 of the discharge nozzle 12. This is described in detail below.
In the example of fig. 2, the control unit 140 and the image processing unit 30 are provided separately. However, the image processing function of the image processing section 30 may be incorporated in the control section 140. In this case, the control unit 140 functions as the image processing unit 30.
< Operation of the Processing Unit >
Fig. 3 is a flowchart showing an example of the operation of the processing unit 1. First, the main conveyance mechanism T1 or the main conveyance mechanism T2 carries the substrate W into the processing unit 1, and the substrate holder 10 holds the carried-in substrate W (step S1).
Next, the processing unit 1 executes the coating process (step S2) and the monitoring process (step S3) in parallel. The coating process is a process of applying the processing liquid to the upper surface of the substrate W. The monitoring process is a process of monitoring the state of the processing liquid in the internal flow path 14 of the discharge nozzle 12.
< Coating Process >
In the coating process, first, the discharge nozzle 12 is moved from the standby position to the processing position by the nozzle moving mechanism, not shown, whereby the discharge nozzle 12 enters the imaging area of the camera 35. Next, the substrate holder 10 and the substrate W are rotated by the electric motor, not shown, and the supply valve 16 is opened, so that the processing liquid is discharged from the discharge nozzle 12 toward the surface of the rotating substrate W. The processing liquid discharged onto the surface of the substrate W spreads over the surface as the substrate W rotates and is thrown off from the peripheral edge of the substrate W.
When, for example, a certain period of time has elapsed after the supply valve 16 is opened, the supply valve 16 is closed, ending the discharge of the processing liquid from the discharge nozzle 12. The electric motor then stops the rotation of the substrate W. A specific film (e.g., a resist film) is thereby formed on the upper surface of the substrate W. In the above manner, the processing unit 1 performs the coating process on the substrate W.
< Monitoring Process >
In the monitoring process, the camera 35 captures an image of the imaging region to sequentially acquire image data, and the image processing unit 30 monitors the state of the processing liquid in the internal flow path 14 of the discharge nozzle 12 based on the image data. More specifically, the processor 31 of the image processing unit 30 detects the gas-liquid interface between the processing liquid and the gas in the internal flow path 14 of the discharge nozzle 12 based on the image data. First, a method of detecting a gas-liquid interface will be described.
Fig. 4 is a diagram schematically showing an example of the image data IM1 acquired by the camera 35. Fig. 4 shows three image data IM1[i-1] to IM1[i+1] sequentially acquired by the camera 35. The image data IM1[i-1] is the image data IM1 acquired at the earliest point in time, and the image data IM1[i+1] is the image data IM1 acquired at the latest point in time. In the example of fig. 4, image data IM1 before and after the start of the flow of the processing liquid are shown. Since the discharge nozzle 12 is transparent, the processing liquid in the internal flow path 14 also appears in each image data IM1.
In the example of fig. 4, an aging element 19 is present on the surface of the discharge nozzle 12. The aging element 19 is an element indicating an aging change in the appearance of the discharge nozzle 12, such as a scratch on the surface of the discharge nozzle 12 or a stain adhering to the surface. The aging element 19 is commonly included in all of the image data IM1[i-1] to IM1[i+1].
The image data IM1[i-1] is image data IM1 acquired in the closed state in which the supply valve 16 has been closed. In this closed state, the processing liquid is stationary in the internal flow path 14 of the discharge nozzle 12, and the distal end surface of the processing liquid is positioned above the discharge port 13. The position of the distal end surface when the processing liquid is at rest is also referred to as the suck-back position. As long as the suck-back position is within an appropriate range, the processing liquid can be properly discharged from the discharge nozzle 12. The distal end surface of the processing liquid is an example of the gas-liquid interface between the processing liquid and the gas.
The image data IM1[i] is also image data IM1 acquired in the closed state in which the supply valve 16 has been closed. Since the position of the distal end surface hardly changes while the processing liquid is at rest, the distribution of the processing liquid and the gas in the internal flow path 14 in the image data IM1[i] is the same as that in the image data IM1[i-1]. Ideally, the image data IM1[i-1] and IM1[i] coincide with each other.
The image data IM1[i+1] is the image data IM1 acquired immediately after the supply valve 16 is opened. When the supply valve 16 starts to open, the processing liquid starts to move toward the discharge port 13 in the internal flow path 14 of the discharge nozzle 12. In the image data IM1[i+1], the processing liquid is being discharged from the discharge port 13 of the discharge nozzle 12. At this point, the internal flow path 14 of the discharge nozzle 12 is filled with the processing liquid, and the distal end surface of the processing liquid no longer exists in the internal flow path 14; that is, the volume ratio occupied by the gas in the internal flow path 14 is zero. The distribution state in the internal flow path 14 in the image data IM1[i+1] therefore differs from that in the image data IM1[i-1] and IM1[i].
As described above, in the image data IM1[i+1] illustrated in fig. 4, the processing liquid is discharged from the discharge port 13 of the discharge nozzle 12. Accordingly, as long as the processing liquid is being properly discharged, the distribution state in the image data IM1 acquired by the camera 35 is the same as in the image data IM1[i+1].
Next, consider the difference between the image data IM1[i] and IM1[i+1], which have different distribution states. Ideally, the image data IM1[i] and the image data IM1[i+1] differ only in the region R1 described below and coincide with each other in the other regions. The region R1 is a rectangular region between the distal end surface of the processing liquid in the image data IM1[i] and the lower end of the processing liquid in the image data IM1[i+1] (the lower end of the image data IM1 in the drawing). Whereas the gas occupies the region R1 in the image data IM1[i], the processing liquid occupies the region R1 in the image data IM1[i+1]. In the example of fig. 4, the region R1 is drawn slightly smaller to keep the drawing simple.
Image data representing this region R1 can be obtained from the difference between the image data IM1[i] and the image data IM1[i+1]. Ideally, the regions other than the region R1 coincide with each other in the image data IM1[i] and IM1[i+1], so that the difference eliminates all information other than the region R1 and yields image data containing only the region R1. The upper end of the region R1 represents the distal end surface of the processing liquid in the image data IM1[i]. Since the information other than the region R1 has been eliminated, the distal end surface of the processing liquid can easily be detected from this image data.
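The idea of isolating the region R1 can be sketched as a per-pixel difference of two frames: everything that is unchanged between them cancels, and only the region where gas was replaced by liquid survives. The following is a minimal illustration with hypothetical toy frames, not the apparatus's actual processing:

```python
def frame_difference(prev, curr):
    """Return |curr - prev| per pixel for two equal-sized grayscale frames."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

# Toy 4x3 frames: the flow path in rows 2-3 darkens when the liquid arrives.
im1_i  = [[200, 200, 200],
          [200,  80, 200],   # liquid already present here
          [200, 240, 200],   # gas (bright) in the flow path
          [200, 240, 200]]
im1_i1 = [[200, 200, 200],
          [200,  80, 200],
          [200,  80, 200],   # liquid has replaced the gas
          [200,  80, 200]]

diff = frame_difference(im1_i, im1_i1)
# Rows 0-1 are unchanged (all zero); rows 2-3 are nonzero only inside R1.
```

The nonzero band of `diff` marks the region R1, whose upper boundary is the distal end surface of the processing liquid in the earlier frame.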
Therefore, the processor 31 of the image processing unit 30 detects the gas-liquid interface (for example, the distal end surface of the processing liquid) in the internal flow path 14 based on the difference between two pieces of image data IM1 having different distribution states. A specific example of the monitoring process is described below.
Fig. 5 is a flowchart showing a specific example of the monitoring process (step S3 in fig. 3). The monitoring process is performed in parallel with the coating process (step S2 in fig. 3). First, the camera 35 captures the imaging area to acquire image data IM1[n] and transmits the image data IM1[n] to the image processing unit 30 (step S31). The initial value of n is, for example, 0. As described below, step S31 is executed repeatedly, so the camera 35 captures images sequentially throughout the entire period (processing period) in which the coating process is performed.
At the beginning of the processing period, the discharge nozzle 12 has not yet moved to the processing position; the discharge nozzle 12 is therefore not located in the imaging area and is not yet included in the image data IM1[n] acquired by the camera 35.
The processor 31 of the image processing unit 30 confirms the position of the discharge nozzle 12 by image processing (shape recognition processing) on the image data IM1[n] (step S32). Specifically, the processor 31 determines whether the discharge nozzle 12 is located at the processing position. This determination is made, for example, by pattern matching. For example, a reference image obtained by capturing in advance the discharge nozzle 12 located at the processing position is stored in a storage medium (for example, the 1st processing memory 32). The processor 31 then specifies the position of the discharge nozzle 12 by pattern matching the image data IM1[n] against the reference image, and determines whether the discharge nozzle 12 is located at the processing position. If the discharge nozzle 12 is not located at the processing position, the processor 31 skips the processing of steps S33 to S35 described later and determines whether to end the monitoring process (step S36). Since the monitoring process has not yet ended at this point, step S31 is executed again after the value n is incremented.
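The pattern matching step is only named, not specified, in the description above. As one hedged sketch, a reference image (template) can be located in a frame by exhaustively searching for the position with the minimum sum of absolute differences; an actual implementation might instead use normalized cross-correlation. The image and template values below are hypothetical:

```python
def match_template(image, template):
    """Return (row, col) of the best match by sum of absolute differences."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos

# Hypothetical reference image of the nozzle and a frame containing it.
template = [[0, 0],
            [0, 9]]
image = [[5, 5, 5, 5],
         [5, 0, 0, 5],
         [5, 0, 9, 5]]
# The template matches exactly at row 1, column 1.
```

Whether the nozzle is "at the processing position" then reduces to comparing the matched position (and match score) against the expected position.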
When the discharge nozzle 12 has moved to the processing position, the discharge nozzle 12 is included in a specific region of the image data IM1[n], and the processor 31 determines from the image data IM1[n] that the discharge nozzle 12 is located at the processing position. When it is determined that the discharge nozzle 12 is located at the processing position, the processor 31 detects the gas-liquid interface in the internal flow path 14 of the discharge nozzle 12 based on the difference between two pieces of image data IM1 (steps S33 and S34).
While the supply valve 16 remains closed, the image data IM1[n] sequentially acquired by the camera 35 should ideally be identical to each other (see, for example, the image data IM1[i-1] and IM1[i] of fig. 4). Accordingly, the pixel values of all pixels of the image data obtained from the difference between such image data IM1 are zero, and the information on the distal end surface of the processing liquid also disappears. Therefore, in the gas-liquid interface detection processing of the present embodiment (steps S33 and S34), the position of the gas-liquid interface is not detected while the supply valve 16 remains closed.
Here, for ease of understanding, the processing is described taking as an example the image data IM1[n] acquired immediately after the supply valve 16 starts to open; that is, the case where the image data IM1[i+1] of fig. 4 is acquired as the image data IM1[n].
The processor 31 performs preprocessing on the image data IM1[n] (step S33). First, the processor 31 performs an edge extraction process such as Canny on the image data IM1[n] to acquire an edge image IM2[n]. Fig. 6 is a diagram schematically showing an example of the edge image IM2[n]. Here, the pixel value of a pixel representing an edge is "1", and the pixel value of a pixel not representing an edge is "0".
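The Canny algorithm itself involves several stages; as a hedged stand-in that produces the same kind of binary edge image ("1" for edge pixels, "0" otherwise), a simple gradient-threshold detector can be sketched. The threshold and toy frame values are assumptions for illustration:

```python
def extract_edges(img, threshold=50):
    """Binary edge image: 1 where the horizontal or vertical intensity
    gradient exceeds `threshold`, else 0 (a crude stand-in for Canny)."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            gx = img[r][min(c + 1, w - 1)] - img[r][c]
            gy = img[min(r + 1, h - 1)][c] - img[r][c]
            if abs(gx) > threshold or abs(gy) > threshold:
                edges[r][c] = 1
    return edges

frame = [[200, 200, 200],
         [200, 200, 200],
         [ 40,  40,  40]]   # dark liquid below a bright background
edges = extract_edges(frame)
# Row 1 is marked: its vertical gradient toward row 2 crosses the threshold.
```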
The processor 31 stores the edge image IM2[n] in the 1st processing memory 32. The 2nd processing memory 33 stores the edge image IM2[n-1] acquired in the immediately preceding (i.e., previous) execution of step S33. The edge image IM2[n-1] is image data obtained by performing the edge extraction process on the immediately preceding image data IM1[n-1]. The image data IM1[n-1] is the image data IM1 acquired last while the supply valve 16 was closed, and corresponds to the image data IM1[i] of fig. 4.
Next, the processor 31 performs an edge dilation process on the edge image IM2[n] to acquire an edge dilation image IM3[n]. The edge dilation process enlarges the width of the edges in the edge image IM2[n], for example by expanding each edge by a certain width in the vertical and horizontal directions around each pixel representing the edge. Fig. 7 is a diagram schematically showing an example of the edge dilation image IM3[n]. As can be understood from a comparison of figs. 6 and 7, the width of the edges in the edge dilation image IM3[n] is larger than the width of the edges in the edge image IM2[n]; hereinafter, an edge in the edge dilation image IM3[n] is also referred to as a wide edge. The processor 31 stores the edge dilation image IM3[n] in the 1st processing memory 32.
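The edge expansion step can be sketched as a binary dilation that grows each edge pixel by a fixed width around it (here a square neighborhood, which covers both the vertical and horizontal directions). The radius and toy image are assumptions for illustration:

```python
def dilate(edges, radius=1):
    """Expand each edge pixel by `radius` pixels in each direction
    (a square neighborhood), producing the 'wide edge' image."""
    h, w = len(edges), len(edges[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if edges[r][c]:
                for i in range(max(0, r - radius), min(h, r + radius + 1)):
                    for j in range(max(0, c - radius), min(w, c + radius + 1)):
                        out[i][j] = 1
    return out

edge_im = [[0, 0, 0, 0],
           [0, 1, 0, 0],
           [0, 0, 0, 0]]
wide = dilate(edge_im)
# The single edge pixel grows into a 3x3 block of 1s.
```

The dilation radius controls how much positional deviation of the nozzle between frames the later difference processing can absorb.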
Next, the processor 31 performs difference processing using the immediately preceding edge image IM2[n-1] and the current edge dilation image IM3[n] to detect the gas-liquid interface (for example, the distal end surface of the processing liquid) (step S34). Fig. 8 is a flowchart showing an example of a specific operation of the gas-liquid interface detection process (step S34). The processor 31 performs the difference processing using the edge image IM2[n-1] and the edge dilation image IM3[n] to acquire a difference image IM4[n] (step S341). Fig. 9 is a diagram for explaining the difference processing. The difference processing subtracts, pixel by pixel, the values at the same position in the two images. Here, the processor 31 performs the difference processing by subtracting the edge dilation image IM3[n] from the edge image IM2[n-1].
As illustrated in fig. 9, the edge image IM2[n-1] includes an edge E1 representing the discharge nozzle 12, an edge E2 representing the aging element 19, and an edge E3 representing the distal end surface of the processing liquid. On the other hand, the edge dilation image IM3[n] includes a wide edge WE1 corresponding to the discharge nozzle 12, a wide edge WE2 corresponding to the aging element 19, and a wide edge WE4 corresponding to the outline of the processing liquid discharged from the discharge port 13.
The edge E1 and the wide edge WE1 represent the same object (specifically, the discharge nozzle 12), and the edge E1 is thinner than the wide edge WE1. Thus, even if the position of the discharge nozzle 12 is slightly shifted between the edge image IM2[n-1] and the edge dilation image IM3[n], the edge E1 is covered by the wide edge WE1 when the two images are superimposed. In other words, the pixel of the edge dilation image IM3[n] at the same position as each pixel representing the edge E1 is a pixel representing the wide edge WE1.
Similarly, the edge E2 and the wide edge WE2 represent the same object (specifically, the aging element 19), and the edge E2 is thinner than the wide edge WE2. Thus, even if the position of the discharge nozzle 12 is slightly shifted, the edge E2 is covered by the wide edge WE2 when the edge image IM2[n-1] and the edge dilation image IM3[n] are superimposed.
Therefore, the edges E1 and E2 are eliminated in the difference image IM4[n] by the difference processing. That is, the pixel value of each pixel corresponding to the edges E1 and E2 in the difference image IM4[n] is "0" (= 1 - 1).
On the other hand, the regions of the wide edges WE1 and WE2 that do not overlap the edges E1 and E2 (hereinafter referred to as edges SE1 and SE2) remain in the difference image IM4[n]. However, by the difference processing, the pixel values of the pixels of the edges SE1 and SE2 become "-1" (= 0 - 1). In the example of fig. 9, pixels having a pixel value of "-1" are indicated by oblique hatching.
An edge corresponding to the wide edge WE4 of the edge dilation image IM3[n] does not exist in the edge image IM2[n-1]; the wide edge WE4 therefore remains in the difference image IM4[n]. However, by the difference processing, the pixel value of each pixel of the wide edge WE4 also becomes "-1" (= 0 - 1).
A wide edge corresponding to the edge E3 of the edge image IM2[n-1] does not exist in the edge dilation image IM3[n]; the edge E3 therefore remains in the difference image IM4[n]. The pixel value of each pixel of the edge E3 in the difference image IM4[n] is still "1" (= 1 - 0).
As described above, the difference image IM4[n] includes the edge E3 representing the distal end surface of the processing liquid in the edge image IM2[n-1], and the edges SE1, SE2, and WE4 originating from the edge dilation image IM3[n]. In the difference image IM4[n], the pixel value of each pixel of the edges SE1, SE2, and WE4 is "-1", the pixel value of each pixel of the edge E3 is "1", and the pixel values of the other pixels are "0".
The processor 31 detects the gas-liquid interface in the image data IM1[n-1] based on the position of the edge E3 in the difference image IM4[n]. As a specific example, the processor 31 cuts out from the difference image IM4[n] the edges SE1, SE2, and WE4 originating from the edge dilation image IM3[n] (step S342). In short, the processor 31 cuts out the pixels whose pixel values are negative ("-1"), for example by replacing their pixel values with "0". Only the edge E3 then remains in the difference image IM4[n] after the cut-out (cut-out image), and the pixel value of each of its pixels is "1". The processor 31 therefore finds the position of the gas-liquid interface based on the positions of the pixels having the positive pixel value ("1") (step S343).
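Steps S341 to S343 can be sketched end to end: subtract the edge expansion image from the previous edge image, cut out the negative pixels, and locate the topmost remaining edge pixel. The toy edge images below (nozzle walls in the outer columns, interface in row 1) are hypothetical:

```python
def detect_interface(prev_edges, curr_wide):
    """Subtract the current wide-edge image from the previous edge image,
    discard negative pixels, and return the row of the topmost surviving
    edge pixel (the gas-liquid interface), or None if no pixel survives."""
    h, w = len(prev_edges), len(prev_edges[0])
    diff = [[prev_edges[r][c] - curr_wide[r][c] for c in range(w)]
            for r in range(h)]
    for r in range(h):                  # scan from the top of the image
        for c in range(w):
            if diff[r][c] > 0:          # pixels left after cutting out "-1"
                return r
    return None

# Previous frame: nozzle walls (cols 0, 3) plus the interface edge in row 1.
prev_edges = [[1, 0, 0, 1],
              [1, 1, 1, 1],   # row 1 contains the front end face
              [1, 0, 0, 1]]
# Current frame: wide nozzle walls and a liquid outline along the bottom;
# the stationary interface edge has disappeared because the liquid now flows.
curr_wide  = [[1, 0, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 1]]
row = detect_interface(prev_edges, curr_wide)
# The walls cancel, the bottom outline becomes negative and is cut out,
# and only the interior interface pixels of row 1 survive.
```

The surviving row index can then be converted into a physical suck-back height relative to the discharge port.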
The processor 31 may detect the position of the gas-liquid interface with reference to, for example, the position of the discharge port 13 of the discharge nozzle 12. The position of the discharge port 13 may be set in advance and stored in a storage medium (for example, the 1st processing memory 32). Alternatively, the processor 31 may detect the position of the discharge port 13 based on the image data IM1[n-1]; for example, by pattern matching the image data IM1[n-1] against the reference image.
As described above, when the camera 35 acquires the image data IM1[n] immediately after the supply valve 16 starts to open, the processor 31 detects the position of the gas-liquid interface in the immediately preceding image data IM1[n-1] (steps S32 to S34). This immediately preceding image data IM1[n-1] is the image data IM1 acquired last while the supply valve 16 was closed, and the position of its gas-liquid interface indicates the suck-back position. That is, the position of the gas-liquid interface first detected in the present monitoring process (step S3) indicates the suck-back position.
Next, the processor 31 performs an abnormality determination based on the detected gas-liquid interface (step S35). Specifically, the processor 31 takes the position of the gas-liquid interface first detected in the present monitoring process as the suck-back position and determines whether the suck-back position is within a reference range. The reference range is set in advance and stored in a storage medium (for example, the 1st processing memory 32).
When it is determined that the suck-back position is outside the reference range, the processor 31 determines that an abnormality has occurred and sends the determination result to the control unit 140. The control unit 140 then causes, for example, the notification unit 40 to report the abnormality. The notification unit 40 is a device such as a display or a speaker. Through this notification, the operator can know that an abnormality has occurred and take appropriate measures.
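The abnormality determination of step S35 reduces to a range check on the suck-back position. The following sketch assumes the suck-back position is measured in pixels above the discharge port (image rows increase downward); all numbers are hypothetical:

```python
def check_suck_back(interface_row, discharge_port_row, ref_min, ref_max):
    """Judge the suck-back position (height of the interface above the
    discharge port, in pixels) against a preset reference range.
    Returns True when normal, False when an abnormality should be reported."""
    suck_back = discharge_port_row - interface_row
    return ref_min <= suck_back <= ref_max

# Hypothetical values: port at row 120, acceptable height 10-40 pixels.
ok = check_suck_back(95, 120, ref_min=10, ref_max=40)    # 25 px: normal
bad = check_suck_back(40, 120, ref_min=10, ref_max=40)   # 80 px: abnormal
```

A False result would be forwarded to the control unit, which drives the notification unit (display or speaker).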
Next, the processor 31 determines whether or not the monitoring process should be ended (step S36). For example, when the coating process (step S2) ends, the processor 31 determines that the monitoring process should be ended, and ends it. When it determines that the monitoring process should not be ended, the processor 31 stores the edge image IM2[ n ] in the 2nd processing memory 33. Then, after the value n is incremented, the imaging by the camera 35 in step S1 is performed again. The above operation is repeated until the coating process is completed.
As described above, the processor 31 detects the position of the gas-liquid interface in the internal flow path 14 of the discharge nozzle 12 based on the image data IM1 acquired by the camera 35, and can therefore detect the position of the gas-liquid interface with high accuracy.
In the above example, the processor 31 detects the front end surface of the treatment liquid in the image data IM1[ n-1 ] using the image data IM1[ n-1 ] when the treatment liquid is stationary and the image data IM1[ n ] when the treatment liquid is flowing. Thereby, the suck-back position can be detected.
In the above example, the processor 31 detects the position of the gas-liquid interface based on the difference between the edge image IM2[ n-1 ] and the edge expansion image IM3[ n ]. Thus, even if the position of the discharge nozzle 12 shifts between the image data IM1, the edges other than the edge E3 in the edge image IM2[ n-1 ] can be eliminated in the difference image IM4[ n ]. Further, the processor 31 can leave only the edge E3 in the difference image IM4[ n ] by cutting out the edges SE1, SE2, WE4 originating from the edge expansion image IM3[ n ]. The edge E3 is the edge indicating the front end surface (gas-liquid interface) of the processing liquid in the edge image IM2[ n-1 ]. Therefore, the processor 31 can detect the position of the gas-liquid interface with high accuracy while eliminating the influence of the positional deviation of the discharge nozzle 12 between the image data IM1.
In addition, in the above example, the processor 31 detects the position of the gas-liquid interface based on the difference between the 2 pieces of image data IM1 acquired by the camera 35 during the processing period (that is, during the coating process). Accordingly, even if an aging component 19 such as dirt or damage develops on the discharge nozzle 12, the aging component 19 is commonly included in the 2 pieces of image data IM1. Thus, by taking the difference between the 2 pieces of image data IM1, the position of the gas-liquid interface can be detected while the influence of the aging component 19 is excluded. Therefore, the position of the gas-liquid interface can be detected with higher accuracy. The processing period here means a period from a state in which the processing liquid is stationary in the internal flow path 14 of the discharge nozzle 12 to a state in which the processing liquid has been discharged and is again stationary.
< air bubbles before discharge >
There are also cases as follows: when the discharge of the treatment liquid is stopped, the suck-back treatment is not performed properly, and an air bubble becomes mixed into the internal flow path 14 of the discharge nozzle 12. Fig. 10 is a diagram schematically showing an example of the image data IM1 acquired by the camera 35. This image data IM1 is acquired in the closed state in which the supply valve 16 is closed. In the example of fig. 10, the air bubble B1 is mixed into the internal flow path 14 of the discharge nozzle 12. That is, the suck-back process was not performed properly in the previous coating process, and the air bubble B1 became mixed into the internal flow path 14. The bubble B1 forms inside the processing liquid and, in the example of fig. 10, separates the processing liquid into upper and lower portions. In this case, there are 3 gas-liquid interfaces in the internal flow path 14: a gas-liquid interface representing the upper end face of the bubble B1, a gas-liquid interface representing the lower end face of the bubble B1, and a gas-liquid interface representing the front end face of the processing liquid.
In this case, the processor 31 detects the positions of the 3 gas-liquid interfaces by the monitoring process (step S3). When the positions of the plurality of gas-liquid interfaces are detected in this manner, it can be said that an abnormality occurs in the state of the processing liquid in the internal flow path 14 of the discharge nozzle 12. Therefore, in the abnormality determination (step S35), the processor 31 determines whether or not the positions of the plurality of gas-liquid interfaces are detected, and determines that an abnormality has occurred when the positions of the plurality of gas-liquid interfaces are detected. At this time, the processor 31 notifies the control unit 140 of the occurrence of the abnormality. The control unit 140 causes the notification unit 40 to notify the abnormality, for example, in response to the notification.
< detection of air bubbles during discharge >
In the above example, the bubble B1 before the ejection of the processing liquid is described. However, bubbles may be mixed into the supply pipe 18 during the process of discharging the processing liquid. The bubbles move together with the processing liquid in the supply pipe 18 and enter the internal flow path 14 of the discharge nozzle 12. The bubbles move toward the discharge port 13 in the internal flow path 14 of the discharge nozzle 12, and are discharged from the discharge port 13 to the outside. When the bubbles are released, the processing liquid may be splashed all around.
Fig. 11 is a diagram schematically showing an example of the image data IM1 acquired by the camera 35. Fig. 11 shows image data IM1 acquired by the camera 35 during the period (discharge period) in which the treatment liquid is discharged from the discharge port 13 of the discharge nozzle 12. In fig. 11, 3 pieces of image data IM1[ j-1 ] to IM1[ j+1 ] sequentially acquired by the camera 35 are shown side by side. The image data IM1[ j-1 ] is the image data IM1 acquired at the earliest point in time, and the image data IM1[ j+1 ] is the image data IM1 acquired at the latest point in time. In the example of fig. 11, image data IM1 during the discharge of the treatment liquid is shown.
In the image data IM1[ j-1 ], the internal flow path 14 of the discharge nozzle 12 is filled with the treatment liquid and does not yet include the air bubble B2. However, the bubble B2 has been generated on the upstream side of the internal flow path 14. The bubble B2 moves toward the discharge port 13 with the passage of time. In the image data IM1[ j ], the internal flow path 14 includes the bubble B2. The bubble B2 moves toward the discharge port 13 with the passage of time and is released from the discharge port 13 to the outside. In the example of fig. 11, in the image data IM1[ j+1 ], the bubble B2 has been released to the outside, and the internal flow path 14 is filled with the processing liquid again.
In the example of fig. 11, the distribution state of the treatment liquid and the gas in the internal flow path 14 in the image data IM1[ j ] differs from that in the image data IM1[ j-1 ] and IM1[ j+1 ]. Further, the image data IM1[ j+1 ] ideally coincides with the image data IM1[ j-1 ].
According to the monitoring process (step S3), when the camera 35 acquires the image data IM1[ j ], the processor 31 detects the position of the gas-liquid interface in the image data IM1[ j-1 ] based on the immediately preceding acquired image data IM1[ j-1 ] and the current image data IM1[ j ] (steps S33, S34). However, since the image data IM1[ j-1 ] does not include the bubble B2, there is no gas-liquid interface. Thus, at this point in time, the position of the gas-liquid interface is not detected.
Then, when the camera 35 acquires the image data IM1[ j +1], the processor 31 similarly detects the position of the gas-liquid interface in the image data IM1[ j ] based on the image data IM1[ j ], IM1[ j +1] (steps S33, S34). Since the image data IM1[ j ] includes the bubble B2, the processor 31 detects the position of the outline surface of the bubble B2 (the position of the gas-liquid interface) in the image data IM1[ j ].
As described above, according to the monitoring process, the bubble B2 generated in the internal flow path 14 of the discharge nozzle 12 during the discharge of the processing liquid can be detected.
However, in the above specific example, the processor 31 detects not the position of the gas-liquid interface in the current image data IM1 but the position of the gas-liquid interface in the immediately preceding image data IM1. Thus, when the processor 31 acquires the image data IM1[ j ], although the image data IM1[ j ] includes the bubble B2, the image data IM1[ j-1 ] does not, and therefore the position of the gas-liquid interface of the bubble B2 cannot be detected. The position of the gas-liquid interface of the bubble B2 is detected only when the processor 31 acquires the image data IM1[ j+1 ], as described above. Thus, the detection of the bubble B2 is delayed.
Therefore, after the treatment liquid starts to be discharged, the processor 31 detects the position of the gas-liquid interface in the current image data IM1[ n ] instead of the immediately preceding image data IM1[ n-1 ]. Fig. 12 is a diagram for explaining this difference processing. As shown in fig. 12, the processor 31 performs difference processing using the edge expansion image IM3[ n-1 ] based on the immediately preceding image data IM1[ n-1 ] and the edge image IM2[ n ] based on the current image data IM1[ n ].
The edge image IM2[ n ] is an image obtained by performing edge extraction processing on the current image data IM1[ n ]. Here, the image data IM1[ n ] is the image data IM1[ j ]. In other words, fig. 12 shows the difference processing performed when the camera 35 acquires the image data IM1[ j ]. The edge image IM2[ n ] is stored in the 1st processing memory 32, for example.
The edge expansion image IM3[ n-1 ] is an image obtained by sequentially performing edge extraction processing and edge expansion processing on the immediately preceding image data IM1[ n-1 ]. Here, the image data IM1[ n-1 ] is the image data IM1[ j-1 ]. The immediately preceding edge expansion image IM3[ n-1 ] was stored in the 2nd processing memory 33 in the preceding step S33, for example.
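The two preprocessing stages named here, edge extraction and edge expansion (dilation), can be sketched as follows. This is only an illustration under assumed conventions (binary edge maps, a simple gradient threshold, a square dilation window); the patent does not specify the operators:

```python
import numpy as np

def edge_image(img, thresh=0.5):
    """Edge extraction: mark pixels where a simple forward-difference
    gradient magnitude exceeds a threshold (binary edge map)."""
    gy = np.abs(np.diff(img, axis=0, append=img[-1:, :]))
    gx = np.abs(np.diff(img, axis=1, append=img[:, -1:]))
    return ((gx + gy) > thresh).astype(np.int8)

def dilate(edges, r=1):
    """Edge expansion: widen every edge pixel into a (2r+1)x(2r+1)
    block by OR-ing shifted copies of the zero-padded map."""
    h, w = edges.shape
    padded = np.pad(edges, r)
    out = np.zeros_like(edges)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out |= padded[dy:dy + h, dx:dx + w]
    return out

# Toy image: a vertical brightness step yields one column of edge pixels,
# which dilation widens into a 3-pixel-wide band.
step = np.zeros((5, 5)); step[:, 3:] = 1.0
e = edge_image(step)
d = dilate(e)
```

The dilation radius `r` controls how much nozzle-position jitter between frames the later difference step can absorb.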
In the example of fig. 12, the edge expansion image IM3[ n-1 ] includes a wide edge WE1 corresponding to the discharge nozzle 12, a wide edge WE2 corresponding to the aging component 19, and a wide edge WE4 corresponding to the outline of the processing liquid discharged from the discharge port 13. The edge image IM2[ n ] includes an edge E1 indicating the discharge nozzle 12, an edge E2 indicating the aging component 19, an edge E4 indicating the outline of the processing liquid discharged from the discharge port 13, and edges E5 indicating the upper end surface and the lower end surface of the bubble B2, respectively.
Edges E1, E2, and E4 in the edge image IM2[ n ] represent the same objects as the wide edges WE1, WE2, and WE4 in the edge expansion image IM3[ n-1 ], respectively, and are therefore eliminated in the difference image IM4[ n ] by the difference processing. That is, in the difference image IM4[ n ], the pixel values of the pixels of the edges E1, E2, and E4 become "0" (= 1 - 1).
On the other hand, the regions of the wide edges WE1, WE2, and WE4 that do not overlap with the edges E1, E2, and E4 (hereinafter referred to as edges SE1, SE2, and SE4, respectively) remain in the difference image IM4[ n ]. The pixel value of each pixel of the edges SE1, SE2, and SE4 is "1" (= 1 - 0).
A wide edge corresponding to the edge E5 in the edge image IM2[ n ] does not exist in the edge expansion image IM3[ n-1 ], and therefore the edge E5 remains in the difference image IM4[ n ]. However, the pixel value of each pixel of the edge E5 is "-1" (= 0 - 1).
As described above, the difference image IM4[ n ] includes the edge E5 indicating the upper end face and the lower end face of the bubble B2 in the edge image IM2[ n ], and the edges SE1, SE2, and SE4 originating from the edge expansion image IM3[ n-1 ]. In the difference image IM4[ n ], the pixel value of each pixel of the edges SE1, SE2, and SE4 is "1", the pixel value of each pixel of the edge E5 is "-1", and the pixel values of the other pixels are "0".
The processor 31 detects the gas-liquid interface of the treatment liquid in the image data IM1[ n ] based on the edge E5 in the difference image IM4[ n ]. Here, the position of the gas-liquid interface represents the outline of the bubble B2. As an example of specific processing, the processor 31 cuts out the edges SE1, SE2, and SE4 originating from the edge expansion image IM3[ n-1 ]. That is, the processor 31 cuts out the pixels having the positive pixel value ("1"); for example, the processor 31 may replace the pixel value of such a pixel from "1" to "0". Only the edge E5 remains in the difference image IM4[ n ] after the cutout (cutout image). Then, the processor 31 may perform absolute value processing on the pixel value of each pixel of the difference image IM4[ n ] after the cutout. In other words, the processor 31 replaces the pixel value of each pixel of the edge E5 from "-1" to "1". The processor 31 then determines the position of the gas-liquid interface (the position of the outline of the bubble B2) based on the positions of the pixels having the positive pixel value ("1").
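Putting the difference, cutout, and absolute-value steps together, a minimal sketch might look like this (NumPy, with hypothetical toy arrays standing in for the edge images; reporting the interface as row indices is one possible convention, not the patent's):

```python
import numpy as np

def detect_interface_rows(edge_prev_dilated, edge_cur):
    """Difference processing for the during-discharge case (Fig. 12):
    subtract the current edge image from the dilated previous edge
    image.  Edges present in both cancel to 0; leftover dilation
    margins become +1 and are cut out; edges unique to the current
    image (e.g. a bubble contour) become -1 and survive the
    absolute-value step.  Returns the row indices containing the
    detected gas-liquid interface."""
    diff = edge_prev_dilated.astype(np.int8) - edge_cur.astype(np.int8)
    diff[diff > 0] = 0   # cut out edges originating from the dilated image
    diff = np.abs(diff)  # absolute-value processing: -1 -> 1
    return np.flatnonzero(diff.any(axis=1))

# Toy data: the previous frame has one (dilated) edge band; the current
# frame repeats that edge and adds a new bubble edge at row 6.
prev_dil = np.zeros((8, 8), dtype=np.int8); prev_dil[2:5, :] = 1
cur = np.zeros((8, 8), dtype=np.int8); cur[3, :] = 1; cur[6, :] = 1
print(detect_interface_rows(prev_dil, cur))  # [6]
```

Note how the shared edge at row 3 cancels and the dilation margins at rows 2 and 4 are discarded, leaving only the new edge at row 6.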
As described above, the processor 31 detects the gas-liquid interface in the immediately preceding image data IM1[ n-1 ] to detect the suck-back position before the discharge of the treatment liquid, and detects the gas-liquid interface position in the current image data IM1[ n ] to detect the bubble B2 more quickly after the discharge of the treatment liquid.
Fig. 13 is a flowchart showing an example of the above processing. Fig. 13 corresponds to a specific example of step S34 in fig. 5. The processor 31 determines whether or not the processing liquid is currently being discharged (step S340). For example, the control unit 140 may notify the processor 31 of information indicating that the supply valve 16 has been switched from the closed state to the open state, and the processor 31 determines whether or not the processing liquid is currently being discharged based on this information.
Alternatively, the processor 31 may determine whether or not the processing liquid is being discharged based on the image data IM1, that is, by image processing of the image data IM1. For example, the processor 31 determines whether or not the treatment liquid is being discharged based on the pixel values of a region of the image data IM1 located below the discharge port 13 of the discharge nozzle 12 (hereinafter referred to as a discharge region). Since the pixel values of the discharge region while the processing liquid is being discharged differ from those while it is not, the processor 31 can determine whether or not the discharge is in progress based on the pixel values of the discharge region. For example, while the processing liquid is being discharged, light from the illumination 36 reflected by the processing liquid may be received by the camera 35. In this case, the luminance values of the discharge region during the discharge of the processing liquid are higher than those when the processing liquid is not being discharged. Accordingly, for example, when the average of the luminance values of the discharge region is higher than a reference luminance value, the processor 31 determines that the processing liquid is being discharged. The reference luminance value is set in advance and stored in a storage medium (for example, the 1st processing memory 32).
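The luminance-based discharge check could be sketched as below. The reference value, the region coordinates, and the function name are assumptions for illustration only:

```python
import numpy as np

REFERENCE_LUMINANCE = 100.0  # assumed preset threshold (stored in advance)

def is_discharging(image, port_row, col_lo, col_hi):
    """Judge discharge from the region below the outlet: compare the
    mean luminance of the discharge region against a preset reference
    value (the discharged liquid reflects the illumination, so the
    region appears brighter during discharge)."""
    region = image[port_row:, col_lo:col_hi]
    return float(region.mean()) > REFERENCE_LUMINANCE

# Toy frames: dark below the port (no discharge) vs. bright (discharging).
dark = np.full((10, 6), 20.0)
bright = dark.copy(); bright[7:, 2:4] = 250.0
print(is_discharging(dark, 7, 2, 4), is_discharging(bright, 7, 2, 4))  # False True
```

A robust version would likely average over several frames to avoid flicker from droplets, but the threshold comparison is the core of the check.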
If the processing liquid is not being discharged (NO in step S340), the processor 31 performs steps S341 to S343 described above. Thus, before the treatment liquid is discharged, the processor 31 performs difference processing using the edge image IM2[ n-1 ] based on the immediately preceding image data IM1[ n-1 ] and the edge expansion image IM3[ n ] based on the current image data IM1[ n ] (step S341). Accordingly, when the camera 35 acquires the image data IM1[ n ] immediately after the supply valve 16 starts to open, the processor 31 can detect the gas-liquid interface, that is, the suck-back position, in the immediately preceding image data IM1[ n-1 ].
When the processing liquid is being discharged (YES in step S340), the processor 31 performs difference processing using the edge expansion image IM3[ n-1 ] and the edge image IM2[ n ] to acquire the difference image IM4[ n ] (step S344; see also fig. 12). Next, the processor 31 cuts out the edges originating from the edge expansion image IM3[ n-1 ] from the difference image IM4[ n ] (step S345). Next, the processor 31 performs absolute value processing on each pixel of the difference image IM4[ n ] after the cutout (step S346). Thus, in the difference image IM4[ n ], the pixel value of each pixel of the edge E5 is replaced from "-1" with "1". The processor 31 detects the position of the gas-liquid interface based on the pixels having a positive pixel value in the difference image IM4[ n ] (step S347).
< abnormality determination >
Before the treatment liquid is discharged, since the front end surface of the treatment liquid is present in the internal flow path 14 of the discharge nozzle 12 (see fig. 4), if the bubble B1 is not generated, one gas-liquid interface is detected. On the other hand, if the bubble B1 (see fig. 10) is generated, a plurality of gas-liquid interfaces are detected. Thus, if a plurality of gas-liquid interfaces are detected before the treatment liquid is discharged, it can be said that an abnormality occurs. Therefore, the processor 31 determines that an abnormality has occurred when a plurality of gas-liquid interfaces are detected before the processing liquid is discharged.
In contrast, while the treatment liquid is being discharged normally, the end surface of the treatment liquid does not exist in the internal flow path 14 of the discharge nozzle 12, and the internal flow path 14 is filled with the treatment liquid. If the bubble B2 (see fig. 11) is generated, however, a gas-liquid interface is detected. Thus, if a gas-liquid interface is detected during the discharge of the processing liquid, it can be said that an abnormality has occurred. Therefore, the processor 31 determines that an abnormality has occurred when a gas-liquid interface is detected during the discharge of the processing liquid.
Fig. 14 is a flowchart showing an example of the above operation. Fig. 14 corresponds to a specific example of the abnormality determination in step S35 in fig. 5. However, in the example of fig. 14, the procedure for determining the suck-back position is not shown.
The processor 31 determines whether or not the treatment liquid is discharged from the discharge port 13 of the discharge nozzle 12 (step S350). That is, the processor 31 determines whether or not the process liquid ejection process is currently in progress.
When the processing liquid is not being discharged (NO in step S350), the processor 31 determines whether or not a plurality of gas-liquid interfaces are detected (step S351). When a plurality of gas-liquid interfaces are detected, the processor 31 determines that an abnormality has occurred and notifies the control unit 140 of the occurrence of the abnormality. The control unit 140, for example, causes the notification unit 40 to notify the abnormality based on the notification (step S352). When a plurality of gas-liquid interfaces are not detected, the processor 31 does not perform step S352.
On the other hand, when the processing liquid is being discharged (YES in step S350), the processor 31 determines whether or not a gas-liquid interface is detected (step S353). When a gas-liquid interface is detected, the processor 31 determines that an abnormality has occurred and notifies the control unit 140 of the occurrence of the abnormality. The control unit 140, for example, causes the notification unit 40 to notify the abnormality based on the notification (step S354).
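The branch logic of steps S350 to S354 reduces to a small decision function. The sketch below assumes the interface count is already available from the monitoring process; the function name and signature are illustrative:

```python
def judge_abnormal(num_interfaces, discharging):
    """Abnormality determination of Fig. 14: before discharge, exactly
    one interface (the liquid front) is normal and two or more mean a
    trapped bubble; during discharge the channel should be completely
    filled, so any detected interface is abnormal."""
    if discharging:
        return num_interfaces >= 1   # step S353
    return num_interfaces >= 2       # step S351

# Normal liquid front before discharge; bubble before discharge; bubble
# passing through during discharge; normal full channel during discharge.
assert judge_abnormal(1, discharging=False) is False
assert judge_abnormal(3, discharging=False) is True
assert judge_abnormal(1, discharging=True) is True
assert judge_abnormal(0, discharging=True) is False
```

In the apparatus, a `True` result would be forwarded to the control unit 140 so that the notification unit 40 can alert the operator.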
< stoppage of discharge of treatment liquid >
Next, an operation when the discharge of the treatment liquid is stopped will be described. Fig. 15 is a diagram schematically showing an example of the image data IM1 acquired by the camera 35. FIG. 15 shows 3 image data IM1[ k-1 ] to IM1[ k +1] sequentially acquired by the camera 35. The image data IM1[ k-1 ] is the image data IM1 acquired at the earliest point in time, and the image data IM1[ k +1] is the image data IM1 acquired at the latest point in time. In the example of fig. 15, image data IM1 before and after the stop of the discharge of the treatment liquid is shown.
The image data IM1[ k-1 ] is the image data IM1 acquired in a state where the supply valve 16 is open. In the image data IM1[ k-1 ], the treatment liquid is discharged from the discharge port 13 of the discharge nozzle 12, and the internal flow path 14 thereof is filled with the treatment liquid.
The image data IM1[ k ] is the image data IM1 obtained immediately after the supply valve 16 is closed and the suck-back valve 17 performs the suction operation. In the image data IM1[ k ], the processing liquid is not discharged from the discharge port 13 of the discharge nozzle 12, but exists only in the internal flow path 14 thereof. The distal end surface of the processing liquid is located above the discharge port 13 in the internal flow path 14. Thus, the image data IM1[ k-1 ] and the image data IM1[ k ] have different distribution states of the processing liquid and the gas in the internal flow path 14.
After the supply valve 16 is closed, the processing liquid is stationary in the internal flow path 14. Thus, the position of the distal end surface of the processing liquid does not change as long as the supply valve 16 is closed. Therefore, the image data IM1[ k +1] acquired next to the image data IM1[ k ] ideally coincides with the image data IM1[ k ].
According to the operation of fig. 13, after the treatment liquid starts to be discharged, the processor 31 detects the gas-liquid interface in the current image data IM1[ n ] (steps S344 to S347). Thus, when the camera 35 acquires the image data IM1[ k ], the processor 31 performs the detection processing of the gas-liquid interface based on the difference between the immediately preceding image data IM1[ k-1 ] and the current image data IM1[ k ] (steps S344 to S347). Since the distribution states in the internal flow path 14 differ between the image data IM1[ k-1 ] and IM1[ k ], the processor 31 can detect the position of the gas-liquid interface in the image data IM1[ k ].
That is, the processor 31 acquires the edge image IM2[ k ] based on the image data IM1[ k ] when the treatment liquid is stationary, and acquires the edge expansion image IM3[ k-1 ] based on the image data IM1[ k-1 ] when the treatment liquid is flowing. Thus, the processor 31 can detect the position of the gas-liquid interface in the image data IM1[ k ], acquired when the processing liquid is stationary, by the difference processing.
Then, when the camera 35 acquires the image data IM1[ k+1 ], the processor 31 also performs the detection processing of the gas-liquid interface based on the difference between the immediately preceding image data IM1[ k ] and the current image data IM1[ k+1 ] (steps S344 to S347). However, the image data IM1[ k ] and IM1[ k+1 ] have the same distribution state in the internal flow path 14, so the processor 31 does not detect the position of the gas-liquid interface in the image data IM1[ k+1 ]. The same applies thereafter.
As described above, the position of the gas-liquid interface detected last in the monitoring process can be said to represent the suck-back position after the discharge of the processing liquid is stopped. Therefore, the processor 31 may also specify the last detected position of the gas-liquid interface as the suck-back position. The processor 31 may determine whether or not the suck-back position is within a reference range.
In addition, according to the abnormality determination in fig. 14, since the processing liquid is not discharged at the time point when the image data IM1[ k ] is acquired, the processing in steps S351 and S352 is performed. Accordingly, when the bubble B1 is mixed due to the fact that the suck-back process is not appropriately performed when the discharge of the processing liquid is stopped, the processor 31 detects a plurality of gas-liquid interfaces, and thus determines that an abnormality has occurred.
The embodiments have been described above, but the substrate processing apparatus can be variously modified from the above without departing from the gist thereof. Within the scope of the disclosure, the embodiments can be freely combined, and arbitrary components of the embodiments can be changed or omitted.
For example, the process unit 1 is not necessarily limited to the coating process unit 123 or the developing process unit DEV. It is sufficient that the processing unit 1 has a transparent discharge nozzle 12 that discharges the processing liquid onto the substrate W.
In addition, in the above example, the processor 31 detects the gas-liquid interface based on the difference between the 2 pieces of image data IM1 acquired by the camera 35 during the processing period (during the coating process). However, a reference image captured by the camera 35 in advance may instead be stored in the 2nd processing memory 33. As the reference image, for example, image data IM1 during normal discharge of the processing liquid (for example, the image data IM1[ i+1 ] of fig. 4) can be used. The processor 31 may then detect the gas-liquid interface in the image data IM1 based on the difference between the image data IM1 acquired during the processing period and the reference image.
In addition, in the above example, the camera 35 performs imaging during the entire processing period. However, when it is sufficient to monitor only a part of the processing period, the camera 35 may perform imaging only during that part of the processing period.
[ description of symbols ]
10 substrate holding part
12 nozzle (discharge nozzle)
30 image processing part
35 Camera

Claims (8)

1. A method for detecting a gas-liquid interface between a processing liquid and a gas in an internal flow path of a nozzle, the method comprising:
an imaging step of capturing an image of an imaging area including the transparent nozzle with a camera, the nozzle discharging the treatment liquid from a discharge port, which is a front-end opening of the internal flow path; and
a detection step of detecting the gas-liquid interface in the internal flow path based on a difference between 2 of the images in which distribution states of the processing liquid and the gas in the internal flow path differ from each other.
2. The method for detecting a gas-liquid interface inside a nozzle according to claim 1, wherein
The detection step includes:
an edge extraction step of performing edge extraction processing on the 2 images acquired by the camera to acquire a 1st edge image and a 2nd edge image, respectively;
an edge expansion step of performing edge expansion processing for expanding the edge of the 2 nd edge image to obtain an edge expanded image; and
acquiring a difference image representing a difference between the 1st edge image and the edge expansion image, and detecting the gas-liquid interface based on edges other than edges originating from the edge expansion image in the difference image.
3. The method for detecting a gas-liquid interface inside a nozzle according to claim 2, wherein
In the imaging step, the camera sequentially acquires the images during at least a part of a processing period from a state in which the processing liquid is stationary in the internal flow path of the nozzle to a state in which the processing liquid is discharged and the processing liquid is stationary again,
in the detecting step, a position of the gas-liquid interface is detected based on the 2 images captured during at least a part of the processing period.
4. The method for detecting a gas-liquid interface inside a nozzle according to claim 3, wherein
In the imaging step, the camera acquires a 1st image when the treatment liquid is stationary in the internal flow path, and acquires a 2nd image when the treatment liquid flows toward the discharge port,
in the edge extraction step, the 1st edge image is acquired based on the 1st image, and the 2nd edge image is acquired based on the 2nd image.
5. The method for detecting a gas-liquid interface inside a nozzle according to claim 4, wherein
in the detection step, it is determined that an abnormality has occurred when positions of a plurality of gas-liquid interfaces are detected based on the 1st image and the 2nd image.
6. The method for detecting a gas-liquid interface inside a nozzle according to any one of claims 3 to 5, wherein
In the imaging step, the camera sequentially acquires the images during a discharge period in which the treatment liquid is discharged from the discharge port of the nozzle,
in the edge extraction step, the 1st edge image is acquired based on a current image acquired by the camera during the discharge period, and the 2nd edge image is acquired based on the image acquired by the camera immediately before the current image during the discharge period.
7. The method for detecting a gas-liquid interface inside a nozzle according to claim 6, wherein
in the detection step, it is determined that an abnormality has occurred when the gas-liquid interface is detected based on the 2 images captured during the discharge period.
8. A substrate processing apparatus includes:
a substrate holding unit for holding a substrate;
a transparent nozzle that discharges the processing liquid from a discharge port, which is a front-end opening of an internal flow path, toward the substrate held by the substrate holding portion;
a camera for capturing a capture area including the nozzle to acquire an image; and
and an image processing unit that detects a gas-liquid interface between the processing liquid and the gas in the internal flow path based on a difference between 2 images having different distribution states of the processing liquid and the gas in the internal flow path.
CN202010645389.0A 2019-09-13 2020-07-07 Method for detecting gas-liquid interface inside nozzle and substrate processing apparatus Pending CN112505035A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-166877 2019-09-13
JP2019166877A JP7352419B2 (en) 2019-09-13 2019-09-13 Method for detecting gas-liquid interface inside a nozzle and substrate processing device

Publications (1)

Publication Number Publication Date
CN112505035A true CN112505035A (en) 2021-03-16

Family ID=74863880

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010645389.0A Pending CN112505035A (en) 2019-09-13 2020-07-07 Method for detecting gas-liquid interface inside nozzle and substrate processing apparatus

Country Status (4)

Country Link
JP (1) JP7352419B2 (en)
KR (1) KR102513788B1 (en)
CN (1) CN112505035A (en)
TW (1) TWI737335B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023140531A (en) * 2022-03-23 2023-10-05 株式会社Screenホールディングス Substrate processing apparatus


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3278519B2 (en) 1993-12-29 2002-04-30 株式会社東芝 Information communication system
JP5127080B2 (en) * 2011-06-21 2013-01-23 東京エレクトロン株式会社 Liquid processing equipment

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03278519A (en) * 1990-03-28 1991-12-10 Nec Corp Chemical detection mechanism of chemical applicator
JPH11290752A (en) * 1998-04-06 1999-10-26 Dainippon Screen Mfg Co Ltd Coater
JP2002133419A (en) * 2000-10-19 2002-05-10 Mitsubishi Heavy Ind Ltd Method and device for extracting object from image
JP2003273003A (en) * 2002-03-15 2003-09-26 Dainippon Screen Mfg Co Ltd Substrate-processing device
CN1524699A (en) * 2003-02-28 2004-09-01 ������������ʽ���� Image recognition method for nozzle bore and relevant method and apparatus
US20040180141A1 (en) * 2003-03-10 2004-09-16 Shinji Kobayashi Coating and processing apparatus and method
CN1618527A (en) * 2003-11-18 2005-05-25 大日本网目版制造株式会社 Base plate treater, slit jet nozzle and mechanism for determining liquid filling degree and gas mixing degree in filled body
CN1926662A (en) * 2004-03-31 2007-03-07 东京毅力科创株式会社 Coater/developer and coating/developing method
US20070177869A1 (en) * 2004-03-31 2007-08-02 Tokyo Electon Limited Coater/developer and coating/developing method
US20060029388A1 (en) * 2004-08-05 2006-02-09 Tokyo Electron Limited Liquid processing apparatus processing a substrate surface with a processing liquid, liquid processing method, and liquid condition detection apparatus detecting fluctuation of the processing liquid
JP2006324677A (en) * 2006-05-29 2006-11-30 Tokyo Electron Ltd Automatic setting apparatus for liquid-treating apparatus
US20080100809A1 (en) * 2006-10-25 2008-05-01 Tokyo Electron Limited Wet processing system, wet processing method and storage medium
JP2010175291A (en) * 2009-01-27 2010-08-12 Hitachi High-Technologies Corp Liquid level detection device and method
KR100989857B1 (en) * 2010-04-26 2010-10-29 (주)에스티글로벌 Method for monitoring liquid dispensing state and device thereof
JP2011242328A (en) * 2010-05-20 2011-12-01 Ricoh Co Ltd Bubble detection tool, bubble detector, and bubble detection method
CN102270560A (en) * 2010-05-21 2011-12-07 东京毅力科创株式会社 Wet-processing apparatus
KR101098454B1 (en) * 2010-10-08 2011-12-23 나노에프에이 주식회사 Method for detecting photoresist coating process using image processing and system for detecting photoresist coating process using therein
CN102540579A (en) * 2010-12-21 2012-07-04 塔工程有限公司 Coater
TW201240737A (en) * 2010-12-21 2012-10-16 Top Eng Co Ltd Dispenser
JP2013062330A (en) * 2011-09-13 2013-04-04 Tokyo Electron Ltd Liquid processing device, liquid processing method, and memory medium
CN106965554A (en) * 2012-01-02 2017-07-21 穆特拉茨国际有限公司 Pickup and printing
JP2019045451A (en) * 2017-09-07 2019-03-22 大日本印刷株式会社 Inspection apparatus, inspection method and program
CN109872955A (en) * 2017-12-04 2019-06-11 株式会社斯库林集团 Determination method and substrate board treatment
KR20190087298A (en) * 2018-01-15 2019-07-24 도쿄엘렉트론가부시키가이샤 Substrate processing apparatus, substrate processing method and storage medium

Also Published As

Publication number Publication date
TWI737335B (en) 2021-08-21
JP7352419B2 (en) 2023-09-28
KR20210031812A (en) 2021-03-23
KR102513788B1 (en) 2023-03-24
JP2021043130A (en) 2021-03-18
TW202111768A (en) 2021-03-16

Similar Documents

Publication Publication Date Title
TWI425557B (en) Liquid treatment apparatus and liquid treatment method
JP7052573B2 (en) Coating film forming device and adjustment method of coating film forming device
KR101359751B1 (en) Decompression drying equipment
JP2008135679A (en) Liquid treatment apparatus and method, and storage medium
TWI828772B (en) Coating and developing apparatus, and coating and developing method
CN115668451A (en) Substrate processing method and substrate processing apparatus
CN111630637A (en) Substrate processing apparatus
JP2009181982A (en) Liquid treatment equipment
JP2023026469A (en) Substrate processing apparatus, adjustment method of parameter of application module, and storage medium
US8885140B2 (en) Substrate treatment apparatus, substrate treatment method and non-transitory storage medium
CN112505035A (en) Method for detecting gas-liquid interface inside nozzle and substrate processing apparatus
JP5127080B2 (en) Liquid processing equipment
JP2013004845A (en) Separation system, separation method, program and computer storage medium
JP7090005B2 (en) Board processing equipment and inspection method
JP2023133329A (en) Substrate processing apparatus, nozzle inspection method, and storage medium
JP7153521B2 (en) SUBSTRATE PROCESSING APPARATUS AND INSPECTION METHOD
JP6828329B2 (en) Substrate processing equipment, substrate processing method and storage medium
JP2004304168A (en) Method and apparatus for manufacturing component mounting board
JP5717614B2 (en) Peeling apparatus, peeling system, peeling method, program, and computer storage medium
KR20200141938A (en) Coating apparatus and coating method
JP7028285B2 (en) Board processing equipment, board processing method and storage medium
JP2008030043A (en) Method of removing coating film and apparatus therefor
TWI831656B (en) Substrate processing device and substrate processing method
JP7388896B2 (en) Substrate processing equipment
KR20110119956A (en) Substrate treatment method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination