WO2020039765A1 - Substrate processing method, substrate processing apparatus, and substrate processing system


Info

Publication number
WO2020039765A1
Authority
WO
WIPO (PCT)
Prior art keywords
nozzle
substrate
processing liquid
timing
discharge
Prior art date
Application number
PCT/JP2019/026589
Other languages
English (en)
Japanese (ja)
Inventor
鮎美 樋口
英司 猶原
有史 沖田
翔太 岩畑
央章 角間
達哉 増井
Original Assignee
株式会社Screenホールディングス
Priority date
Filing date
Publication date
Application filed by 株式会社Screenホールディングス
Priority to KR1020217004619A (patent KR102509854B1)
Priority to CN201980054879.9A (patent CN112640054B)
Publication of WO2020039765A1


Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02 Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/02041 Cleaning
    • H01L21/02057 Cleaning during device manufacture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B05 SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D PROCESSES FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D1/00 Processes for applying liquids or other fluent materials
    • B05D1/36 Successively applying liquids or other fluent materials, e.g. without intermediate treatment
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B05 SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D PROCESSES FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D1/00 Processes for applying liquids or other fluent materials
    • B05D1/40 Distributing applied liquids or other fluent materials by members moving relatively to surface
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B05 SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D PROCESSES FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D3/00 Pretreatment of surfaces to which liquids or other fluent materials are to be applied; After-treatment of applied coatings, e.g. intermediate treating of an applied coating preparatory to subsequent applications of liquids or other fluent materials
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02 Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/027 Making masks on semiconductor bodies for further photolithographic processing not provided for in group H01L21/18 or H01L21/34
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02 Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/04 Manufacture or treatment of semiconductor devices or of parts thereof the devices having potential barriers, e.g. a PN junction, depletion layer or carrier concentration layer
    • H01L21/18 Manufacture or treatment of semiconductor devices or of parts thereof the devices having potential barriers, e.g. a PN junction, depletion layer or carrier concentration layer the devices having semiconductor bodies comprising elements of Group IV of the Periodic Table or AIIIBV compounds with or without impurities, e.g. doping materials
    • H01L21/30 Treatment of semiconductor bodies using processes or apparatus not provided for in groups H01L21/20 - H01L21/26
    • H01L21/302 Treatment of semiconductor bodies using processes or apparatus not provided for in groups H01L21/20 - H01L21/26 to change their surface-physical characteristics or shape, e.g. etching, polishing, cutting
    • H01L21/304 Mechanical treatment, e.g. grinding, polishing, cutting
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02 Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/04 Manufacture or treatment of semiconductor devices or of parts thereof the devices having potential barriers, e.g. a PN junction, depletion layer or carrier concentration layer
    • H01L21/18 Manufacture or treatment of semiconductor devices or of parts thereof the devices having potential barriers, e.g. a PN junction, depletion layer or carrier concentration layer the devices having semiconductor bodies comprising elements of Group IV of the Periodic Table or AIIIBV compounds with or without impurities, e.g. doping materials
    • H01L21/30 Treatment of semiconductor bodies using processes or apparatus not provided for in groups H01L21/20 - H01L21/26
    • H01L21/302 Treatment of semiconductor bodies using processes or apparatus not provided for in groups H01L21/20 - H01L21/26 to change their surface-physical characteristics or shape, e.g. etching, polishing, cutting
    • H01L21/306 Chemical or electrical treatment, e.g. electrolytic etching
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L21/67005 Apparatus not specifically provided for elsewhere
    • H01L21/67011 Apparatus for manufacture or treatment
    • H01L21/67017 Apparatus for fluid treatment
    • H01L21/67028 Apparatus for fluid treatment for cleaning followed by drying, rinsing, stripping, blasting or the like
    • H01L21/6704 Apparatus for fluid treatment for cleaning followed by drying, rinsing, stripping, blasting or the like for wet cleaning or washing
    • H01L21/67051 Apparatus for fluid treatment for cleaning followed by drying, rinsing, stripping, blasting or the like for wet cleaning or washing using mainly spraying means, e.g. nozzles
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L21/67005 Apparatus not specifically provided for elsewhere
    • H01L21/67011 Apparatus for manufacture or treatment
    • H01L21/6715 Apparatus for applying a liquid, a resin, an ink or the like
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00 Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67 Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components; Apparatus not specifically provided for elsewhere
    • H01L21/67005 Apparatus not specifically provided for elsewhere
    • H01L21/67242 Apparatus for monitoring, sorting or marking
    • H01L21/67253 Process monitoring, e.g. flow or thickness monitoring

Definitions

  • the present invention relates to a substrate processing method, a substrate processing apparatus, and a substrate processing system.
  • the substrate processing apparatus includes a substrate holding unit that holds the substrate in a horizontal posture, a rotation mechanism that rotates the substrate holding unit to rotate the substrate in a horizontal plane, a first discharge nozzle that discharges a processing liquid from above the substrate, and A second discharge nozzle.
  • the first discharge nozzle and the second discharge nozzle respectively discharge the processing liquid from near the center of the substrate in plan view.
  • When the first processing liquid is discharged from the first discharge nozzle toward the substrate, the first processing liquid lands on the vicinity of the center of the substrate, spreads over the substrate under the centrifugal force accompanying the rotation of the substrate, and is scattered from the periphery of the substrate.
  • an SC1 liquid (a mixed liquid of aqueous ammonia, hydrogen peroxide, and water)
  • an SC2 liquid (a mixed liquid of hydrochloric acid, aqueous hydrogen peroxide, and water)
  • a DHF liquid (dilute hydrofluoric acid)
  • After the treatment with the first processing liquid, the treatment with the second processing liquid is performed. That is, the nozzle that discharges the processing liquid is switched from the first discharge nozzle to the second discharge nozzle. Specifically, the discharge of the second processing liquid from the second discharge nozzle is started, while the discharge of the first processing liquid from the first discharge nozzle is stopped.
  • the second processing liquid is deposited near the center of the substrate, spreads on the substrate under the centrifugal force accompanying the rotation of the substrate, and is scattered from the periphery of the substrate.
  • As the second processing liquid, for example, pure water can be adopted. Thereby, the first processing liquid can be washed away from the substrate.
  • A technique has also been proposed for monitoring the discharge state of the processing liquid from the discharge nozzle with a camera (for example, Patent Documents 1 and 2).
  • In this technique, an imaging area including the tip of a discharge nozzle is imaged by a camera, and it is determined whether or not the processing liquid is being discharged from the discharge nozzle based on an image captured by the camera.
  • the timing for stopping the discharge of the processing liquid from the first discharge nozzle and the timing for starting the discharge of the processing liquid from the second discharge nozzle are important.
  • the discharge from the second discharge nozzle may be started before the discharge stop timing of the first discharge nozzle. For example, while the discharge of the first processing liquid is being stopped, the discharge amount of the first processing liquid decreases with time and eventually becomes zero. By starting the discharge of the second processing liquid while the discharge of the first processing liquid is being stopped, the processing liquid can be supplied to the substrate without interruption, and the possibility of the substrate drying can be reduced.
  • if the discharge start timing of the second discharge nozzle is too early, the discharge of the second processing liquid is started in a state where the first processing liquid is still being discharged at a substantial discharge rate. In this case, the total amount of the processing liquid supplied to the substrate increases, and the processing liquid rebounds on the substrate (liquid splash). The generation of such liquid splash is not preferable.
  • the above timing difference may affect the quality of processing on the substrate.
  • an object of the present application is to provide a substrate processing method, a substrate processing apparatus, and a substrate processing system that can switch from the first nozzle to the second nozzle with a desired timing difference.
  • A first aspect of the substrate processing method includes: a first step of holding a substrate; a second step of starting imaging, with a camera, of an imaging region including a tip of a first nozzle and a tip of a second nozzle to generate a captured image; a third step of starting the discharge of a processing liquid from the first nozzle to the substrate; a fourth step of stopping the discharge of the processing liquid from the first nozzle and starting the discharge of the processing liquid from the second nozzle; a fifth step of obtaining, based on image processing on the captured image, a timing difference between a start timing at which the discharge of the processing liquid from the second nozzle is started and a stop timing at which the discharge of the processing liquid from the first nozzle is stopped; and a sixth step of adjusting, when it is determined that the timing difference is outside a predetermined range, at least one of the start timing and the stop timing so that the timing difference falls within the predetermined range.
  • A second aspect of the substrate processing method is the substrate processing method according to the first aspect, wherein the start timing is adjusted without adjusting the stop timing.
  • A third aspect of the substrate processing method is the substrate processing method according to the first or second aspect, wherein, in the fifth step, the stop timing is specified based on a pixel value of a first ejection determination area extending, in each frame of the captured image, from the tip of the first nozzle in the ejection direction of the first nozzle, and the start timing is specified based on a pixel value of a second ejection determination area extending, in each frame, from the tip of the second nozzle in the ejection direction of the second nozzle.
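As a concrete sketch of how such a determination area could be evaluated, the mean pixel value of a small rectangle extending from a nozzle tip can serve as the per-frame statistic. The tip coordinates, area size, and the assumption that the ejection direction is straight down are illustrative, not values from the patent:

```python
import numpy as np

def ejection_statistic(frame: np.ndarray, tip_xy: tuple[int, int],
                       length: int = 40, width: int = 8) -> float:
    """Mean pixel value of a rectangular ejection determination area.

    The area extends downward (the assumed ejection direction) from the
    nozzle tip; tip_xy, length, and width are hypothetical parameters.
    """
    x, y = tip_xy
    half = width // 2
    region = frame[y:y + length, x - half:x + half]
    return float(region.mean())

# A bright liquid column inside the area raises the statistic well above
# the dark-background value.
frame = np.zeros((120, 120), dtype=np.uint8)
frame[20:60, 58:62] = 200  # simulated liquid column below a tip at (60, 20)
print(ejection_statistic(frame, (60, 20)))  # high while discharging
print(ejection_statistic(np.zeros((120, 120), dtype=np.uint8), (60, 20)))  # low when stopped
```

The gap between the two statistics is what makes a simple per-frame threshold test workable.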
  • A fourth aspect of the substrate processing method is the substrate processing method according to the third aspect, wherein the stop timing is specified based on a frame in which the statistic of the pixel values of the first ejection determination area is larger than a threshold and the frame next to that frame, in which the statistic of the first ejection determination area is smaller than the threshold, and the start timing is specified based on a frame in which the statistic of the pixel values of the second ejection determination area is smaller than the threshold and the frame next to that frame, in which the statistic of the second ejection determination area is larger than the threshold.
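The frame-pair transition just described can be sketched as a scan over the per-frame statistics. The statistic values, the threshold of 50, and the 60 fps frame rate below are hypothetical:

```python
def find_transition(stats, threshold, rising):
    """Return the index of the first frame after a threshold crossing.

    stats is the per-frame statistic of an ejection determination area.
    For the first nozzle (rising=False), the stop timing is the frame
    whose statistic drops below the threshold right after a frame above
    it; for the second nozzle (rising=True), the start timing is the
    frame rising above the threshold right after a frame below it.
    """
    for i in range(1, len(stats)):
        prev_on, cur_on = stats[i - 1] > threshold, stats[i] > threshold
        if rising and not prev_on and cur_on:
            return i
        if not rising and prev_on and not cur_on:
            return i
    return None

# Hypothetical statistics sampled at 60 fps (frame period ~16.7 ms).
first = [90, 88, 85, 84, 80, 60, 40, 5, 1, 0]   # first nozzle: discharge, then stop
second = [0, 0, 1, 2, 60, 85, 90, 91, 92, 93]   # second nozzle: stop, then discharge
stop_frame = find_transition(first, threshold=50, rising=False)
start_frame = find_transition(second, threshold=50, rising=True)
timing_difference = (stop_frame - start_frame) * (1 / 60)
```

With the overlap arranged so the stop timing comes after the start timing, the resulting timing difference is positive (here two frames, about 33 ms).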
  • A fifth aspect of the substrate processing method is the substrate processing method according to the fourth aspect, wherein, in the sixth step, a graph showing the time change of the statistic for each of the first nozzle and the second nozzle is displayed on a user interface, and when an input for adjusting at least one of the start timing and the stop timing is made on the user interface, the target timing is adjusted according to the input.
  • A sixth aspect of the substrate processing method is the substrate processing method according to any one of the first to third aspects, wherein, in the fifth step, a machine-learned classifier is used to classify each frame included in the captured image into discharge/stop of the processing liquid for each of the first nozzle and the second nozzle, and the timing difference is obtained based on the classification result.
  • A seventh aspect of the substrate processing method is the substrate processing method according to the sixth aspect, wherein the stop timing is specified based on a frame classified as discharge for the first nozzle and the frame next to it, classified as stop for the first nozzle, and the start timing is specified based on a frame classified as stop for the second nozzle and the frame next to it, classified as discharge for the second nozzle.
  • An eighth aspect of the substrate processing method is the substrate processing method according to the sixth aspect, wherein the stop timing is after the start timing, and the timing difference is obtained based on the number of frames in which both the first nozzle and the second nozzle are classified as discharging the processing liquid and the time between frames.
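Assuming a classifier has already labeled every frame as discharge or stop for each nozzle, the computation in this aspect reduces to counting the overlap frames and multiplying by the frame period. The labels and frame rate below are illustrative:

```python
def timing_difference_from_labels(first_labels, second_labels, frame_period):
    """Timing difference from per-frame discharge/stop classifications.

    With the stop timing after the start timing, the frames in which both
    nozzles are classified as discharging form the overlap; the timing
    difference is their count times the frame period. The labels would
    come from a machine-learned classifier.
    """
    overlap = sum(1 for a, b in zip(first_labels, second_labels)
                  if a == "discharge" and b == "discharge")
    return overlap * frame_period

# Hypothetical classification results for ten consecutive frames at 60 fps.
first = ["discharge"] * 6 + ["stop"] * 4
second = ["stop"] * 4 + ["discharge"] * 6
diff = timing_difference_from_labels(first, second, 1 / 60)
print(diff)
```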
  • A ninth aspect of the substrate processing method is the substrate processing method according to any one of the first to eighth aspects, wherein, in the sixth step, when it is determined that the timing difference is outside a predetermined range, the operator is notified of that fact.
  • A tenth aspect of the substrate processing method is the substrate processing method according to any one of the first to ninth aspects, wherein the stop timing is after the start timing, and it is determined, based on image processing on the captured image, whether or not the processing liquid has splashed on the substrate. The method further includes a seventh step of adjusting, when it is determined that liquid splash has occurred, at least one of the start timing and the stop timing so as to reduce the timing difference between the start timing and the stop timing.
  • An eleventh aspect of the substrate processing method is the substrate processing method according to the tenth aspect, wherein, in the seventh step, each frame of the captured image is classified into the presence/absence of liquid splash using a machine-learned classifier.
  • A twelfth aspect of the substrate processing method is the substrate processing method according to the eleventh aspect, wherein, in the seventh step, a liquid splash determination area near the first nozzle and the second nozzle is cut out from each frame of the captured image, and the liquid splash determination area is classified into the presence/absence of liquid splash using the classifier.
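Cutting out a determination area near both nozzles before classification keeps the classifier input small. A minimal sketch, assuming frames as NumPy arrays and hypothetical tip coordinates and margin:

```python
import numpy as np

def crop_splash_area(frame: np.ndarray, nozzle1_xy, nozzle2_xy, margin=30):
    """Cut out a liquid splash determination area covering both nozzle tips.

    The bounding box spanning both (hypothetical) tip positions plus a
    margin is cropped from each frame; only this smaller patch would be
    passed to the splash presence/absence classifier.
    """
    xs = [nozzle1_xy[0], nozzle2_xy[0]]
    ys = [nozzle1_xy[1], nozzle2_xy[1]]
    x0, x1 = max(min(xs) - margin, 0), min(max(xs) + margin, frame.shape[1])
    y0, y1 = max(min(ys) - margin, 0), min(max(ys) + margin, frame.shape[0])
    return frame[y0:y1, x0:x1]

frame = np.zeros((240, 320), dtype=np.uint8)
patch = crop_splash_area(frame, (100, 50), (180, 60))
print(patch.shape)
```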
  • A thirteenth aspect of the substrate processing method is the substrate processing method according to any one of the sixth to eighth, eleventh, and twelfth aspects, wherein one of a plurality of machine-learned classifiers, corresponding to at least one of the type of the substrate, the type of the processing liquid, the positions of the first nozzle and the second nozzle, and the flow rate of the processing liquid, is selected, and each frame included in the captured image is classified based on the selected classifier.
  • A fourteenth aspect of the substrate processing method is the substrate processing method according to the thirteenth aspect, wherein, when at least one of the type of the substrate, the type of the processing liquid, the positions of the first nozzle and the second nozzle, and the flow rate of the processing liquid is input to an input unit, one of the plurality of classifiers is selected according to the input to the input unit.
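The selection in these aspects amounts to a keyed lookup from the entered process conditions to a pre-trained model. The condition values and model names below are hypothetical placeholders:

```python
# Hypothetical mapping from (substrate type, processing liquid) to a
# pre-trained classifier; real keys could also include nozzle positions
# and flow rates.
classifiers = {
    ("bare-Si", "SC1"): "model_bare_si_sc1",
    ("bare-Si", "DHF"): "model_bare_si_dhf",
    ("patterned", "SC1"): "model_patterned_sc1",
}

def select_classifier(substrate_type: str, liquid_type: str):
    """Pick the classifier matching the conditions entered on the input unit."""
    try:
        return classifiers[(substrate_type, liquid_type)]
    except KeyError:
        raise ValueError(f"no classifier trained for {substrate_type}/{liquid_type}")

print(select_classifier("bare-Si", "DHF"))
```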
  • A first aspect of a substrate processing apparatus includes: a substrate holding unit that holds a substrate; a processing liquid supply unit including a first nozzle that discharges a processing liquid to the substrate and a second nozzle that discharges a processing liquid to the substrate; a camera that captures an image of an imaging region including a tip of the first nozzle and a tip of the second nozzle to generate a captured image; and a control unit. The control unit controls the processing liquid supply unit so that, after the discharge of the processing liquid from the first nozzle to the substrate is started, the discharge of the processing liquid from the second nozzle to the substrate is started and the discharge of the processing liquid from the first nozzle to the substrate is stopped. Based on image processing on the captured image, the control unit obtains a timing difference between the start timing of starting the discharge of the processing liquid from the second nozzle and the stop timing of stopping the discharge of the processing liquid from the first nozzle, and, when it determines that the timing difference is outside a predetermined range, adjusts at least one of the start timing and the stop timing so that the timing difference falls within the predetermined range.
  • A second aspect of the substrate processing apparatus is the substrate processing apparatus according to the first aspect, wherein the control unit uses a machine-learned classifier to classify each frame included in the captured image into categories indicating the discharge/stop state of the processing liquid for each of the first nozzle and the second nozzle, and obtains the timing difference based on the classification result.
  • A third aspect of the substrate processing apparatus is the substrate processing apparatus according to the second aspect, wherein the control unit selects one of a plurality of machine-learned classifiers, corresponding to at least one of the type of the substrate, the type of the processing liquid, the positions of the first nozzle and the second nozzle, and the flow rate of the processing liquid, and classifies each frame included in the captured image based on the selected classifier.
  • A fourth aspect of the substrate processing apparatus is the substrate processing apparatus according to the third aspect, further including an input unit for inputting at least one of the type of the substrate, the type of the processing liquid, the positions of the first nozzle and the second nozzle, and the flow rate of the processing liquid, wherein the control unit selects one of the plurality of classifiers according to the input to the input unit.
  • An aspect of a substrate processing system includes a substrate processing apparatus and a server that communicates with the substrate processing apparatus. The substrate processing apparatus includes: a substrate holding unit that holds a substrate; a processing liquid supply unit including a first nozzle and a second nozzle that discharge a processing liquid onto the substrate; a camera that images an imaging region including the tips of the first nozzle and the second nozzle to generate a captured image; and a control unit that controls the processing liquid supply unit to start and stop the discharge of the processing liquid to the substrate. At least one of the substrate processing apparatus and the server uses a machine-learned classifier to classify each frame included in the captured image into categories indicating the discharge/stop state of the processing liquid for each of the first nozzle and the second nozzle, and obtains, based on the classification result, the timing difference between the start timing of starting the discharge of the processing liquid from the second nozzle and the stop timing of stopping the discharge of the processing liquid from the first nozzle. When the control unit determines that the timing difference is outside a predetermined range, the control unit adjusts at least one of the start timing and the stop timing so that the timing difference falls within the predetermined range.
  • Since the timing difference between the start timing and the stop timing is obtained based on image processing on the captured image, the timing difference can be obtained with high accuracy. Therefore, the timing difference can be adjusted to within the predetermined range with high accuracy.
  • the timing difference can be set within a predetermined range without changing the length of the processing period during which the processing liquid is supplied from the first nozzle.
  • the processing load can be lighter than in the case where image processing is performed on the entire captured image.
  • the start timing and the stop timing can be specified by simple processing.
  • the operator can visually recognize the time change of the statistic and adjust the timing difference based on the time change.
  • a timing difference can be obtained with high accuracy by machine learning.
  • the timing difference can be appropriately specified.
  • the timing difference can be appropriately specified.
  • the operator can recognize that the timing difference is out of the predetermined range.
  • the timing difference can be adjusted so that the generation of the liquid splash can be reduced.
  • the presence or absence of liquid splash can be determined with high accuracy.
  • the classification accuracy can be improved.
  • each frame can be classified with high accuracy.
  • the operator can input information to the input unit.
  • FIG. 2 is a diagram illustrating a schematic example of a configuration of a substrate processing apparatus.
  • FIG. 3 is a plan view illustrating a schematic example of a configuration of a processing unit.
  • FIG. 3 is a plan view illustrating a schematic example of a configuration of a processing unit.
  • 5 is a flowchart illustrating an example of an operation of a control unit.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • FIG. 5 is a graph schematically illustrating an example of a luminance distribution in an ejection determination area.
  • 5 is a graph schematically illustrating an example of a luminance distribution in an ejection determination area.
  • 6 is a graph schematically showing an example of a temporal change of a statistic of a pixel value in an ejection determination area.
  • 6 is a graph schematically showing an example of a temporal change of a statistic of a pixel value in an ejection determination area.
  • FIG. 3 is a functional block diagram schematically illustrating an example of an internal configuration of a control unit.
  • It is a flowchart which shows a specific example of a monitoring process.
  • FIG. 4 is a diagram schematically illustrating an example of a plurality of frames of a captured image.
  • FIG. 3 is a functional block diagram schematically illustrating an example of an internal configuration of a control unit.
  • It is a diagram schematically showing an example of an input screen.
  • It is a functional block diagram schematically showing an example of a substrate processing apparatus and a server.
  • FIG. 4 is a diagram schematically illustrating an example of a deep learning model.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • It is a graph schematically showing an example of a temporal change of the statistic of the pixel value in a liquid splash determination area.
  • It is a flowchart showing a specific example of a monitoring process.
  • FIG. 3 is a plan view illustrating a schematic example of a configuration of a processing unit.
  • FIG. 4 is a diagram schematically illustrating an example of a frame generated by a camera.
  • FIG. 1 is a diagram showing the overall configuration of the substrate processing apparatus 100.
  • the substrate processing apparatus 100 is an apparatus that performs processing on a substrate W by supplying a processing liquid to the substrate W.
  • the substrate W is, for example, a semiconductor substrate. This substrate W has a substantially disk shape.
  • This substrate processing apparatus 100 can sequentially supply at least two types of processing liquids to the substrate W.
  • the substrate processing apparatus 100 can perform a cleaning process by supplying a rinsing liquid to the substrate W after supplying a cleaning liquid to the substrate W.
  • For example, the chemical solution is an SC1 solution (a mixed solution of ammonia water, hydrogen peroxide solution, and water), an SC2 solution (a mixed solution of hydrochloric acid, hydrogen peroxide solution, and water), a DHF solution (dilute hydrofluoric acid), or the like.
  • As the rinsing liquid, for example, pure water is used.
  • the chemical liquid and the rinsing liquid are collectively referred to as “treatment liquid”. Note that not only liquids for the cleaning process but also a coating solution such as a photoresist solution for a film forming process, a chemical solution for removing an unnecessary film, a chemical solution for etching, and the like are included in the “processing solution”.
  • the substrate processing apparatus 100 includes an indexer 102, a plurality of processing units 1, and a main transfer robot 103.
  • the indexer 102 has a function of carrying an unprocessed substrate W received from outside the apparatus into the apparatus and a function of carrying out the processed substrate W after the cleaning processing to the outside of the apparatus.
  • the indexer 102 mounts a plurality of carriers and includes a transfer robot (both are not shown).
  • As the carrier, a FOUP (front opening unified pod), a SMIF (Standard Mechanical Inter Face) pod, or an OC (open cassette) may be used.
  • Twelve processing units 1 are arranged in the substrate processing apparatus 100.
  • the detailed arrangement configuration is such that four towers in which three processing units 1 are stacked are arranged so as to surround the main transfer robot 103.
  • four processing units 1 arranged so as to surround the main transfer robot 103 are stacked in three stages, and FIG. 1 shows one of them.
  • the number of processing units 1 mounted on the substrate processing apparatus 100 is not limited to 12, and may be, for example, eight or four.
  • the main transfer robot 103 is installed at the center of the four towers including the stacked processing units 1.
  • the main transport robot 103 loads the unprocessed substrate W received from the indexer 102 into each processing unit 1, unloads the processed substrate W from each processing unit 1, and delivers it to the indexer 102.
  • FIG. 2 is a plan view of the processing unit 1.
  • FIG. 3 is a longitudinal sectional view of the processing unit 1. FIG. 2 shows a state where the substrate W is not held by the substrate holding unit 20, and FIG. 3 shows a state where the substrate W is held by the substrate holding unit 20.
  • As main elements in the chamber 10, the processing unit 1 includes: a substrate holding unit 20 that holds a substrate W in a horizontal posture (a posture in which the normal line of the substrate W is along the vertical direction); three processing liquid supply units 30, 60, 65 for supplying a processing liquid to the upper surface of the substrate W held by the substrate holding unit 20; a processing cup 40 surrounding the periphery of the substrate holding unit 20; and a camera 70 for imaging the space above the substrate holding unit 20. Further, around the processing cup 40 in the chamber 10, a partition plate 15 for vertically partitioning the inner space of the chamber 10 is provided.
  • the chamber 10 includes a side wall 11 extending in the vertical direction, a ceiling wall 12 closing the upper side of a space surrounded by the side wall 11, and a floor wall 13 closing the lower side.
  • the space surrounded by the side wall 11, the ceiling wall 12, and the floor wall 13 is a processing space for the substrate W.
  • a part of the side wall 11 of the chamber 10 is provided with a loading/unloading port through which the main transfer robot 103 loads and unloads the substrate W with respect to the chamber 10, and a shutter (not shown) for opening and closing the loading/unloading port.
  • the fan filter unit 14 includes a fan and a filter (for example, a HEPA filter) for taking in air in the clean room and sending it out into the chamber 10, and forms a downflow of clean air in a processing space in the chamber 10.
  • a punching plate having a large number of blowout holes may be provided directly below the ceiling wall 12.
  • the substrate holding unit 20 is, for example, a spin chuck.
  • the substrate holding section 20 includes a disk-shaped spin base 21 fixed in a horizontal posture to an upper end of a rotating shaft 24 extending in the vertical direction.
  • a spin motor 22 for rotating a rotating shaft 24 is provided below the spin base 21.
  • the spin motor 22 rotates the spin base 21 via a rotation shaft 24 in a horizontal plane.
  • a cylindrical cover member 23 is provided so as to surround the spin motor 22 and the rotation shaft 24.
  • the outer diameter of the disc-shaped spin base 21 is slightly larger than the diameter of the circular substrate W held by the substrate holder 20. Therefore, the spin base 21 has a holding surface 21a facing the entire lower surface of the substrate W to be held.
  • a plurality (four in the present embodiment) of chuck pins 26 are provided upright on the periphery of the holding surface 21a of the spin base 21.
  • the plurality of chuck pins 26 are positioned at equal intervals (at 90° intervals in the case of four chuck pins 26, as in the present embodiment) along a circumference corresponding to the outer periphery of the circular substrate W.
  • the plurality of chuck pins 26 are driven in conjunction with each other by a link mechanism (not shown) accommodated in the spin base 21.
  • the substrate holding unit 20 grips the substrate W by bringing each of the plurality of chuck pins 26 into contact with the outer peripheral end of the substrate W, thereby holding the substrate W above the spin base 21 and close to the holding surface 21a (see FIG. 3); the grip can be released by separating each of the plurality of chuck pins 26 from the outer peripheral end of the substrate W.
  • with the substrate W gripped by the plurality of chuck pins 26, the spin motor 22 rotates the rotation shaft 24, whereby the substrate W can be rotated about a rotation axis CX extending in the vertical direction through the center of the substrate W.
  • the processing liquid supply unit 30 is configured by attaching the discharge nozzle 31 to the tip of the nozzle arm 32 (see FIG. 2).
  • the base end of the nozzle arm 32 is fixedly connected to a nozzle base 33.
  • the nozzle base 33 is rotatable around a vertical axis by a motor (not shown). As indicated by an arrow AR34 in FIG. 2, the rotation of the nozzle base 33 moves the discharge nozzle 31 in a horizontal arc between a processing position above the substrate holding unit 20 and a standby position outside the processing cup 40.
  • the processing liquid supply unit 30 is configured to supply a plurality of types of processing liquids. Specifically, the processing liquid supply unit 30 has a plurality of discharge nozzles 31. In FIGS. 2 and 3, two discharge nozzles 31a and 31b are shown as the discharge nozzles 31.
  • the discharge nozzles 31a and 31b are fixed to a nozzle base 33 via a nozzle arm 32. Therefore, the discharge nozzles 31a and 31b move in synchronization with each other.
  • the discharge nozzles 31a and 31b are provided so as to be adjacent in a horizontal plane.
  • the discharge nozzle 31a is connected to a processing liquid supply source 37a via a pipe 34a, and the discharge nozzle 31b is connected to a processing liquid supply source 37b via a pipe 34b.
  • On-off valves 35a and 35b are provided in the middle of the pipes 34a and 34b, respectively.
  • when the on-off valve 35a is opened, the processing liquid Lq1 from the processing liquid supply source 37a flows through the pipe 34a and is discharged from the discharge nozzle 31a.
  • when the on-off valve 35b is opened, the processing liquid Lq2 from the processing liquid supply source 37b flows through the pipe 34b and is discharged from the discharge nozzle 31b.
  • SC1 liquid is discharged from the discharge nozzle 31a, and pure water is discharged from the discharge nozzle 31b, for example.
  • Suckback valves 36a, 36b may be provided in the middle of the pipes 34a, 34b, respectively.
  • when the discharge of the processing liquid Lq1 is stopped, the suck back valve 36a sucks the processing liquid Lq1 in the pipe 34a, drawing the liquid back from the tip of the discharge nozzle 31a. This makes it difficult for the processing liquid Lq1 to drip from the tip of the discharge nozzle 31a as a relatively large droplet when the discharge stops.
  • the processing unit 1 of this embodiment is provided with two processing liquid supply units 60 and 65 in addition to the processing liquid supply unit 30 described above.
  • the processing liquid supply units 60 and 65 of the present embodiment have the same configuration as the processing liquid supply unit 30 described above. That is, the processing liquid supply unit 60 is configured by attaching a discharge nozzle 61 to the tip of a nozzle arm 62; driven by a nozzle base 63 connected to the base end of the nozzle arm 62, the discharge nozzle 61 moves in an arc, as indicated by an arrow AR64, between a processing position above the substrate holding unit 20 and a standby position outside the processing cup 40.
  • similarly, the processing liquid supply unit 65 is configured by attaching a discharge nozzle 66 to the tip of a nozzle arm 67; driven by a nozzle base 68 connected to the base end of the nozzle arm 67, the discharge nozzle 66 moves in an arc, as indicated by an arrow AR69, between a processing position above the substrate holding unit 20 and a standby position outside the processing cup 40.
  • the processing liquid supply units 60 and 65 may also be configured to supply a plurality of types of processing liquid, or may be configured to supply a single processing liquid.
  • the processing liquid supply units 60 and 65 discharge the processing liquid onto the upper surface of the substrate W held by the substrate holding unit 20 with the respective discharge nozzles 61 and 66 positioned at the processing positions. At least one of the processing liquid supply units 60 and 65 mixes a cleaning liquid such as pure water and a pressurized gas to generate droplets, and ejects a mixed fluid of the droplets and the gas to the substrate W. It may be a two-fluid nozzle. Further, the number of processing liquid supply units provided in the processing unit 1 is not limited to three, but may be one or more. However, in the present embodiment, it is assumed that two processing liquids are sequentially switched and discharged, and therefore, two or more discharge nozzles are provided as a whole.
  • each of the discharge nozzles of the processing liquid supply units 60 and 65 is also connected to a processing liquid supply source via a pipe, similarly to the processing liquid supply unit 30, and an on-off valve may be provided in the middle of the pipe.
  • in the following, processing using the processing liquid supply unit 30 will be described as a representative example.
  • the processing cup 40 is provided so as to surround the substrate holding unit 20.
  • the processing cup 40 includes an inner cup 41, a middle cup 42, and an outer cup 43.
  • the inner cup 41, the middle cup 42, and the outer cup 43 are provided to be able to move up and down.
  • the processing liquid scattered from the peripheral edge of the substrate W falls on the inner peripheral surface of the inner cup 41.
  • the dropped processing liquid is appropriately collected by a first collecting mechanism (not shown).
  • when the inner cup 41 is lowered and the middle cup 42 and the outer cup 43 are raised, the processing liquid scattered from the periphery of the substrate W falls on the inner peripheral surface of the middle cup 42.
  • the dropped processing liquid is appropriately collected by a second collecting mechanism (not shown).
  • the processing liquid scattered from the peripheral edge of the substrate W falls on the inner peripheral surface of the outer cup 43.
  • the dropped processing liquid is appropriately collected by a third collecting mechanism (not shown). According to this, different processing liquids can be appropriately collected.
  • the partition plate 15 is provided so as to vertically partition the inner space of the chamber 10 around the processing cup 40.
  • the partition plate 15 may be a single plate member surrounding the processing cup 40, or may be a combination of a plurality of plate members. Further, the partition plate 15 may be formed with a through hole or a notch penetrating in the thickness direction.
  • in the partition plate 15, a through hole (not shown) for passing a support shaft that supports the nozzle bases 33, 63, 68 of the processing liquid supply units 30, 60, 65 may be formed.
  • the outer peripheral end of the partition plate 15 is connected to the side wall 11 of the chamber 10.
  • the edge of the partition plate 15 surrounding the processing cup 40 is formed in a circular shape having a diameter larger than the outer diameter of the outer cup 43. Therefore, the partition plate 15 does not prevent the outer cup 43 from moving up and down.
  • An exhaust duct 18 is provided in a part of the side wall 11 of the chamber 10 and near the floor wall 13.
  • the exhaust duct 18 is connected to an exhaust mechanism (not shown).
  • the air passing between the processing cup 40 and the partition plate 15 is discharged from the exhaust duct 18 to the outside of the apparatus.
  • the camera 70 is installed in the chamber 10 above the partition plate 15.
  • the camera 70 includes, for example, an image sensor (for example, a CCD (Charge Coupled Device)) and an optical system such as an electronic shutter and a lens.
  • the discharge nozzle 31 of the processing liquid supply unit 30 is moved by the nozzle base 33 between a processing position above the substrate W held by the substrate holding unit 20 (solid line position in FIG. 3) and a standby position outside the processing cup 40 (dotted line position in FIG. 3).
  • the processing position is a position where the cleaning liquid is discharged from the processing liquid supply unit 30 onto the upper surface of the substrate W held by the substrate holding unit 20 to perform the cleaning processing.
  • the standby position is a position where the discharge of the processing liquid is stopped and the processing liquid supply unit 30 stands by when the processing liquid supply unit 30 does not perform the cleaning process.
  • a standby pod that accommodates the discharge nozzle 31 of the processing liquid supply unit 30 may be provided at the standby position.
  • the camera 70 is installed so that its imaging region includes at least the tip of the discharge nozzle 31 at the processing position. More specifically, the camera 70 is installed such that the tip of the discharge nozzle 31 and the processing liquid discharged from the tip are included in the imaging region.
  • the camera 70 is installed at a position from which it images the discharge nozzle 31 at the processing position from above and in front. Therefore, the camera 70 can capture an image of the imaging region including the tip of the discharge nozzle 31 at the processing position.
  • the camera 70 can also capture an image of an imaging area including the tips of the discharge nozzles 61 and 66 of the processing liquid supply units 60 and 65 at the processing position.
  • the discharge nozzles 31 and 66 of the processing liquid supply units 30 and 65 move in the horizontal direction within the imaging field of view of the camera 70.
  • the discharge nozzle 61 of the processing liquid supply unit 60 moves in the depth direction within the imaging field of view of the camera 70.
  • a camera dedicated to the processing liquid supply unit 60 may be provided separately from the camera 70.
  • an illumination unit 71 is provided in the chamber 10 and above the partition plate 15. Usually, since the interior of the chamber 10 is a dark room, when the camera 70 performs imaging, the illumination unit 71 irradiates the discharge nozzles 31, 61, 66 of the processing liquid supply units 30, 60, 65 near the processing position with light. The captured image generated by the camera 70 is output to the control unit 9.
  • the control unit 9 controls various components of the substrate processing apparatus 100 to perform processing on the substrate W. Further, the control section 9 performs image processing on the captured image generated by the camera 70. The control unit 9 obtains a timing difference between the start timing and the stop timing of the discharge of the processing liquid from each discharge nozzle by this image processing. This image processing will be described later in detail.
  • the configuration of the control unit 9 as hardware is the same as that of a general computer. That is, the control unit 9 includes a CPU that performs various arithmetic processes, a ROM (a read-only memory) that stores a basic program, a RAM (a readable and writable memory) that stores various information, and a magnetic disk that stores control software and data.
  • when the CPU of the control unit 9 executes a predetermined processing program, each operation mechanism of the substrate processing apparatus 100 is controlled by the control unit 9 and the processing in the substrate processing apparatus 100 proceeds.
  • the CPU of the control unit 9 executes a predetermined processing program to perform image processing.
  • Some or all of the functions of the control unit 9 may be realized by dedicated hardware.
  • the user interface 90 includes a display and an input unit.
  • the display is, for example, a liquid crystal display or an organic EL (Electro Luminescence) display.
  • the input unit is, for example, a touch panel, a mouse, or a keyboard.
  • This user interface 90 is connected to the control unit 9.
  • the display displays a display image based on a display signal from the control unit 9.
  • This display image includes, for example, a captured image from the camera 70.
  • the input unit outputs input information input by the user to the control unit 9.
  • the control unit 9 can control various components according to the input information.
  • FIG. 4 is a flowchart illustrating an example of the operation of the control unit 9.
  • the processing using the processing liquid supply unit 30 will be described as an example.
  • in step S1, the substrate W is transferred onto the substrate holding unit 20 by the main transfer robot 103.
  • the substrate holding unit 20 holds the transferred substrate W.
  • in step S2, the control unit 9 rotates the nozzle base 33 to move the discharge nozzles 31a and 31b to the processing position.
  • thereby, the tip of the discharge nozzle 31a and the tip of the discharge nozzle 31b are included in the imaging region of the camera 70.
  • in step S3, the control unit 9 controls the camera 70 to start imaging.
  • the camera 70 can more reliably image the tip of the discharge nozzle 31a and the tip of the discharge nozzle 31b.
  • the camera 70 images the imaging region at a predetermined frame rate (for example, 60 frames / second), and sequentially outputs each frame of the generated captured image to the control unit 9.
  • the imaging by the camera 70 may be started together with the start of the movement of the discharge nozzles 31a and 31b in step S2.
  • FIG. 5 is a diagram schematically illustrating an example of a frame IM1 of a captured image generated by the camera 70.
  • in the frame IM1, the tip of the discharge nozzle 31a and the tip of the discharge nozzle 31b are shown, and a part of the substrate W is also shown.
  • the processing liquid Lq1 is not yet discharged from the discharge nozzle 31a, and similarly, the processing liquid Lq2 is not yet discharged from the discharge nozzle 31b.
  • in step S4, the control unit 9 starts discharging from the discharge nozzle 31a. Specifically, the control unit 9 outputs an open signal to the on-off valve 35a.
  • the on-off valve 35a performs an opening operation based on the opening signal to open the pipe 34a.
  • the processing liquid from the processing liquid supply source 37a is discharged from the discharge nozzle 31a and supplied to the upper surface of the substrate W. Note that there is a delay from when the open signal is output to when the processing liquid Lq1 is actually discharged. This delay time depends on various factors such as the moving speed of the valve element due to the opening operation of the on-off valve 35a, the length of the pipe 34a and the pressure loss.
  • in step S4, the control unit 9 also rotates the spin motor 22 to rotate the substrate W.
  • FIG. 6 is a diagram schematically illustrating an example of a frame IM2 of a captured image generated by the camera 70.
  • the processing liquid Lq1 is discharged from the discharge nozzle 31a, and the processing liquid Lq2 is not discharged from the discharge nozzle 31b.
  • the processing liquid Lq1 discharged from the discharge nozzle 31a is a so-called continuous flow, and has a liquid column shape extending in the vertical direction in a region from the tip of the discharge nozzle 31a to the upper surface of the substrate W.
  • the processing liquid Lq1 lands on substantially the center of the substrate W, and spreads on the upper surface of the substrate W under the centrifugal force accompanying the rotation of the substrate W. Then, it scatters from the periphery of the substrate W. Thereby, the processing liquid Lq1 acts on the entire upper surface of the substrate W, and the processing based on the processing liquid Lq1 is performed.
  • the control unit 9 switches the nozzle for discharging the processing liquid from the discharge nozzle 31a to the discharge nozzle 31b in step S5. That is, the control unit 9 stops discharging the processing liquid Lq1 from the discharge nozzle 31a and starts discharging the processing liquid Lq2 from the discharge nozzle 31b. Specifically, the control unit 9 transmits a close signal to the on-off valve 35a and an open signal to the on-off valve 35b.
  • more precisely, the control unit 9 outputs the open signal to the on-off valve 35b when the elapsed time from step S4 reaches a first reference time, and outputs the close signal to the on-off valve 35a when the elapsed time from step S4 reaches the second reference time.
  • the second reference time can be set longer than the first reference time.
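The reference-time-based switching described above might be sketched as follows. This is only an illustration of the timing logic, assuming the apparatus's actual control API is unknown: the valve callbacks, the polling interval, and the concrete reference-time values are all hypothetical placeholders.

```python
import time

# Hypothetical reference times in seconds from step S4; the second is
# set longer than the first, so nozzle 31b starts before 31a stops.
FIRST_REFERENCE_TIME = 10.0   # open on-off valve 35b
SECOND_REFERENCE_TIME = 10.5  # close on-off valve 35a

def switch_nozzles(t_step_s4, open_valve_35b, close_valve_35a,
                   now=time.monotonic):
    """Issue the open signal to valve 35b when the first reference time
    elapses, then the close signal to valve 35a when the second
    reference time elapses, both measured from step S4."""
    while now() - t_step_s4 < FIRST_REFERENCE_TIME:
        time.sleep(0.01)
    open_valve_35b()    # Lq2 discharge begins (after the valve/pipe delay)
    while now() - t_step_s4 < SECOND_REFERENCE_TIME:
        time.sleep(0.01)
    close_valve_35a()   # Lq1 discharge stops (after a similar delay)
```

Because the open signal precedes the close signal, both liquid columns can briefly coexist, which is exactly the state captured in the frame IM3 described below.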
  • the on-off valve 35b performs an opening operation based on the opening signal to open the pipe 34b. Accordingly, the processing liquid Lq2 from the processing liquid supply source 37b is discharged from the discharge nozzle 31b, and lands on the upper surface of the substrate W. Note that there is a delay from when the open signal is output to when the processing liquid Lq2 is actually discharged. This delay time depends on various factors such as the moving speed of the valve element due to the opening operation of the on-off valve 35b, the length of the pipe 34b and the pressure loss.
  • the on-off valve 35a performs a closing operation based on the closing signal to close the pipe 34a.
  • the control unit 9 transmits a suction signal to the suck back valve 36a.
  • the suck-back valve 36a performs a suction operation based on the suction signal to suck the processing liquid in the pipe 34a.
  • the closing operation of the on-off valve 35a and the suction operation of the suck-back valve 36a are executed in parallel with each other. Thereby, the processing liquid Lq1 on the tip side of the discharge nozzle 31a is pulled back, and the discharge of the processing liquid Lq1 is appropriately stopped.
  • FIGS. 7 and 8 schematically show examples of frames of the captured image generated by the camera 70. The frames IM3 and IM4, illustrated in FIGS. 7 and 8 respectively, show frames at the time of switching from the discharge nozzle 31a to the discharge nozzle 31b.
  • the frame IM3 is a frame captured while the on-off valves 35a and 35b are performing the closing operation and the opening operation, respectively. Therefore, in the frame IM3, the processing liquids Lq1 and Lq2 are discharged from both the discharge nozzle 31a and the discharge nozzle 31b. However, the width of the processing liquid Lq1 discharged from the discharge nozzle 31a is narrower than in the frame IM2.
  • the frame IM4 is a frame in a state where the on-off valve 35a is closed and the on-off valve 35b is open. Therefore, in the frame IM4, the processing liquid Lq1 is not discharged from the discharge nozzle 31a, and the processing liquid Lq2 is discharged from the discharge nozzle 31b.
  • the processing liquid Lq2 discharged from the discharge nozzle 31b is also a continuous flow, and has a liquid column shape extending in the vertical direction in a region from the tip of the discharge nozzle 31b to the upper surface of the substrate W.
  • the processing liquid Lq2 lands on substantially the center of the substrate W, and spreads on the upper surface of the substrate W under the centrifugal force accompanying the rotation of the substrate W. Then, it scatters from the periphery of the substrate W. Thereby, the processing liquid Lq2 acts on the entire upper surface of the substrate W, and the processing based on the processing liquid Lq2 is performed.
  • in step S6, the control unit 9 stops discharging the processing liquid Lq2 from the discharge nozzle 31b.
  • the control unit 9 transmits a close signal to the on-off valve 35b when the elapsed time from the time when the open signal is output to the on-off valve 35b reaches the third reference time.
  • the on-off valve 35b performs a closing operation based on the closing signal to close the pipe 34b.
  • further, the control unit 9 transmits a suction signal to the suck back valve 36b.
  • the suck-back valve 36b performs a suction operation in parallel with the closing operation of the on-off valve 35b, and sucks the processing liquid Lq2 in the pipe 34b.
  • thereby, the discharge of the processing liquid Lq2 from the discharge nozzle 31b is appropriately stopped. Also at this time, a delay occurs from the output of the close signal to the actual end of the discharge of the processing liquid Lq2.
  • the control unit 9 may stop the spin motor 22 after step S6 to stop the rotation of the substrate W.
  • alternatively, the control unit 9 may increase the rotation speed of the spin motor 22 so that the processing liquid Lq2 on the substrate W is scattered from the periphery of the substrate W by the rotational force to dry the substrate W, and then stop the rotation.
  • in step S7, the control unit 9 ends the imaging by the camera 70.
  • in step S8, the control unit 9 controls the nozzle base 33 to move the discharge nozzles 31a and 31b to the standby position.
  • the control unit 9 performs a monitoring process in step S10 in parallel with steps S4 to S6 in order to monitor the discharge / stop timing of the processing liquid.
  • this monitoring process determines whether or not the timing difference between the start timing tb, at which the discharge nozzle 31b starts discharging the processing liquid Lq2, and the stop timing ta, at which the discharge nozzle 31a stops discharging the processing liquid Lq1, is appropriate.
  • FIG. 9 is a flowchart showing an example of a specific operation of the monitoring process.
  • the control unit 9 performs image processing on a captured image generated by the camera 70, and specifies a start timing tb of the discharge nozzle 31b and a stop timing ta of the discharge nozzle 31a (Step S11).
  • the discharge determination region Rb1 is a region of each frame of the captured image that extends from the tip of the discharge nozzle 31b in the discharge direction of the processing liquid Lq2 (see also FIGS. 5 to 8).
  • since the processing liquid Lq2 extends vertically downward, the discharge determination region Rb1 has an elongated shape (for example, a rectangular shape) extending in the vertical direction of the captured image.
  • the width of the discharge determination region Rb1 in the horizontal direction is set wider than the width of the processing liquid Lq2, and the length of the discharge determination region Rb1 in the vertical direction is set so that the discharge determination region Rb1 does not include the landing position of the processing liquid Lq2.
  • FIGS. 10 and 11 are diagrams schematically illustrating an example of a luminance distribution in the horizontal direction in the ejection determination region Rb1.
  • FIG. 10 illustrates the luminance distribution when the processing liquid Lq2 is not discharged, and FIG. 11 illustrates the luminance distribution when the processing liquid Lq2 is discharged.
  • when the processing liquid Lq2 is discharged, the liquid column portion of the processing liquid Lq2 appears in the discharge determination region Rb1.
  • since the illumination light enters from the same direction as the imaging direction of the camera 70, the surface of the liquid column of the processing liquid Lq2 appears to shine brightly. Therefore, as illustrated in FIG. 11, the luminance corresponding to the liquid column portion is higher than that of the surroundings.
  • the luminance distribution has an upwardly convex shape in the liquid column portion. That is, the luminance distribution has a characteristic caused by the shape of the liquid column of the processing liquid Lq2.
  • when the processing liquid Lq2 is not discharged, the liquid column of the processing liquid Lq2 does not appear in the discharge determination region Rb1. Therefore, as illustrated in FIG. 10, the luminance distribution naturally has no characteristic caused by the liquid column shape of the processing liquid Lq2. Although the luminance varies depending on irregular reflection by the pattern on the upper surface of the substrate W or reflection of components inside the chamber 10, the distribution is relatively uniform.
  • the camera 70 may be a camera that generates a grayscale captured image, or may be a camera that generates a color captured image. In the former case, it can be said that the pixel value of the captured image indicates a luminance value.
  • in the following, a camera that generates a grayscale captured image will be described as an example.
  • in the latter case, a luminance value may be calculated from the pixel values, and the luminance value may be used.
  • the control unit 9 determines whether or not the processing liquid Lq2 is being discharged from the discharge nozzle 31b based on the pixel value in the discharge determination region Rb1. Specifically, the control unit 9 calculates a statistic A2 of pixel values in the ejection determination region Rb1.
  • the statistic A2 is a value that reflects the state of discharge of the processing liquid Lq2 from the discharge nozzle 31b, and is, for example, a sum (integral value) of pixel values in the discharge determination region Rb1. This is because the sum of the pixel values when the processing liquid Lq2 is discharged is larger than the sum of the pixel values when the processing liquid Lq2 is not discharged.
  • note that the variance of the pixel values may be adopted instead of the sum of the pixel values. This is because, as shown in FIGS. 10 and 11, the luminance distribution when the processing liquid Lq2 is discharged fluctuates more than the luminance distribution when the processing liquid Lq2 is not discharged.
  • as a measure of this variation, for example, the standard deviation can be adopted.
  • the variance over all pixel values in the discharge determination region Rb1 can be adopted.
  • since the processing liquid Lq2 has a liquid column shape along the vertical direction, the variation of the luminance distribution in the vertical direction within the discharge determination region Rb1 is small. Therefore, pixels arranged in a line in the horizontal direction may be cut out and the variance of those pixel values adopted. Alternatively, pixel values arranged in a line in the vertical direction may be integrated for each column to calculate an integrated pixel value, and the variance of the integrated pixel values across the columns may be adopted.
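As a minimal sketch, the candidate statistics described above (the sum of pixel values, the variation over all pixels, and the variation of per-column integrated values) could be computed from a grayscale frame as follows. The rectangular region coordinates are hypothetical inputs standing in for the discharge determination region.

```python
import numpy as np

def discharge_statistics(frame, top, bottom, left, right):
    """Compute candidate statistics over a rectangular discharge
    determination region of a grayscale frame (2-D uint8 array)."""
    roi = frame[top:bottom, left:right].astype(np.float64)
    return {
        "sum": roi.sum(),            # integral of pixel values in the region
        "std_all": roi.std(),        # variation over all pixels
        # integrate each vertical column, then measure the variation
        # across columns (a bright liquid column raises only some columns)
        "std_columns": roi.sum(axis=0).std(),
    }
```

All three values are expected to be larger when a bright liquid column crosses the region than when it is absent, which is why each can serve as the statistic compared against the threshold.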
  • a threshold th1 for the statistic A2 is set.
  • when the statistic A2 is equal to or greater than the threshold th1, it can be determined that the processing liquid Lq2 is being discharged from the discharge nozzle 31b; when the statistic A2 is smaller than the threshold th1, it can be determined that the processing liquid Lq2 is not being discharged.
  • This threshold th1 can be set in advance by experiments, simulations, or the like.
  • FIG. 12 is a graph showing an example of the temporal change of the statistic in steps S4 to S6 of FIG. 4. In FIG. 12, the horizontal axis indicates the frame number of the captured image generated by the camera 70, and the vertical axis indicates the statistic. Since the frame number increases with the passage of time, the horizontal axis can be said to indicate time.
  • in FIG. 12, the statistic A2 for the discharge determination region Rb1 is indicated by a broken line, and the statistic A1 for the discharge determination region Ra1 described later is indicated by a solid line.
  • the statistic A2 is initially smaller than the threshold th1. This is because the discharge nozzle 31b does not discharge the processing liquid Lq2 at the beginning of the process (see step S4 in FIG. 4).
  • the statistic A2 increases and exceeds the threshold th1. That is, the statistic A2 transitions from a state smaller than the threshold th1 to a state larger than the threshold th1.
  • the timing at which the statistic A2 exceeds the threshold th1 corresponds to the start timing tb. Therefore, the start timing tb can be specified based on the change in the statistic A2. The details will be described below.
  • the control unit 9 calculates the statistic A2 for each frame, and determines whether or not the statistic A2 is larger than the threshold th1 for each frame. Then, the control unit 9 stores the determination result in the storage medium. The control unit 9 determines that the statistic A2 has exceeded the threshold th1 when the statistic A2 is smaller than the threshold th1 in the previous frame and the statistic A2 is larger than the threshold th1 in the current frame.
  • the control unit 9 specifies the start timing tb based on the previous frame and the current frame. That is, the control unit 9 specifies the start timing tb based on the previous frame, in which the statistic A2 is smaller than the threshold th1, and the current frame, which is the next frame and in which the statistic A2 is larger than the threshold th1. For example, the control unit 9 may specify the generation timing of the previous frame as the start timing tb, may specify the generation timing of the current frame as the start timing tb, or may specify the average of the generation timings of the two frames as the start timing tb.
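The frame-pair test described above can be sketched as follows. The per-frame statistic sequence, the threshold, and the 60 frames/second frame rate are assumed inputs, and the averaged-timing option from the three alternatives is used here for illustration.

```python
def find_start_timing(statistics, threshold, fps=60.0):
    """Return the start timing (seconds) at which the statistic first
    rises from below the threshold (previous frame) to above it
    (current frame), taken as the average of the two frames'
    generation times. Returns None if no rising crossing is found."""
    for i in range(1, len(statistics)):
        if statistics[i - 1] < threshold and statistics[i] > threshold:
            t_prev = (i - 1) / fps   # generation time of the previous frame
            t_curr = i / fps         # generation time of the current frame
            return (t_prev + t_curr) / 2.0
    return None
```

Averaging the two frame times bounds the quantization error of the detected timing to half the frame interval, which at 60 frames/second is about 8 ms.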
  • the ejection determination area Ra1 is an area in each frame of the captured image that extends from the tip of the ejection nozzle 31a in the ejection direction of the processing liquid Lq1 (see also FIGS. 5 to 8).
  • the processing liquid Lq1 extends vertically downward, the ejection determination region Ra1 has a long shape (for example, a rectangular shape) extending in the vertical direction of the captured image.
  • the width of the discharge determination region Ra1 in the horizontal direction is set wider than the width of the processing liquid Lq1, and the length of the discharge determination region Ra1 in the vertical direction is set so that the region does not include the liquid landing position of the processing liquid Lq1.
  • the luminance distribution in the discharge determination area Ra1 differs depending on whether or not the processing liquid Lq1 is discharged, as in the discharge determination area Rb1. Therefore, the control unit 9 determines the presence / absence of the discharge of the processing liquid Lq1 based on the pixel value of the discharge determination area Ra1, as in the determination of the presence / absence of the discharge of the processing liquid Lq2. More specifically, the control unit 9 calculates a statistic A1 of the pixel value in the ejection determination area Ra1.
  • the statistic A1 is the same as the statistic A2, and is a value reflecting the discharge state of the processing liquid Lq1 from the discharge nozzle 31a, and is, for example, a sum or a variance of pixel values in the discharge determination region Ra1.
  • a threshold value is set for the statistic A1 used for these determinations.
  • the threshold th1 for the statistic A2 is adopted as the threshold for the statistic A1.
  • a value different from the threshold th1 may be adopted as the threshold for the statistic A1.
  • the statistic A1 is initially larger than the threshold th1. This is because the processing liquid Lq1 is discharged from the beginning of the processing (see step S4 in FIG. 4). Then, at the time of switching the ejection nozzle in step S5, the statistic A1 decreases and falls below the threshold th1. The timing at which the statistic A1 falls below the threshold th1 corresponds to the stop timing ta. Therefore, the stop timing ta can be specified based on the change in the statistic A1. The details will be described below.
  • the control unit 9 calculates the statistic A1 for each frame, and determines whether or not the statistic A1 is greater than the threshold th1 for each frame. Then, the control unit 9 stores the determination result in the storage medium. When the statistic A1 is larger than the threshold th1 in the previous frame and smaller than the threshold th1 in the current frame, the control unit 9 determines that the statistic A1 has fallen below the threshold th1.
  • the control unit 9 specifies the stop timing ta based on the previous frame and the current frame. That is, the control unit 9 specifies the stop timing ta based on the previous frame, in which the statistic A1 is larger than the threshold th1, and the current frame, which is the frame immediately after it and in which the statistic A1 is smaller than the threshold th1. For example, the control unit 9 may specify the generation timing of the previous frame as the stop timing ta, may specify the generation timing of the current frame as the stop timing ta, or may specify the average of the generation timings of the previous and current frames as the stop timing ta.
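The frame-by-frame threshold comparisons described above can be sketched as follows. This is an illustrative outline only, not the embodiment's actual implementation; the function name and the choice of taking the average of the two adjacent frame timings (one of the options mentioned above) are assumptions.

```python
def find_crossings(stats, times, th):
    """Scan per-frame statistics and return (rise_time, fall_time):
    rise_time - when the statistic first goes from below th to above th
                (corresponds to the start timing tb for statistic A2),
    fall_time - when it first goes from above th to below th
                (corresponds to the stop timing ta for statistic A1).
    Each crossing time is the average of the two adjacent frame timings."""
    rise = fall = None
    for i in range(1, len(stats)):
        prev, cur = stats[i - 1], stats[i]
        if rise is None and prev < th and cur > th:
            rise = (times[i - 1] + times[i]) / 2.0
        if fall is None and prev > th and cur < th:
            fall = (times[i - 1] + times[i]) / 2.0
    return rise, fall
```

Applying this to the statistic A2 would yield the start timing tb, and applying it to the statistic A1 would yield the stop timing ta; the timing difference is then `ta - tb`.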
  • In step S12, the control unit 9 calculates a timing difference between the start timing tb and the stop timing ta. Specifically, the control unit 9 calculates the timing difference by subtracting the start timing tb from the stop timing ta.
  • In step S13, the control unit 9 determines whether the timing difference is outside a predetermined range.
  • the predetermined range may be set in advance, for example, and stored in a storage medium.
  • the lower limit and the upper limit of the predetermined range have, for example, positive values.
  • When the timing difference is outside the predetermined range, the control unit 9 executes an error notification process in step S14.
  • the control unit 9 causes the display of the user interface 90 to display an error.
  • When the substrate processing apparatus 100 includes a sound output unit such as a buzzer or a speaker, the control unit 9 may output an error to the sound output unit.
  • the display and the sound output unit of the user interface 90 are examples of a notification unit.
  • the control unit 9 causes this notifying unit to notify an error. Such notification allows the operator to recognize that the timing difference is outside the predetermined range.
  • In step S15, the control unit 9 adjusts at least one of the start timing tb and the stop timing ta so that the timing difference falls within the predetermined range.
  • For example, the control unit 9 updates the start timing tb of the discharge nozzle 31b to a later timing so that the timing difference decreases to fall within the predetermined range.
  • Specifically, the control unit 9 updates the second reference time, which defines the timing for opening the on-off valve 35b, to a longer value so that the timing difference falls within the predetermined range, and stores the updated second reference time in the storage medium. According to this, the timing of outputting the open signal to the on-off valve 35b is delayed at the time of the switching in step S5 from the next time onward, so the start timing tb is delayed. Therefore, the timing difference in the next and subsequent steps S5 (step S5 for the next substrate) can be reduced to fall within the predetermined range.
  • Alternatively, the control unit 9 may update the stop timing ta of the discharge nozzle 31a to an earlier timing.
  • Specifically, the control unit 9 updates the first reference time, which defines the timing for closing the on-off valve 35a, to a shorter value so that the timing difference falls within the predetermined range, and stores the updated first reference time in the storage medium. According to this, the timing of outputting the close signal to the on-off valve 35a is advanced at the time of the switching in step S5 from the next time onward, so the stop timing ta is advanced. Therefore, the timing difference in the next and subsequent steps S5 can be reduced to fall within the predetermined range.
  • both the start timing tb and the stop timing ta may be adjusted to reduce the timing difference.
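A minimal sketch of the reference-time adjustment described above, assuming a hypothetical `adjust_open_timing` helper. Real control would update the stored reference times for both valves 35a and 35b; applying the whole correction to the valve 35b side is just one of the options described in the text.

```python
def adjust_open_timing(ta, tb, lower, upper, second_reference_time):
    """If the timing difference ta - tb is outside [lower, upper],
    shift the second reference time (the time until the open signal is
    sent to the on-off valve 35b) so that the difference for the next
    substrate moves back into the range."""
    diff = ta - tb
    if diff > upper:
        # overlap too long: delay the start of nozzle 31b
        second_reference_time += diff - upper
    elif diff < lower:
        # overlap too short: start nozzle 31b earlier
        second_reference_time -= lower - diff
    return second_reference_time
```

With `lower=1` and `upper=3`, a measured difference of 5 would lengthen the reference time by 2, while a difference of 0.5 would shorten it by 0.5; a difference already inside the range leaves it unchanged.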
  • When the timing difference is larger than the upper limit of the predetermined range, the overlap period in which both of the processing liquids Lq1 and Lq2 are discharged is long. That is, the total amount of the processing liquid discharged onto the substrate W temporarily increases. As a result, for example, the processing liquid Lq2 collides with the processing liquid Lq1 on the substrate W, and liquid splashing, in which a part of the processing liquid bounces on the substrate W, may occur.
  • By the above adjustment, even when the timing difference is larger than the upper limit of the predetermined range, the timing difference in the next and subsequent steps S5 can be brought within the predetermined range. Therefore, by adopting a value that does not cause liquid splashing as the upper limit of the predetermined range, the occurrence of liquid splashing can be substantially avoided.
  • On the other hand, the control unit 9 updates, for example, the start timing tb of the ejection nozzle 31b to an earlier timing so that the timing difference increases to fall within the predetermined range.
  • Specifically, the control unit 9 updates the second reference time, which defines the timing for opening the on-off valve 35b, to a shorter value so that the timing difference falls within the predetermined range, and stores the updated second reference time in the storage medium.
  • Alternatively, the control unit 9 may update the stop timing ta of the discharge nozzle 31a to a later timing.
  • Specifically, the control unit 9 updates the first reference time, which defines the timing for closing the on-off valve 35a, to a longer value so that the timing difference falls within the predetermined range, and stores the updated first reference time in the storage medium. Also in this case, the timing difference in the next and subsequent steps S5 can be increased to fall within the predetermined range.
  • the timing difference may be increased by adjusting both the start timing tb and the stop timing ta.
  • When the timing difference is smaller than the lower limit of the predetermined range, the overlap period in which both of the processing liquids Lq1 and Lq2 are discharged is short. That is, the discharge of the processing liquid Lq2 is started with the discharge of the processing liquid Lq1 substantially stopped. Since the substrate W is rotating, the processing liquid Lq1 supplied to the upper surface moves toward the peripheral edge of the substrate W under centrifugal force. Therefore, if the discharge amount of the processing liquid Lq1 decreases in a state where the processing liquid Lq2 is not yet discharged, the processing liquid Lq1 on the upper surface of the substrate W decreases, and the substrate W may partially dry (liquid withering).
  • For example, the upper surface of the substrate W may partially dry in the vicinity of the liquid landing position of the processing liquid Lq1. Such drying is not preferable because it may cause defects (for example, adhesion of particles) on the upper surface of the substrate W.
  • By the above adjustment, the timing difference in the next and subsequent steps S5 can be kept within the predetermined range. Therefore, by setting the lower limit of the predetermined range to a value that does not cause partial drying of the substrate W, the occurrence of partial drying of the substrate W can be substantially avoided.
  • As described above, when the timing difference is out of the predetermined range, the timing difference is adjusted to be within the predetermined range. Therefore, it is possible to substantially avoid problems (liquid splashing and liquid withering) caused by the timing difference being outside the predetermined range.
  • the start timing tb of the discharge nozzle 31b and the stop timing ta of the discharge nozzle 31a are specified by image processing on the image captured by the camera 70. That is, since the start timing tb and the stop timing ta can be specified based on the actual discharge states of the processing liquids Lq1 and Lq2, the specification accuracy can be improved. Since the timing difference is calculated based on the timing specified with high accuracy, the calculation accuracy of the timing difference is also high. Therefore, the timing difference can be kept within a predetermined range with higher accuracy.
  • Moreover, since the image processing is performed only on the discharge determination regions Ra1 and Rb1, the processing can be made lighter than when image processing is performed on the entire captured image.
  • In the above example, the upper limit of the predetermined range is set to a value that does not cause liquid splashing, and the lower limit is set to a value that does not cause liquid withering. However, the upper limit may be set to a smaller value, or the lower limit may be set to a larger value, due to other factors or the like.
  • the timing difference may affect the quality of the processing result on the substrate W.
  • the upper limit value and the lower limit value of the predetermined range of the timing difference may be set by an experiment, a simulation, or the like so that the processing result is good.
  • the delay time in the discharge control may be different for each processing unit 1 due to manufacturing variations of the plurality of processing units 1 and the like.
  • the control unit 9 performs the above-described operation for each processing unit 1, the timing difference can be adjusted within a predetermined range for each processing unit 1.
  • Conventionally, the recipe information of the substrate W has been changed for each processing unit 1 in order to adjust the start timing tb and the stop timing ta for each processing unit 1, so the management of recipe information has been complicated.
  • processing can be performed with an optimal timing difference for each processing unit 1, and stable processing performance can be realized.
  • FIG. 13 is a graph showing another example of the time change of the statistic.
  • the start timing tb of the discharge nozzle 31b appears after the stop timing ta of the discharge nozzle 31a, and the timing difference is relatively large. Also in such a case, since the processing liquid Lq1 on the upper surface of the substrate W decreases before the processing liquid Lq2 is discharged, the substrate W may be partially dried.
  • the timing difference obtained by subtracting the start timing tb from the stop timing ta in step S12 has a negative value. Therefore, this timing difference is smaller than the lower limit of the predetermined range, and in step S13, the control unit 9 determines that the timing difference is outside the predetermined range. Therefore, in step S15, the control unit 9 adjusts at least one of the start timing tb and the stop timing ta so that the timing difference falls within a predetermined range. Therefore, the timing difference can be set within a predetermined range in the next and subsequent steps S5.
  • the above description can be understood as an operation at the time of processing the substrate W, or as an operation at the time of initial setting performed when the substrate processing apparatus 100 is installed. That is, at the time of initial setting, if a temporary timing is set and the processing is actually performed on the substrate W, an appropriate timing is set when the temporary timing is inappropriate.
  • In the above example, the control unit 9 determines whether the timing difference is out of the predetermined range. However, the operator may make this determination instead. Hereinafter, a specific description will be given.
  • the control unit 9 causes the display of the user interface 90 to display a graph showing the time change of the statistics A1 and A2. It is desirable that the control unit 9 also displays the threshold value th1 on the graph. Specifically, the control unit 9 displays the graph shown in FIG. 12 or 13 on the display. Thereby, the operator can visually recognize the time change of the statistics A1 and A2, and can determine the magnitude of the timing difference.
  • The input unit of the user interface 90 receives an input for adjusting the start timing tb and the stop timing ta. For example, when determining that the timing difference is larger than the upper limit of the predetermined range, the operator performs, on the input unit, at least one of an input for delaying the start timing tb and an input for advancing the stop timing ta.
  • the input unit outputs the input information to the control unit 9.
  • the control unit 9 adjusts at least one of the start timing tb and the stop timing ta according to the input information. The same applies when the operator determines that the timing difference is smaller than the lower limit of the predetermined range.
  • the operator can visually recognize the time change of the statistics A1 and A2, and adjust the timing difference based on the time change.
  • the control unit 9 obtains the timing difference between the start timing tb and the stop timing ta based on the statistics of the pixel values, but is not limited thereto.
  • the control unit 9 may determine the timing difference between the start timing tb and the stop timing ta by machine learning in the monitoring process.
  • FIG. 14 is a diagram schematically showing an example of the internal configuration of the control unit 9.
  • the control unit 9 includes a classifier 91 and a machine learning unit 92. Each frame of the captured image from the camera 70 is sequentially input to the classifier 91.
  • the classifier 91 classifies the input frames into the following four categories C1 to C4 relating to the discharge / stop of the discharge nozzles 31a and 31b. Categories can also be called classes.
  • the four categories C1 to C4 are categories indicating the ejection states shown in FIGS. 5 to 8, respectively. More specifically, category C1 indicates a state where neither of the discharge nozzles 31a and 31b is discharging the processing liquid (FIG. 5), category C2 indicates a state where only the discharge nozzle 31a is discharging the processing liquid (FIG. 6), category C3 indicates a state where both of the discharge nozzles 31a and 31b are discharging the processing liquid (FIG. 7), and category C4 indicates a state where only the discharge nozzle 31b is discharging the processing liquid (FIG. 8).
  • This classifier 91 is generated by the machine learning unit 92 using a plurality of teacher data. That is, it can be said that the classifier 91 is a machine-learned classifier.
  • the machine learning unit 92 uses, for example, a nearest neighbor method, a support vector machine, a random forest, or a neural network (including deep learning) as a machine learning algorithm.
  • the teacher data includes learning data and a label indicating which category the learning data is classified into.
  • the learning data is a frame of a captured image captured by the camera 70, and is generated in advance.
  • a correct category is assigned to each learning data as a label. This assignment can be performed by an operator's operation on the user interface 90, for example.
  • the machine learning unit 92 generates a classifier 91 by performing machine learning based on the teacher data.
  • the classifier 91 includes a feature vector extraction unit 911, a determination unit 912, and a storage medium in which a determination database 913 is stored.
  • Each frame of the captured image from the camera 70 is sequentially input to the feature vector extraction unit 911.
  • the feature vector extraction unit 911 extracts a feature vector of a frame according to a predetermined algorithm. This feature vector is a vector that easily indicates a feature amount according to the ejection state of the ejection nozzles 31a and 31b. A known algorithm can be adopted as the algorithm.
  • the feature vector extraction unit 911 outputs the feature vector to the determination unit 912.
  • the determination database 913 stores a plurality of feature vectors (hereinafter referred to as reference vectors) generated from the plurality of teacher data by the machine learning unit 92, and each reference vector is classified into one of the categories C1 to C4. Specifically, the machine learning unit 92 generates a plurality of reference vectors by applying the same algorithm as the feature vector extraction unit 911 to the plurality of teacher data. Then, the machine learning unit 92 assigns the label (correct category) of the teacher data to each reference vector.
  • the determination unit 912 classifies the frame based on the feature vector input from the feature vector extraction unit 911 and the plurality of reference vectors stored in the determination database 913. For example, the determination unit 912 may specify the reference vector closest to the feature vector and classify the frame into the category of the specified reference vector (nearest neighbor method). Accordingly, the determination unit 912 can classify the frame input to the classifier 91 (the feature vector extraction unit 911) into one of the categories C1 to C4.
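The nearest neighbor classification described above can be sketched as follows. This is an illustrative outline; the function name, the Euclidean distance metric, and the category labels are assumptions, not the embodiment's actual implementation.

```python
import math

def classify_frame(feature, reference_db):
    """Nearest neighbor classification: `reference_db` is a list of
    (reference_vector, category) pairs from the determination database;
    the frame is assigned the category of the reference vector closest
    to its feature vector."""
    def dist(u, v):
        # Euclidean distance between two feature vectors
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    best_vector, best_category = min(reference_db,
                                     key=lambda pair: dist(feature, pair[0]))
    return best_category
```

For example, with reference vectors labeled C1 to C4, a feature vector near a C3 reference vector is classified into C3.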
  • the control unit 9 classifies each frame by the classifier 91, and obtains a timing difference between the start timing tb of the discharge nozzle 31b and the stop timing ta of the discharge nozzle 31a based on the classification result.
  • FIG. 15 is a flowchart illustrating an example of the operation of the control unit 9 in the monitoring process.
  • the control unit 9 specifies the start timing tb and the stop timing ta by machine learning as image processing.
  • FIG. 16 is a diagram schematically illustrating an example of a plurality of frames F of a captured image.
  • the frames F are arranged side by side in chronological order.
  • The reference sign F is used here merely for convenience of explanation; the frames F are of the same type as the frames IM1 to IM4 described above.
  • When the processing liquid Lq1 is discharged from the discharge nozzle 31a (step S4), the subsequent frames F[k+1] to F[m] are classified into the category C2 by the classifier 91, as illustrated in FIG. 16.
  • Next, the nozzle that discharges the processing liquid is switched from the discharge nozzle 31a to the discharge nozzle 31b (step S5). That is, the processing liquid Lq2 is discharged from the discharge nozzle 31b. Therefore, as illustrated in FIG. 16, the subsequent frames F[m+1] to F[n] are classified into the category C3 by the classifier 91. Next, the discharge of the processing liquid Lq1 from the discharge nozzle 31a is stopped. Therefore, as illustrated in FIG. 16, the frames after the frame F[n+1] and before the discharge of the processing liquid Lq2 from the discharge nozzle 31b ends are classified into the category C4 by the classifier 91.
  • the control unit 9 specifies the start timing tb and the stop timing ta based on the classification result of each frame, as described in detail below.
  • Specifically, the control unit 9 specifies the start timing tb of the ejection nozzle 31b based on the m-th frame F[m], which is classified into a category indicating that the ejection nozzle 31b is stopped (category C2), and the (m+1)-th frame F[m+1], which follows F[m] and is classified into a category indicating that the ejection nozzle 31b is ejecting (category C3). For example, the control unit 9 may specify the generation timing of the frame F[m] as the start timing tb, may specify the generation timing of the frame F[m+1] as the start timing tb, or may specify the average of the generation timings of the frames F[m] and F[m+1] as the start timing tb.
  • Similarly, the control unit 9 specifies the stop timing ta of the discharge nozzle 31a based on the n-th frame F[n], which is classified into a category indicating that the discharge nozzle 31a is ejecting (category C3), and the (n+1)-th frame F[n+1], which follows F[n] and is classified into a category indicating that the discharge nozzle 31a is stopped (category C4). For example, the control unit 9 may specify the generation timing of the frame F[n] as the stop timing ta, may specify the generation timing of the frame F[n+1] as the stop timing ta, or may specify the average of the generation timings of the frames F[n] and F[n+1] as the stop timing ta.
  • the ejection state of the ejection nozzles 31a and 31b can be determined (classified) by machine learning. Therefore, each frame can be classified with high accuracy. As a result, the timing difference between the stop timing ta and the start timing tb can be determined with high accuracy.
  • Steps S41 to S44 after specifying the stop timing ta and the start timing tb are the same as steps S12 to S15 in the first embodiment, respectively.
  • the control unit 9 may calculate the timing difference based on the number of frames classified into the category C3. For example, the control unit 9 may calculate the timing difference by multiplying the time between frames by the number of frames.
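The specification of the timings from the per-frame classification results can be sketched as follows. The frame-timing convention (frame k generated at k × frame_interval) and the averaged-transition-time choice are illustrative assumptions.

```python
def timings_from_categories(categories, frame_interval):
    """Locate the category transitions in the per-frame classification
    results: C2 -> C3 gives the start timing tb of nozzle 31b, and
    C3 -> C4 gives the stop timing ta of nozzle 31a.  Frame k is assumed
    to be generated at time k * frame_interval; each transition time is
    taken as the average of the two adjacent frame timings."""
    tb = ta = None
    for i in range(1, len(categories)):
        if categories[i - 1] == "C2" and categories[i] == "C3":
            tb = (i - 0.5) * frame_interval
        if categories[i - 1] == "C3" and categories[i] == "C4":
            ta = (i - 0.5) * frame_interval
    return tb, ta
```

With this convention, the timing difference ta − tb equals the number of frames classified into the category C3 multiplied by the time between frames, matching the alternative calculation mentioned above.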
  • the control unit 9 may cut out images indicating the ejection determination regions Ra1 and Rb1 in the frame F and input the images to the classifier 91.
  • images indicating the ejection determination regions Ra1 and Rb1 are adopted as the learning data input to the machine learning unit 92.
  • the classifier 91 can perform the classification by removing the influence of the area having low relation with the ejection state, the classification accuracy can be improved. As a result, the accuracy of specifying the start timing tb and the stop timing ta can be improved. Further, the use of the ejection determination regions Ra1 and Rb1 can reduce the processing compared to the case where the processing by the classifier 91 is performed on the entire frame of the captured image.
  • The classifier 91 may classify the discharge state for each image of the discharge determination regions Ra1 and Rb1. That is, the classifier 91 may classify the image of the ejection determination region Ra1 into the following two categories Ca1 and Ca2: category Ca1 indicates a state where the discharge nozzle 31a is not discharging the processing liquid Lq1, and category Ca2 indicates a state where the discharge nozzle 31a is discharging the processing liquid Lq1. Similarly, the classifier 91 may classify the image of the ejection determination region Rb1 into the following two categories Cb1 and Cb2: category Cb1 indicates a state where the discharge nozzle 31b is not discharging the processing liquid Lq2, and category Cb2 indicates a state where the discharge nozzle 31b is discharging the processing liquid Lq2.
  • In this case, the control unit 9 can specify the start timing tb of the discharge nozzle 31b based on a frame whose region image is classified into the category Cb1 and the next frame whose region image is classified into the category Cb2. The same applies to the stop timing ta of the discharge nozzle 31a.
  • As the feature vector, a pixel value group of pixels arranged in a horizontal line in each of the ejection determination regions Ra1 and Rb1 may be adopted. This is because, as shown in FIGS. 10 and 11, the pixel value group of the pixels arranged in the horizontal direction changes depending on whether or not the processing liquid is discharged.
  • Alternatively, an integrated value group may be employed that includes, for each column, an integrated value obtained by summing the pixel values arranged in a vertical line.
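The column-integrated feature just mentioned can be sketched as follows, assuming the region is given as a 2-D list of pixel values; the function name is illustrative.

```python
def column_integrated_feature(region):
    """Feature vector for an ejection determination region: for each
    column, integrate (sum) the pixel values lined up in the vertical
    direction.  `region` is a 2-D list of pixel values (rows x columns)."""
    num_cols = len(region[0])
    return [sum(row[c] for row in region) for c in range(num_cols)]
```

Columns covered by the discharged liquid would yield noticeably different integrated values from the background columns, which is what makes this a usable feature.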
  • FIG. 17 is a functional block diagram schematically illustrating an example of the internal configuration of the control unit 9.
  • the control unit 9 is the same as FIG. 14 except that the control unit 9 includes a plurality of classifiers 91.
  • the plurality of classifiers 91 are also generated by the machine learning unit 92.
  • the machine learning unit 92 generates a plurality of classifiers 91 for each type of the processing liquids Lq1 and Lq2.
  • three classifiers 91A to 91C are shown as the plurality of classifiers 91.
  • the classifiers 91A to 91C are classifiers generated using different teacher data.
  • each frame of a captured image when processing using a certain type of processing liquid Lq1 or Lq2 is employed as learning data.
  • the machine learning unit 92 generates a classifier 91A based on the teacher data including the learning data.
  • a classifier 91A for a set of types of the processing liquids Lq1 and Lq2 (hereinafter, a first set) can be generated.
  • Similarly, the machine learning unit 92 generates a classifier 91B for a second set of the processing liquids Lq1 and Lq2, different from the first set, based on teacher data using the second set, and generates a classifier 91C for a third set, different from the first and second sets, based on teacher data using the third set.
  • the feature vector extraction unit 911 and the determination unit 912 are provided in common in the classifiers 91A to 91C, and the classifiers 91A to 91C are distinguished by the determination database 913.
  • the machine learning unit 92 generates the classifiers 91A to 91C by generating the first to third determination databases 913A to 913C, respectively.
  • In the determination database 913A, reference vectors obtained when the first set of processing liquids Lq1 and Lq2 is used are recorded together with their correct categories. The same applies to the determination databases 913B and 913C.
  • FIG. 18 is a diagram illustrating an example of the input screen 90a displayed on the display of the user interface 90.
  • a table 901 relating to the processing liquids Lq1 and Lq2 is displayed on the input screen 90a.
  • In the table 901, the type and flow rate of each of the processing liquids Lq1 and Lq2, the rotation speed of the substrate W, and the processing time are shown.
  • Various kinds of information in the table 901 can be changed by input to an input unit by an operator. For example, by clicking (or touching) a corresponding part in the table 901, information on the corresponding part can be input. For example, the click displays a plurality of pieces of information in a corresponding portion in a pull-down format, and the operator can input information by selecting one of them.
  • a soft key 902 for selecting the type of the substrate W is also displayed.
  • the operator can select the soft key 902 by clicking or touching the soft key 902.
  • the type of the substrate W include a silicon (Si) substrate and a silicon carbide (SiC) substrate.
  • The determination database may also be selected depending on whether or not a film is formed on the upper surface of the substrate W, or on the type of the film formed on the upper surface of the substrate W (for example, SiO2, SiN, or TiN).
  • the input screen 90a illustrated in FIG. 18 shows an area 903 where a captured image generated by the camera 70 is displayed. Thereby, the worker can visually recognize the captured image of the camera 70.
  • the user interface 90 outputs the input information (the types of the processing liquids Lq1 and Lq2, etc.) input by the operator to the determination unit 912.
  • the feature vector extracting unit 911 extracts the feature vector of the input frame and outputs the feature vector to the determining unit 912.
  • the determination unit 912 selects the determination database 913 according to the types of the processing liquids Lq1 and Lq2 to be processed.
  • the determination unit 912 classifies the frame using the input feature vector and the selected reference vector of the determination database 913.
  • Thereby, the classification accuracy of the frames can be improved, and consequently the accuracy of calculating the timing difference can be improved.
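The selection of the determination database according to the processing-liquid set described above can be sketched as follows; the mapping structure and the liquid-type names used in the example are purely illustrative assumptions.

```python
def select_determination_database(databases, liquid_set):
    """Pick the determination database matching the set of processing
    liquids used in the current process.  `databases` maps a
    (Lq1 type, Lq2 type) pair to a database object (e.g. 913A-913C)."""
    if liquid_set not in databases:
        raise KeyError(f"no determination database for liquid set {liquid_set}")
    return databases[liquid_set]
```

The determination unit would call this with the liquid types received from the user interface and then classify frames using the reference vectors of the selected database.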
  • the classifier 91 is generated for each type of the processing liquids Lq1 and Lq2.
  • the classifier 91 may be generated for each type of the substrate W or each flow rate of the processing liquids Lq1 and Lq2.
  • the position of the ejection nozzle in the image captured by the camera 70 may be different.
  • the positional relationship between the discharge nozzles 31 of the processing liquid supply unit 30 and the positional relationship between the discharge nozzles 66 of the processing liquid supply unit 65 may be different.
  • the classifier 91 may be generated for each position of the discharge nozzle. These may be combined.
  • A classifier 91 corresponding to both the type of the set of the processing liquids Lq1 and Lq2 and the type of the substrate W may be generated. For example, when there are N types of sets of the processing liquids Lq1 and Lq2 and M types of substrates W, (N × M) classifiers 91 corresponding to the combinations may be generated.
  • The user interface 90 only needs to receive input of the information necessary for selecting the classifier 91 (at least one of the types of the processing liquids Lq1 and Lq2, the type of the substrate W, the flow rates of the processing liquids Lq1 and Lq2, and the positions of the discharge nozzles) and output the input information to the control unit 9.
  • the information required for selecting the classifier 91 does not always need to be input by the operator.
  • substrate information on the substrate W may be transmitted from the upstream side of the substrate processing apparatus 100 to the control unit 9, and the substrate information may include such information.
  • In this case, the control unit 9 may select the classifier 91 based on the information in the substrate information.
  • In the above example, the control unit 9 provided in the substrate processing apparatus 100 generates the classifier 91 by machine learning and classifies the frames by the classifier 91.
  • However, a part of these functions may be provided in a server.
  • FIG. 19 is a functional block diagram schematically showing an example of an electrical configuration of the substrate processing system.
  • the substrate processing system includes a substrate processing apparatus 100 and a server 200.
  • the control unit 9 of the substrate processing apparatus 100 communicates with the server 200 via the communication unit 93.
  • the communication unit 93 is a communication interface, and can communicate with the server 200 by wire or wirelessly.
  • the server 200 includes a machine learning unit 210 and a storage medium in which the determination database 220 is stored.
  • the machine learning unit 210 has the same function as the machine learning unit 92, and can generate the determination database 220 based on the teacher data.
  • the judgment database 220 is similar to the judgment database 913.
  • the determination unit 912 transmits a request signal requesting the determination database 220 to the server 200 via the communication unit 93.
  • the server 200 transmits the determination database 220 to the communication unit 93 according to the request signal.
  • the determination unit 912 can use the determination database 220 stored in the server 200. In this case, the machine learning unit 92 and the determination database 913 are not always necessary.
  • the determination unit 912 is provided in the control unit 9 of the substrate processing apparatus 100, but the determination unit 912 may be provided in the server 200.
  • the feature vector extraction unit 911 transmits the feature vector to the server 200 via the communication unit 93.
  • the server 200 classifies the frame based on the received feature vector and the judgment database 220, and transmits the classification result to the communication unit 93.
  • the communication unit 93 outputs the classification result to the control unit 9.
  • the feature vector extraction unit 911 is provided in the control unit 9 of the substrate processing apparatus 100, but the feature vector extraction unit 911 may be provided in the server 200. That is, the classifier 91 itself may be provided in the server 200.
  • the control unit 9 transmits the frame of the captured image generated by the camera 70 to the server 200 via the communication unit 93.
  • the server 200 extracts a feature vector from the frame, classifies the frame using the extracted feature vector and the determination database 220, and transmits the classification result to the communication unit 93.
  • the communication unit 93 outputs the classification result to the control unit 9.
  • the server 200 is provided with the determination processing function, so that a common determination can be made for a plurality of substrate processing units.
  • the machine learning unit 210 of the server 200 may generate the determination database 220 for at least one of the types of the processing liquids Lq1 and Lq2, the type of the substrate W, the flow rates of the processing liquids Lq1 and Lq2, and the positions of the discharge nozzles 31a and 31b.
  • the control unit 9 may transmit information specifying the determination database 220 to be used to the server 200 via the communication unit 93.
  • FIG. 20 shows a model of the neural network (including deep learning) NN1.
  • This model includes an input layer, an intermediate layer (hidden layer), and an output layer.
  • Each layer has a plurality of nodes (artificial neurons), and each node receives the weighted outputs of the nodes in the preceding layer. That is, each node receives the outputs of the preceding nodes multiplied by their respective weighting coefficients.
  • Each node outputs, for example, the result of applying a known activation function to these inputs.
  • the number of intermediate layers is not limited to one and can be set arbitrarily.
  • the machine learning unit 92 determines weighting coefficients used for weighting between nodes by performing learning based on teacher data. This weighting coefficient is stored as a determination database. Thereby, the machine learning unit 92 can substantially generate the classifier 91.
  • the frame F generated by the camera 70 is input to the input layer.
  • the classifier 91 uses the respective weighting factors stored in the determination database to compute, from the frame F, the output layer values through the intermediate layer, thereby calculating the probability that the frame F corresponds to each of the categories C1 to C4. Then, the classifier 91 classifies the frame into the category having the highest probability.
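A minimal sketch of this forward pass, assuming a single hidden layer and illustrative weight shapes (the real weighting coefficients would come from the determination database, and the 8-element input stands in for a frame-derived vector):

```python
import numpy as np

# Sketch only: one hidden layer, softmax over the four categories C1..C4.
# Random weights stand in for the stored determination-database values.

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 16)), np.zeros(16)   # input -> hidden
W2, b2 = rng.standard_normal((16, 4)), np.zeros(4)    # hidden -> C1..C4

def classify(x):
    h = np.maximum(0.0, x @ W1 + b1)        # hidden layer (ReLU)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max())
    p /= p.sum()                            # probability of each category
    return int(np.argmax(p)), p             # category with highest probability

category, probs = classify(rng.standard_normal(8))
```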
  • frames can be classified by a neural network.
  • Since the classifier 91 automatically generates feature values, there is no need for a designer to determine a feature vector.
  • a plurality of classifiers 91 may be generated in the neural network.
  • the machine learning unit 92 may perform machine learning using the teacher data for each type of the processing liquids Lq1 and Lq2, and generate a determination database (weighting coefficient) for each type.
  • the classifier 91 specifies the types of the processing liquids Lq1 and Lq2 according to the input information from the user interface 90, and classifies the frame F using a determination database according to the types. Thereby, the classification accuracy can be improved.
  • the classifier 91 may be generated not only for the types of the processing liquids Lq1 and Lq2 but also for at least one of the type of the substrate W, the flow rates of the processing liquids Lq1 and Lq2, and the positions of the discharge nozzles 31a and 31b.
  • Second embodiment: An example of the configuration of the substrate processing apparatus 100 according to the second embodiment is similar to that of the first embodiment, and an example of the operation is as shown in FIG. However, a specific example of the monitoring process differs from that of the first embodiment.
  • the control unit 9 performs image processing on the captured image captured by the camera 70 to detect liquid splash generated when switching from the discharge nozzle 31a to the discharge nozzle 31b. That is, the monitoring process monitors whether or not liquid splash has occurred.
  • FIG. 21 is a diagram illustrating an example of a frame of a captured image generated by the camera 70.
  • the processing liquids Lq1 and Lq2 are discharged from the discharge nozzles 31a and 31b, respectively, and liquid splash occurs. The liquid splash is generated near the liquid landing positions of the discharge nozzles 31a and 31b.
  • the liquid splash determination area R2 can be set in the captured image to detect the liquid splash. Specifically, the liquid splash determination region R2 is set near the ejection nozzles 31a and 31b in each frame of the captured image.
  • the liquid splash determination region R2 is a region within the region where the substrate W is captured, and is set to a region that does not overlap with the discharge determination regions Ra1 and Rb1.
  • the liquid splash determination area R2 has determination areas R21 and R22, which are set near the ejection nozzles 31a and 31b. More specifically, the determination regions R21 and R22 are located on opposite sides of a pair of the ejection nozzles 31a and 31b in the horizontal direction of the captured image. In the example of FIG. 21, the determination regions R21 and R22 have a rectangular shape.
  • the control unit 9 calculates a statistic B1, which is the sum or variance (for example, standard deviation) of the pixel values in the determination region R21, and a statistic B2, which is the sum or variance (for example, standard deviation) of the pixel values in the determination region R22.
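The statistics B1 and B2 can be sketched as simple reductions over a rectangular determination region of a grayscale frame; the region coordinates and pixel values below are illustrative assumptions:

```python
import numpy as np

# Sketch: statistic B (sum or standard deviation of pixel values) over a
# rectangular determination region. Coordinates are hypothetical.

def region_statistic(frame, top, left, height, width, mode="std"):
    region = frame[top:top + height, left:left + width]
    return float(region.sum() if mode == "sum" else region.std())

frame = np.zeros((480, 640))
frame[100:110, 200:210] = 255.0   # bright droplets inside region R21
b1 = region_statistic(frame, 100, 200, 20, 20, mode="std")  # region R21
b2 = region_statistic(frame, 100, 400, 20, 20, mode="std")  # empty region R22
```

When droplets appear in a region, both the sum and the spread of pixel values rise, which is what the threshold comparison below exploits.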
  • FIG. 22 is a graph showing an example of a temporal change of the statistics B1 and B2.
  • the example of FIG. 22 shows the statistics B1 and B2 when the liquid splash occurs in step S5.
  • the statistics B1 and B2 increase and exceed the threshold th2.
  • the threshold value th2 can be set in advance by, for example, an experiment or a simulation.
  • An example of the operation of the control unit 9 is the same as the flowchart of FIG. However, the specific contents of the monitoring process in step S10 differ. In the second embodiment, this monitoring process monitors whether or not liquid splash has occurred based on the captured image generated by the camera 70.
  • FIG. 23 is a flowchart illustrating an example of the operation of the monitoring process according to the second embodiment.
  • the control unit 9 initializes the value nn to 1.
  • the control unit 9 determines whether or not liquid splash has occurred based on the nn-th frame F [nn]. Specifically, the control unit 9 calculates the statistics B1 and B2 of the pixel values in the determination regions R21 and R22 of the frame F [nn].
  • the control unit 9 determines whether at least one of the statistics B1 and B2 is equal to or larger than the threshold th2. When both statistics B1 and B2 are less than the threshold th2, it is determined that no liquid splash has occurred; in step S32, the control unit 9 updates the value nn by adding 1 to it, and step S31 is executed for the updated value nn. That is, when both statistics B1 and B2 are less than the threshold th2, it is determined that no liquid splash has occurred, and the determination in step S31 is performed for the next frame.
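The loop in steps S31 and S32 can be sketched as follows, under the assumption that the per-frame statistics B1 and B2 have already been computed; th2 is an illustrative threshold, not a value from the patent:

```python
# Sketch of the monitoring loop (steps S31-S32): frames are checked in
# order until a statistic reaches the threshold th2.

TH2 = 50.0

def first_splash_frame(frame_stats):
    """frame_stats: sequence of (B1, B2) per frame, in order.
    Returns the 1-based index nn of the first frame judged to contain
    liquid splash, or None when no frame reaches the threshold."""
    for nn, (b1, b2) in enumerate(frame_stats, start=1):
        if b1 >= TH2 or b2 >= TH2:
            return nn           # splash detected -> error notification (S33)
    return None                 # no splash in any frame

hit = first_splash_frame([(1.0, 2.0), (3.0, 60.0), (0.0, 0.0)])
```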
  • the control unit 9 performs error notification processing in step S33.
  • the control unit 9 causes the display of the user interface 90 to display an error.
  • a sound output unit such as a buzzer or a speaker
  • the control unit 9 may output an error to the sound output unit.
  • In step S34, the control unit 9 adjusts at least one of the start timing tb and the stop timing ta. That is, when at least one of the statistics B1 and B2 is equal to or greater than the threshold th2, it is determined that liquid splash has been detected, and the control unit 9 adjusts the timing difference. Specifically, at least one of the start timing tb and the stop timing ta is adjusted so that the timing difference becomes small.
  • the reduction amount of the timing difference may be determined in advance. That is, in step S34, the control unit 9 may reduce the timing difference by a predetermined reduction amount. Then, the control unit 9 executes the operation of FIG. 4 again. At this time, when liquid splash is detected again in step S31, the timing difference is reduced again by the predetermined reduction amount in step S34. By repeating this series of operations, the timing difference is adjusted to a value that does not cause liquid splash.
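This repeated reduce-and-retry adjustment can be sketched as a loop; the splash model, step size, and timing units below are illustrative stand-ins for the real detection and predetermined amount:

```python
# Sketch of step S34 repeated: whenever splash is detected, the timing
# difference is reduced by a fixed predetermined amount and retried.
# splash_model is a hypothetical stand-in for the camera-based detection.

def adjust_timing_difference(diff, splash_model, step=10.0, max_iter=20):
    for _ in range(max_iter):
        if not splash_model(diff):    # no splash -> keep this difference
            return diff
        diff -= step                  # reduce by the predetermined amount
    return diff

# illustrative model: splash occurs while the difference exceeds 25 ms
final = adjust_timing_difference(60.0, lambda d: d > 25.0)
```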
  • the timing difference is adjusted so that the liquid splash does not occur. This can avoid or suppress the generation of liquid splash in the subsequent processing.
  • control unit 9 detects the liquid splash based on the statistics B1 and B2 of the pixel values, but is not limited thereto.
  • the control unit 9 may detect the splash by machine learning.
  • the classifier 91 classifies each frame input from the camera 70 into the following two categories C11 and C12. That is, category C11 is a category indicating a state in which liquid splash has not occurred, and category C12 is a category indicating a state in which liquid splash has occurred.
  • Teacher data is generated by adopting a frame of an image captured by the camera 70 as learning data and assigning a correct category as a label to the learning data.
  • the machine learning unit 92 generates a judgment database 913 based on the teacher data.
  • In step S31, the control unit 9 determines whether or not liquid splash is detected based on the classification result of the classifier 91 for the frame F [nn]. Specifically, when the frame F [nn] is classified into the category C11, the control unit 9 determines that liquid splash is not detected, and when the frame F [nn] is classified into the category C12, it determines that liquid splash has been detected.
  • the frame F [nn] is classified into the category C11 or the category C12 by the classifier 91 generated by the machine learning. Therefore, frames can be classified with high classification accuracy. As a result, liquid splash can be detected with high detection accuracy.
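As a sketch of the two-category decision, a nearest-centroid rule is used below in place of the actual machine-learned classifier; the feature vectors and labels are illustrative teacher data, not values from the patent:

```python
import numpy as np

# Sketch: two-category splash classifier (C11: no splash, C12: splash)
# trained from labelled feature vectors. Nearest-centroid stands in for
# the real machine-learned judgment database.

def train(features, labels):
    """Return one centroid per category from labelled feature vectors."""
    features, labels = np.asarray(features, float), np.asarray(labels)
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(centroids, x):
    x = np.asarray(x, float)
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

centroids = train([[0.1, 0.2], [0.0, 0.1], [5.0, 4.8], [4.9, 5.1]],
                  ["C11", "C11", "C12", "C12"])
result = classify(centroids, [4.5, 4.5])   # high-statistic frame
```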
  • a plurality of classifiers 91 may be generated as in the first embodiment, and some or all of the classifier 91 and the machine learning unit 92 may be provided in the server 200.
  • FIG. 24 is a diagram illustrating an example of the configuration of the processing unit 1A.
  • the processing unit 1A is the same as the processing unit 1 except for the presence or absence of the processing liquid supply unit 80.
  • the processing liquid supply unit 80 has a discharge nozzle 81, and the discharge nozzle 81 is provided on a side of the substrate W and at a position higher than the upper surface of the substrate W.
  • the discharge nozzle 81 discharges the processing liquid Lq3 along a substantially horizontal direction so that the processing liquid lands on the upper surface of the substrate W.
  • the processing liquid Lq3 is discharged in an arc shape from the tip of the discharge nozzle 81 and reaches near the center of the upper surface of the substrate W.
  • the discharge nozzle 81 is connected to a processing liquid supply source 84 via a pipe 82.
  • An on-off valve 83 is provided in the middle of the pipe 82. When the on-off valve 83 is opened, the processing liquid Lq3 from the processing liquid supply source 84 flows through the inside of the pipe 82 and is discharged from the discharge nozzle 81.
  • a processing liquid is sequentially supplied to the substrate W using a discharge nozzle (for example, the discharge nozzle 31a) positioned above the substrate W and a discharge nozzle 81 positioned on the side of the substrate W.
  • a discharge nozzle for example, the discharge nozzle 31a
  • the processing on the substrate W is performed with the processing liquid Lq1 from the discharge nozzle 31a, and thereafter, the nozzle for discharging the processing liquid is switched from the discharge nozzle 31a to the discharge nozzle 81, and the processing liquid Lq3 from the discharge nozzle 81 is applied to the substrate W.
  • the camera 70 is provided at a position where the tip of the discharge nozzles 31a, 31b, 81 is included in the imaging area. Each frame of the captured image generated by the camera 70 is sequentially output to the control unit 9.
  • FIG. 25 is a diagram schematically illustrating an example of a frame IM6 of a captured image generated by the camera 70.
  • the discharge nozzles 31a and 81 discharge the processing liquids Lq1 and Lq3, respectively. That is, the frame IM6 is a frame at the switching timing at which both of the nozzles 31a and 81 discharge the processing liquid.
  • An ejection determination region Rc1 is set in the frame IM6.
  • the discharge determination region Rc1 is provided on the discharge path of the processing liquid Lq3 from the tip of the discharge nozzle 81 to the liquid landing position on the substrate W, and extends, for example, in the direction in which the processing liquid Lq3 extends from the tip of the discharge nozzle 81.
  • the ejection determination region Rc1 has, for example, a rectangular shape.
  • the control unit 9 determines whether or not the processing liquid Lq3 is discharged from the discharge nozzle 81 based on the magnitude of the statistic of the pixel values in the discharge determination region Rc1, similarly to the discharge nozzles 31a and 31b, and identifies the start timing of the discharge nozzle 81.
  • As the statistic, a sum or a variance of the pixel values in the ejection determination region Rc1 can be adopted.
  • Since the discharge nozzle 81 discharges the processing liquid Lq3 in the horizontal direction, the variation in the luminance distribution in the horizontal direction is small, and the characteristics of the processing liquid Lq3 appear in the luminance distribution in the vertical direction. Therefore, the variance of the pixels arranged in a line in the vertical direction may be adopted as the statistic.
  • the pixel values of the pixels arranged in the horizontal direction may be integrated for each row, and the variance of the integrated value for each row may be adopted.
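The row-integrated variance just described can be sketched as follows; the region size and pixel values are illustrative, but the computation follows the suggestion above (integrate each horizontal row, then take the variance of the per-row sums):

```python
import numpy as np

# Sketch: statistic for the horizontally discharged liquid Lq3. A bright
# horizontal liquid column changes the vertical (row-wise) luminance
# distribution, raising the variance of the per-row integrated values.

def row_sum_variance(region):
    row_sums = region.sum(axis=1)    # integrate pixel values along each row
    return float(row_sums.var())     # variance of the integrated values

region = np.zeros((40, 100))
no_liquid = row_sum_variance(region)      # uniform region -> variance 0
region[18:22, :] = 200.0                  # bright horizontal liquid column
with_liquid = row_sum_variance(region)
```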
  • The control unit 9 also specifies the stop timing of the discharge nozzle 31a.
  • the control unit 9 calculates a timing difference between the start timing of the discharge nozzle 81 and the stop timing of the discharge nozzle 31a. For example, the timing difference is calculated by subtracting the start timing from the stop timing. Then, the control unit 9 determines whether or not the timing difference is out of the predetermined range. When the timing difference is outside the predetermined range, the control unit 9 adjusts one of the stop timing of the discharge nozzle 31a and the start timing of the discharge nozzle 81 so that the timing difference falls within the predetermined range.
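The range check and adjustment just described can be sketched as follows; the range bounds and timing units are illustrative assumptions, and moving only the stop timing is one of the two adjustment choices mentioned above:

```python
# Sketch: timing difference = stop timing of nozzle 31a minus start
# timing of nozzle 81, tested against a predetermined range [lo, hi].

def check_and_adjust(stop_t, start_t, lo=-5.0, hi=5.0):
    diff = stop_t - start_t
    if lo <= diff <= hi:
        return stop_t, start_t           # already within the range
    target = min(max(diff, lo), hi)      # clamp the difference into range
    return start_t + target, start_t     # adjust the stop timing

stop_t, start_t = check_and_adjust(120.0, 100.0)   # difference 20 -> adjust
```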
  • the timing difference between the stop timing of the discharge nozzle 31a and the start timing of the discharge nozzle 81 can be adjusted.
  • Whether or not the processing liquid Lq3 is discharged from the discharge nozzle 81 may be determined by a machine-learned classifier as in the second embodiment.
  • a liquid splash determination region R3 is provided.
  • the liquid splash determination region R3 is provided within a region where the substrate W is captured, and is located in a region opposite to the ejection nozzle 81 with respect to the center of the substrate W.
  • the liquid splash determination region R3 has, for example, a rectangular shape.
  • the control unit 9 determines the presence or absence of the liquid splash based on the magnitude of the statistic of the pixel value in the liquid splash determination region R3, similarly to the liquid splash determination region R2. As the statistic here, the sum or variance of the pixel values in the liquid splash determination region R3 can be adopted. Then, when the liquid splash occurs, the control unit 9 adjusts the timing difference between the start timing of the discharge nozzle 81 and the stop timing of the discharge nozzle 31a so that the liquid splash does not occur.
  • the determination of the presence or absence of the liquid splash may be performed by a machine-learned classifier as in the fourth embodiment.
  • both the first embodiment and the second embodiment may be performed to monitor both the timing difference and the liquid splash.
  • a semiconductor substrate has been described as the substrate W, the present invention is not limited to this.
  • a substrate such as a glass substrate for a photomask, a glass substrate for a liquid crystal display, a glass substrate for a plasma display, a substrate for an FED (Field Emission Display), a substrate for an optical disk, a substrate for a magnetic disk, or a substrate for a magneto-optical disk may be used.
  • the present embodiment can be applied to any apparatus that performs a predetermined process by discharging a processing liquid from a movable nozzle to a substrate.
  • For example, the present embodiment may be applied to a rotary coating apparatus (spin coater) that discharges a photoresist liquid from a nozzle onto a rotating substrate to apply a resist.
  • The technology according to the present embodiment may also be applied to a device that discharges a film removing liquid from a nozzle to the edge of a substrate on whose surface a film is formed, or to a device that discharges an etching liquid from a nozzle to the surface of a substrate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Cleaning Or Drying Semiconductors (AREA)
  • Application Of Or Painting With Fluid Materials (AREA)
  • Wetting (AREA)
  • Exposure Of Semiconductors, Excluding Electron Or Ion Beam Exposure (AREA)

Abstract

The invention relates to a substrate processing method comprising first to sixth steps. A substrate is held (first step). Image capture by a camera is started to generate a captured image (second step). Discharge of a processing liquid from a first nozzle toward the substrate is started (third step). Discharge of the processing liquid from the first nozzle is stopped, and discharge of a processing liquid from a second nozzle is started (fourth step). The timing difference between the start timing of the discharge of the processing liquid from the second nozzle and the stop timing of the discharge of the processing liquid from the first nozzle is obtained based on image processing performed on the captured image (fifth step). When the timing difference is determined to be outside a prescribed range, the start timing and/or the stop timing are adjusted such that the timing difference falls within the prescribed range (sixth step).
PCT/JP2019/026589 2018-08-20 2019-07-04 Procédé de traitement de substrat, dispositif de traitement de substrat et système de traitement de substrat WO2020039765A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020217004619A KR102509854B1 (ko) 2018-08-20 2019-07-04 기판 처리 방법, 기판 처리 장치 및 기판 처리 시스템
CN201980054879.9A CN112640054B (zh) 2018-08-20 2019-07-04 基板处理方法以及基板处理装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-154079 2018-08-20
JP2018154079A JP7177628B2 (ja) 2018-08-20 2018-08-20 基板処理方法、基板処理装置および基板処理システム

Publications (1)

Publication Number Publication Date
WO2020039765A1 true WO2020039765A1 (fr) 2020-02-27

Family

ID=69593077

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026589 WO2020039765A1 (fr) 2018-08-20 2019-07-04 Procédé de traitement de substrat, dispositif de traitement de substrat et système de traitement de substrat

Country Status (5)

Country Link
JP (1) JP7177628B2 (fr)
KR (1) KR102509854B1 (fr)
CN (1) CN112640054B (fr)
TW (1) TWI702649B (fr)
WO (1) WO2020039765A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200083065A1 (en) * 2018-09-11 2020-03-12 Soitec Process for treating an soi substrate in a single wafer cleaner
WO2024210123A1 (fr) * 2023-04-07 2024-10-10 株式会社Screenホールディングス Dispositif de traitement de substrat et procédé de traitement de substrat

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021152762A (ja) * 2020-03-24 2021-09-30 株式会社Screenホールディングス 学習済みモデル生成方法、学習済みモデル、異常要因推定装置、基板処理装置、異常要因推定方法、学習方法、学習装置、及び、学習データ作成方法
KR102368201B1 (ko) * 2020-04-08 2022-03-02 이지스로직 주식회사 스핀 코터의 포토레지스트 코팅 품질 검사 시스템
KR102327761B1 (ko) * 2020-04-08 2021-11-19 주식회사 이지스로직 Dvs와 딥러닝을 이용한 스핀 코터의 포토레지스트 코팅 품질 검사 시스템
KR102324162B1 (ko) * 2020-04-08 2021-11-10 이지스로직 주식회사 포토레지스트 코팅 품질 검사가 가능한 스핀 코터
US11699595B2 (en) * 2021-02-25 2023-07-11 Applied Materials, Inc. Imaging for monitoring thickness in a substrate cleaning system
KR102585478B1 (ko) * 2021-10-14 2023-10-10 주식회사 램스 딥러닝을 이용한 스핀 코터의 포토레지스트 도포 상태 검사 시스템
JP2023127856A (ja) * 2022-03-02 2023-09-14 株式会社Screenホールディングス 基板処理方法、及び基板処理装置
JP2024047494A (ja) * 2022-09-26 2024-04-05 株式会社Screenホールディングス 学習装置、情報処理装置、基板処理装置、基板処理システム、学習方法および処理条件決定方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11330041A (ja) * 1998-05-07 1999-11-30 Dainippon Screen Mfg Co Ltd エッチング液による基板処理装置
JP2003273003A (ja) * 2002-03-15 2003-09-26 Dainippon Screen Mfg Co Ltd 基板処理装置
JP2016072344A (ja) * 2014-09-29 2016-05-09 株式会社Screenホールディングス 基板処理装置および基板処理方法

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3481416B2 (ja) * 1997-04-07 2003-12-22 大日本スクリーン製造株式会社 基板処理装置及び方法
JP4601452B2 (ja) * 2005-02-22 2010-12-22 大日本スクリーン製造株式会社 基板処理装置
JP2010151925A (ja) * 2008-12-24 2010-07-08 Hitachi High-Technologies Corp 基板処理装置、フラットパネルディスプレイの製造装置およびフラットパネルディスプレイ
JP2009218622A (ja) * 2009-06-29 2009-09-24 Canon Anelva Corp 基板処理装置及び基板処理装置における基板位置ずれ補正方法
JP6251086B2 (ja) * 2014-03-12 2017-12-20 株式会社Screenホールディングス 基板処理装置および基板処理方法
JP6278759B2 (ja) 2014-03-11 2018-02-14 株式会社Screenホールディングス 基板処理装置および基板処理方法
JP2016122681A (ja) * 2014-12-24 2016-07-07 株式会社Screenホールディングス 基板処理装置および基板処理方法
JP6635869B2 (ja) * 2015-06-16 2020-01-29 東京エレクトロン株式会社 処理装置、処理方法および記憶媒体
JP6541491B2 (ja) 2015-07-29 2019-07-10 株式会社Screenホールディングス 流下判定方法、流下判定装置および吐出装置



Also Published As

Publication number Publication date
CN112640054B (zh) 2024-10-18
KR102509854B1 (ko) 2023-03-14
KR20210031952A (ko) 2021-03-23
JP2020031083A (ja) 2020-02-27
JP7177628B2 (ja) 2022-11-24
CN112640054A (zh) 2021-04-09
TWI702649B (zh) 2020-08-21
TW202010003A (zh) 2020-03-01

Similar Documents

Publication Publication Date Title
WO2020039765A1 (fr) Procédé de traitement de substrat, dispositif de traitement de substrat et système de traitement de substrat
JP7179568B2 (ja) 基板処理方法および基板処理装置
TWI743522B (zh) 基板處理方法、基板處理裝置以及基板處理系統
WO2021241228A1 (fr) Procédé de traitement de substrat et dispositif de traitement de substrat
CN109560012B (zh) 衬底处理装置及衬底处理方法
US10985038B2 (en) Determination method and substrate processing equipment
WO2020071206A1 (fr) Dispositif de traitement de substrat et procédé de traitement de substrat
JP7219190B2 (ja) 教師データ生成方法および吐出状態の判定方法
TWI746907B (zh) 煙霧判定方法、基板處理方法及基板處理裝置
US20220238346A1 (en) Substrate processing apparatus, substrate processing method, and non-transitory computer-readable storage medium
JP6489524B2 (ja) 基板処理装置
JP7071209B2 (ja) 処理液吐出装置、処理液吐出方法、および基板処理装置
CN112509940B (zh) 基板处理装置以及基板处理方法
JP2019102784A (ja) ヒューム判定方法、基板処理方法、および基板処理装置
WO2020045176A1 (fr) Procédé de génération de données d'apprentissage et procédé de détermination d'état d'évacuation
JP2021044467A (ja) 検知装置、および、検知方法
TW202430868A (zh) 動作監視方法及製造裝置
TW202326121A (zh) 動作監視方法及製造裝置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852346

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20217004619

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852346

Country of ref document: EP

Kind code of ref document: A1