WO2020045176A1 - Teacher data generation method and discharge state determination method - Google Patents

Teacher data generation method and discharge state determination method

Info

Publication number
WO2020045176A1
Authority
WO
WIPO (PCT)
Prior art keywords
processing liquid
substrate
image data
teacher data
discharge
Prior art date
Application number
PCT/JP2019/032550
Other languages
English (en)
Japanese (ja)
Inventor
有史 沖田
英司 猶原
達哉 増井
央章 角間
Original Assignee
SCREEN Holdings Co., Ltd.
Priority date
Filing date
Publication date
Priority claimed from JP2019149428A (JP7219190B2)
Application filed by SCREEN Holdings Co., Ltd.
Priority to KR1020217004602A (KR102524704B1)
Priority to CN201980055179.1A (CN112601617A)
Publication of WO2020045176A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B05: SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D: PROCESSES FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D1/00: Processes for applying liquids or other fluent materials
    • B05D1/40: Distributing applied liquids or other fluent materials by members moving relatively to surface
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B05: SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D: PROCESSES FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D3/00: Pretreatment of surfaces to which liquids or other fluent materials are to be applied; After-treatment of applied coatings, e.g. intermediate treating of an applied coating preparatory to subsequent applications of liquids or other fluent materials
    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01L: SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00: Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02: Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/027: Making masks on semiconductor bodies for further photolithographic processing not provided for in group H01L21/18 or H01L21/34
    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01L: SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00: Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02: Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/04: Manufacture or treatment of semiconductor devices or of parts thereof the devices having potential barriers, e.g. a PN junction, depletion layer or carrier concentration layer
    • H01L21/18: Manufacture or treatment of semiconductor devices or of parts thereof the devices having potential barriers, e.g. a PN junction, depletion layer or carrier concentration layer, the devices having semiconductor bodies comprising elements of Group IV of the Periodic Table or AIIIBV compounds with or without impurities, e.g. doping materials
    • H01L21/30: Treatment of semiconductor bodies using processes or apparatus not provided for in groups H01L21/20 - H01L21/26
    • H01L21/302: Treatment of semiconductor bodies using processes or apparatus not provided for in groups H01L21/20 - H01L21/26 to change their surface-physical characteristics or shape, e.g. etching, polishing, cutting
    • H01L21/304: Mechanical treatment, e.g. grinding, polishing, cutting

Definitions

  • the present invention relates to a teacher data generation method and a discharge state determination method.
  • the substrate processing apparatus includes a substrate holding unit that holds the substrate in a horizontal position, a rotation mechanism that rotates the substrate holding unit to rotate the substrate in a horizontal plane, and a discharge nozzle that discharges a processing liquid from above the substrate.
  • when the processing liquid is discharged from the discharge nozzle toward the substrate, the processing liquid lands near the center of the substrate, spreads over the substrate under the centrifugal force accompanying the rotation of the substrate, and is scattered from the periphery of the substrate. As a result, the processing liquid acts on the entire surface of the substrate, and processing according to the processing liquid can be performed on the substrate.
  • examples of the treatment liquid include chemical liquids such as SC1 liquid (a mixture of aqueous ammonia, hydrogen peroxide and water), SC2 liquid (a mixture of hydrochloric acid, hydrogen peroxide and water) and DHF liquid (dilute hydrofluoric acid), and rinsing liquids such as pure water.
  • a technique has also been proposed for monitoring the discharge state of the processing liquid from the discharge nozzle with a camera (for example, Patent Document 1).
  • in Patent Document 1, an imaging area including the tip of a discharge nozzle is imaged by a camera, and whether or not the processing liquid is being discharged from the discharge nozzle is determined based on the image data obtained by the camera.
  • as the classification categories, categories according to the discharge state of the processing liquid from the discharge nozzle can be adopted. Specifically, a category indicating that the discharge state of the processing liquid is normal, a category indicating that the processing liquid has not been discharged yet, and a category indicating that the discharge state of the processing liquid is defective can be adopted.
  • the classifier classifies the image data acquired by the camera into the corresponding category.
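As an illustration only (not the patent's own method), such a three-way classification can be mimicked with a toy rule on how much of the column directly below the nozzle tip is bright in a grayscale frame; the ROI geometry, the 128 brightness cutoff and the 0.8/0.1 fractions are all assumed values for this sketch.

```python
import numpy as np

# Hypothetical category names for the three discharge states.
CATEGORIES = ("normal_discharge", "dripping", "discharge_stopped")

def classify_frame(frame: np.ndarray, tip_row: int, tip_col: int,
                   half_width: int = 3) -> str:
    """Classify one frame into one of the three discharge-state categories."""
    # Region extending from the nozzle tip in the discharge direction (down).
    roi = frame[tip_row:, tip_col - half_width:tip_col + half_width + 1]
    filled = (roi > 128).mean()  # fraction of bright ("liquid") pixels
    if filled > 0.8:
        return "normal_discharge"   # continuous liquid column
    if filled > 0.1:
        return "dripping"           # partial column: isolated droplets
    return "discharge_stopped"      # no liquid visible
```

In the patent's scheme this hand-written rule would be replaced by a classifier generated through machine learning on labeled teacher data.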
  • an object of the present application is to provide a teacher data generation method capable of generating teacher data according to the discharge state of a discharge nozzle.
  • a first aspect of the teacher data generation method is a method for generating teacher data on a discharge state of a discharge nozzle in a substrate processing apparatus, comprising: a holding step of holding a substrate substantially horizontally on a substrate holding mechanism; a processing liquid discharging step of discharging a processing liquid from the discharge nozzle toward the main surface of the substrate; an imaging step of imaging, with a camera, an imaging area adjusted to include the discharge nozzle and at least a part of the processing liquid discharged in the processing liquid discharging step, over a period including at least a part of the processing liquid discharging step, thereby acquiring a plurality of image data; and a teacher data generation step of generating teacher data from at least one of the plurality of image data according to the discharge state of the processing liquid captured in the image data.
  • a second aspect of the teacher data generation method is the teacher data generation method according to the first aspect, wherein the teacher data generation step includes: a storage step of storing, in a storage medium as teacher data candidates, the image data obtained by the camera in a candidate period that includes a timing at which the discharge state of the processing liquid changes and is shorter than the whole imaging period; a selection step of selecting image data to be adopted as teacher data from the candidates; and a label applying step of applying a label according to the discharge state of the processing liquid reflected in the teacher data.
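As an illustration of the storage step (the window lengths are assumptions, not values from the patent), the candidate frames can be restricted to a short window around the timing at which the discharge state changes, e.g. when an on-off valve is opened or closed:

```python
def select_candidates(timestamps, change_time, before=0.5, after=1.0):
    """Return indices of frames captured in a candidate period around the
    discharge-state change; the 0.5 s / 1.0 s window is illustrative."""
    return [i for i, t in enumerate(timestamps)
            if change_time - before <= t <= change_time + after]
```

Only these frames would then be written to the storage medium as teacher data candidates, keeping the pool small enough to label by hand.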
  • a third aspect of the teacher data generation method is the teacher data generation method according to the first or second aspect, wherein image data erroneously classified into a certain category by a classifier generated based on the teacher data is used as teacher data, and a second label different from the first label corresponding to that category is applied to the teacher data.
  • a fourth aspect of the method for generating teacher data is the method for generating teacher data according to the third aspect, wherein the second label is a label indicating that the classification is incorrect.
  • a fifth aspect of the method for generating teacher data is the method for generating teacher data according to the third aspect, wherein the category is a category indicating a dripping state in which the processing liquid drops as droplets when the discharge of the processing liquid is stopped, and the second label is a label indicating that the pattern formed on the surface of the substrate immediately below the discharge nozzle in the image data is similar to the pattern of dripping.
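As a sketch of how a misclassified image might be folded back into the teacher data with a second label (the record keys and label strings here are assumptions for illustration, not a format defined in the patent):

```python
def make_relearn_record(image_id: str, first_label: str) -> dict:
    """Build a re-learning teacher-data record: keep the category the
    classifier wrongly chose as the first label and add a second label
    marking the classification as incorrect."""
    return {
        "image": image_id,
        "first_label": first_label,       # category assigned by the classifier
        "second_label": "misclassified",  # flags the classification as wrong
    }
```

Retraining on such records gives the classifier explicit negative evidence, which is how the re-learning described above can improve classification accuracy.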
  • an aspect of the discharge state determination method includes: a step of performing machine learning using the teacher data generated by the teacher data generation method according to any one of the first to fifth aspects to generate a classifier; a step of holding the substrate substantially horizontally on a substrate holding mechanism; a step of discharging a processing liquid from the discharge nozzle toward a main surface of the substrate; a step of imaging, with a camera, an imaging area adjusted to include the discharge nozzle and at least a part of the discharged processing liquid, over a period including at least a part of the processing liquid discharging step, thereby acquiring a plurality of image data; a step of provisionally determining the quality of the discharge state of the processing liquid from the discharge nozzle based on a statistic of pixel values in a region of the image data extending from the tip of the discharge nozzle in the processing liquid discharge direction; and a step of classifying, when the discharge state is provisionally determined to be defective, the image data into a category corresponding to the discharge state by the classifier.
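The two-stage determination above can be sketched as follows (a minimal illustration: the ROI geometry, the choice of the mean as the pixel-value statistic, and the threshold are assumed; any classifier callable can be plugged in):

```python
import numpy as np

def provisionally_defective(frame, tip_row, tip_col, half_width=3,
                            thresh=100.0):
    """Cheap provisional good/bad check from a pixel-value statistic in the
    region extending downward from the nozzle tip (threshold assumed)."""
    roi = frame[tip_row:, tip_col - half_width:tip_col + half_width + 1]
    return roi.mean() < thresh  # dim column: liquid may be absent or broken

def determine_state(frame, classifier, tip_row, tip_col):
    """Run the (more expensive) learned classifier only on frames whose
    discharge state is provisionally judged defective."""
    if provisionally_defective(frame, tip_row, tip_col):
        return classifier(frame)
    return "normal_discharge"
```

Gating the classifier behind the cheap statistic is what lets the processing be simplified: most frames never reach the classifier at all.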
  • according to the teacher data generation method, it is possible to generate teacher data to which a label according to the discharge state of the processing liquid is applied.
  • according to the teacher data generation method, since only the image data acquired in the shorter candidate period is stored as teacher data candidates, the number of teacher data candidates can be reduced compared with the case where all the image data generated over the entire processing period are stored as candidates. Therefore, selection of teacher data becomes easier.
  • according to the teacher data generation method, it is possible to generate teacher data for re-learning.
  • the classification accuracy can be improved.
  • according to the teacher data generation method, it is possible to generate teacher data that contributes to suppressing erroneous determination of dripping.
  • the image data is classified into the category by the classifier.
  • the processing can be simplified.
  • FIG. 2 is a diagram illustrating a schematic example of a configuration of a substrate processing apparatus.
  • FIG. 3 is a plan view illustrating a schematic example of a configuration of a processing unit.
  • FIG. 4 is a side view illustrating a schematic example of a configuration of a processing unit.
  • FIG. 3 is a functional block diagram schematically illustrating an example of a configuration of a control unit.
  • A diagram schematically showing an example of discharge states in table form.
  • A flowchart showing an example of the classifier generation processing.
  • A flowchart showing an example of the teacher data generation processing.
  • A diagram schematically showing an example of an input screen.
  • FIG. 4 is a diagram schematically illustrating an example of a part of image data.
  • FIG. 11 is a functional block diagram schematically illustrating another example of the configuration of the control unit.
  • A flowchart showing an example of the classifier update processing.
  • A diagram schematically showing an example of a browsing screen.
  • A diagram schematically showing another example of a browsing screen.
  • FIG. 11 is a functional block diagram schematically illustrating another example of the configuration of the control unit.
  • FIG. 3 is a diagram schematically illustrating an example of image data.
  • FIG. 1 is a diagram showing the overall configuration of the substrate processing apparatus 100.
  • the substrate processing apparatus 100 is an apparatus that performs processing on a substrate W by supplying a processing liquid to the substrate W.
  • the substrate W is, for example, a semiconductor substrate. This substrate W has a substantially disk shape.
  • This substrate processing apparatus 100 can supply the processing liquid to the main surface of the substrate W.
  • the substrate processing apparatus 100 can perform a cleaning process by supplying a rinsing liquid to the substrate W after supplying a cleaning liquid to the substrate W.
  • the chemical liquid is, for example, SC1 liquid (a mixture of aqueous ammonia, hydrogen peroxide and water), SC2 liquid (a mixture of hydrochloric acid, hydrogen peroxide and water), or DHF liquid (dilute hydrofluoric acid).
  • the rinsing liquid for example, pure water is used.
  • the chemical liquid and the rinsing liquid are collectively referred to as the “processing liquid”. Note that the “processing liquid” includes not only liquids for the cleaning process but also coating liquids such as a photoresist liquid for a film forming process, chemical liquids for removing an unnecessary film, chemical liquids for etching, and the like.
  • the substrate processing apparatus 100 includes an indexer 102, a plurality of processing units 1, and a main transfer robot 103.
  • the indexer 102 has a function of carrying an unprocessed substrate W received from outside the apparatus into the apparatus and a function of carrying out the processed substrate W after the cleaning processing to the outside of the apparatus.
  • the indexer 102 mounts a plurality of carriers and includes a transfer robot (neither is shown). As the carrier, a FOUP (front opening unified pod), a SMIF (Standard Mechanical Interface) pod, or an OC (open cassette) can be adopted.
  • twelve processing units 1 are arranged in the substrate processing apparatus 100.
  • the detailed arrangement configuration is such that four towers in which three processing units 1 are stacked are arranged so as to surround the main transfer robot 103.
  • four processing units 1 arranged so as to surround the main transfer robot 103 are stacked in three stages, and FIG. 1 shows one of them.
  • the number of processing units 1 mounted on the substrate processing apparatus 100 is not limited to 12, and may be, for example, eight or four.
  • the main transfer robot 103 is installed at the center of the four towers including the stacked processing units 1.
  • the main transport robot 103 loads the unprocessed substrate W received from the indexer 102 into each processing unit 1, unloads the processed substrate W from each processing unit 1, and delivers it to the indexer 102.
  • FIG. 2 is a plan view of the processing unit 1.
  • FIG. 3 is a longitudinal sectional view of the processing unit 1. FIG. 2 shows a state where the substrate W is not held by the substrate holding unit 20, and FIG. 3 shows a state where the substrate W is held by the substrate holding unit 20.
  • the processing unit 1 includes, as main elements in the chamber 10, a substrate holding unit 20 that holds a substrate W in a horizontal posture (a posture in which the normal line of the substrate W is along the vertical direction), three processing liquid supply units 30, 60, 65 for supplying a processing liquid to the upper surface of the substrate W held by the substrate holding unit 20, a processing cup 40 surrounding the periphery of the substrate holding unit 20, and a camera 70 for imaging the space above the substrate holding unit 20. Further, around the processing cup 40 in the chamber 10, a partition plate 15 for vertically partitioning the inner space of the chamber 10 is provided.
  • the chamber 10 includes a side wall 11 extending in the vertical direction, a ceiling wall 12 closing the upper side of a space surrounded by the side wall 11, and a floor wall 13 closing the lower side.
  • the space surrounded by the side wall 11, the ceiling wall 12, and the floor wall 13 is a processing space for the substrate W.
  • a part of the side wall 11 of the chamber 10 is provided with a loading/unloading port through which the main transfer robot 103 loads and unloads the substrate W with respect to the chamber 10, and a shutter for opening and closing the loading/unloading port (illustration omitted).
  • the fan filter unit 14 includes a fan and a filter (for example, a HEPA filter) for taking in air in the clean room and sending it out into the chamber 10, and forms a downflow of clean air in a processing space in the chamber 10.
  • a punching plate having a large number of blowout holes may be provided directly below the ceiling wall 12.
  • the substrate holding unit 20 is, for example, a spin chuck.
  • the substrate holding section 20 includes a disk-shaped spin base 21 fixed in a horizontal posture to an upper end of a rotating shaft 24 extending in the vertical direction.
  • a spin motor 22 for rotating a rotating shaft 24 is provided below the spin base 21.
  • the spin motor 22 rotates the spin base 21 via a rotation shaft 24 in a horizontal plane.
  • a cylindrical cover member 23 is provided so as to surround the spin motor 22 and the rotation shaft 24.
  • the outer diameter of the disc-shaped spin base 21 is slightly larger than the diameter of the circular substrate W held by the substrate holding unit 20. Therefore, the spin base 21 has a holding surface 21a facing the entire lower surface of the substrate W to be held.
  • a plurality (four in the present embodiment) of chuck pins 26 are provided upright on the periphery of the holding surface 21a of the spin base 21.
  • the plurality of chuck pins 26 are positioned at equal intervals along the circumference corresponding to the outer periphery of the circular substrate W (at 90° intervals in the case of four chuck pins 26, as in the present embodiment).
  • the plurality of chuck pins 26 are driven in conjunction with each other by a link mechanism (not shown) accommodated in the spin base 21.
  • the substrate holding unit 20 grips the substrate W by bringing each of the plurality of chuck pins 26 into contact with the outer peripheral end of the substrate W, whereby the substrate W is held above the spin base 21 and close to the holding surface 21a (see FIG. 3); the grip can be released by separating each of the plurality of chuck pins 26 from the outer peripheral end of the substrate W.
  • the spin motor 22 rotates the rotation shaft 24 in a state where the substrate holding unit 20 grips the substrate W with the plurality of chuck pins 26, whereby the substrate W can be rotated about a rotation axis CX along the vertical direction passing through the center of the substrate W.
  • the processing liquid supply unit 30 is configured by attaching the discharge nozzle 31 to the tip of the nozzle arm 32 (see FIG. 2).
  • the base end of the nozzle arm 32 is fixedly connected to a nozzle base 33.
  • the nozzle base 33 is rotatable around a vertical axis by a motor (not shown). The rotation of the nozzle base 33 causes the discharge nozzle 31 to move in an arc along the horizontal direction between the processing position above the substrate holding unit 20 and the standby position outside the processing cup 40, as indicated by the arrow AR34.
  • the processing liquid supply unit 30 is configured to supply a plurality of types of processing liquids.
  • the processing liquid supply unit 30 has a plurality of discharge nozzles 31. In FIGS. 2 and 3, two discharge nozzles 31a and 31b are shown as the discharge nozzles 31.
  • the discharge nozzles 31a and 31b are fixed to a nozzle base 33 via a nozzle arm 32. Therefore, the discharge nozzles 31a and 31b move in synchronization with each other.
  • the discharge nozzles 31a and 31b are provided so as to be adjacent in a horizontal plane.
  • the discharge nozzle 31a is connected to a processing liquid supply source 37a via a pipe 34a
  • the discharge nozzle 31b is connected to a processing liquid supply source 37b via a pipe 34b.
  • On-off valves 35a and 35b are provided in the middle of the pipes 34a and 34b, respectively.
  • when the on-off valve 35a is opened, the processing liquid from the processing liquid supply source 37a flows through the inside of the pipe 34a and is discharged from the discharge nozzle 31a.
  • when the on-off valve 35b is opened, the processing liquid from the processing liquid supply source 37b flows through the inside of the pipe 34b and is discharged from the discharge nozzle 31b.
  • SC1 liquid is discharged from the discharge nozzle 31a, and pure water is discharged from the discharge nozzle 31b, for example.
  • Suckback valves 36a, 36b may be provided in the middle of the pipes 34a, 34b, respectively.
  • the suck back valve 36a draws in the processing liquid from the tip of the discharge nozzle 31a by sucking the processing liquid in the pipe 34a when the discharge of the processing liquid is stopped. This makes it difficult for the processing liquid to drop from the tip of the discharge nozzle 31a as a relatively large lump (droplet) when the discharge is stopped.
  • the processing unit 1 of this embodiment is provided with two processing liquid supply units 60 and 65 in addition to the processing liquid supply unit 30 described above.
  • the processing liquid supply units 60 and 65 of the present embodiment have the same configuration as the processing liquid supply unit 30 described above. That is, the processing liquid supply unit 60 is configured by attaching a discharge nozzle 61 to the tip of a nozzle arm 62, and the nozzle base 63 connected to the base end of the nozzle arm 62 moves the discharge nozzle 61 in an arc, as indicated by the arrow AR64, between a processing position above the substrate holding unit 20 and a standby position outside the processing cup 40.
  • the processing liquid supply unit 65 is configured by attaching a discharge nozzle 66 to the tip of a nozzle arm 67, and the nozzle base 68 connected to the base end of the nozzle arm 67 moves the discharge nozzle 66 in an arc, as indicated by the arrow AR69, between a processing position above the substrate holding unit 20 and a standby position outside the processing cup 40.
  • the processing liquid supply units 60 and 65 may also be configured to supply a plurality of types of processing liquid, or may be configured to supply a single processing liquid.
  • the processing liquid supply units 60 and 65 discharge the processing liquid onto the upper surface of the substrate W held by the substrate holding unit 20 with the respective discharge nozzles 61 and 66 positioned at the processing positions. At least one of the processing liquid supply units 60 and 65 may be a two-fluid nozzle that mixes a cleaning liquid such as pure water with a pressurized gas to generate droplets and ejects a mixed fluid of the droplets and the gas onto the substrate W. Further, the number of processing liquid supply units provided in the processing unit 1 is not limited to three, and may be any number of one or more. In the present embodiment, however, since two processing liquids are discharged while being switched sequentially, two or more discharge nozzles are provided as a whole.
  • each of the discharge nozzles of the processing liquid supply units 60 and 65 is also connected to a processing liquid supply source via a pipe, similarly to the processing liquid supply unit 30, and an on-off valve may be provided in the middle of the pipe.
  • processing using the processing liquid supply unit 30 will be described as a representative.
  • the processing cup 40 is provided so as to surround the substrate holding unit 20.
  • the processing cup 40 includes an inner cup 41, a middle cup 42, and an outer cup 43.
  • the inner cup 41, the middle cup 42, and the outer cup 43 are provided to be able to move up and down.
  • the processing liquid scattered from the peripheral edge of the substrate W falls on the inner peripheral surface of the inner cup 41.
  • the dropped processing liquid is appropriately collected by the first collection mechanism.
  • when the inner cup 41 is lowered and the middle cup 42 and the outer cup 43 are raised, the processing liquid scattered from the periphery of the substrate W falls on the inner peripheral surface of the middle cup 42.
  • the dropped processing liquid is appropriately collected by the second collection mechanism.
  • the processing liquid scattered from the peripheral edge of the substrate W falls on the inner peripheral surface of the outer cup 43.
  • the dropped processing liquid is appropriately collected by the third collection mechanism. According to this, different processing liquids can be appropriately collected.
  • the partition plate 15 is provided so as to vertically partition the inner space of the chamber 10 around the processing cup 40.
  • the partition plate 15 may be a single plate member surrounding the processing cup 40, or may be a combination of a plurality of plate members. Further, the partition plate 15 may be formed with a through hole or notch penetrating in the thickness direction.
  • in the partition plate 15, through holes (not shown) are formed for passing support shafts that support the nozzle bases 33, 63, 68 of the processing liquid supply units 30, 60, 65.
  • the outer peripheral end of the partition plate 15 is connected to the side wall 11 of the chamber 10.
  • the edge of the partition plate 15 surrounding the processing cup 40 is formed in a circular shape having a diameter larger than the outer diameter of the outer cup 43. Therefore, the partition plate 15 does not prevent the outer cup 43 from moving up and down.
  • an exhaust duct 18 is provided near the floor wall 13 in the side wall 11 of the chamber 10.
  • the exhaust duct 18 is connected to an exhaust mechanism (not shown).
  • the air passing between the processing cup 40 and the partition plate 15 is discharged from the exhaust duct 18 to the outside of the apparatus.
  • the camera 70 is installed in the chamber 10 above the partition plate 15.
  • the camera 70 includes, for example, an imaging element (for example, a CCD (Charge Coupled Device)) and an optical system such as an electronic shutter and a lens.
  • the discharge nozzle 31 of the processing liquid supply unit 30 is moved by the nozzle base 33 between a processing position above the substrate W held by the substrate holding unit 20 (solid line position in FIG. 3) and a standby position outside the processing cup 40 (dotted line position in FIG. 3).
  • the processing position is a position where the cleaning liquid is discharged from the processing liquid supply unit 30 onto the upper surface of the substrate W held by the substrate holding unit 20 to perform the cleaning processing.
  • the standby position is a position where the discharge of the processing liquid is stopped and the processing liquid supply unit 30 stands by when the processing liquid supply unit 30 does not perform the cleaning process.
  • a standby pod that accommodates the discharge nozzle 31 of the processing liquid supply unit 30 may be provided at the standby position.
  • the camera 70 is installed so that its imaging area includes at least the tip of the discharge nozzle 31 at the processing position. More specifically, the camera 70 is installed such that the tip of the ejection nozzle 31 and the processing liquid ejected from the tip are included in the imaging region.
  • the camera 70 is installed at a position where the ejection nozzle 31 at the processing position is imaged from the upper front. Therefore, the camera 70 can capture an image of the imaging region including the tip of the discharge nozzle 31 at the processing position.
  • the camera 70 can also image the distal ends of the discharge nozzles 61 and 66 of the processing liquid supply units 60 and 65 at their processing positions, and the imaging region including the processing liquid discharged from those distal ends. In other words, the position of the camera 70 is adjusted so that the tips of these discharge nozzles and the discharged processing liquid are included in the imaging region.
  • the discharge nozzles 31 and 66 of the processing liquid supply units 30 and 65 move in the horizontal direction within the imaging field of view of the camera 70.
  • the discharge nozzle 61 of the processing liquid supply unit 60 moves in the depth direction within the imaging field of view of the camera 70.
  • a camera dedicated to the processing liquid supply unit 60 may be provided separately from the camera 70.
  • This camera 70 acquires a plurality of image data by imaging the imaging region over a period including at least a part of a period during which the processing liquid is discharged onto the substrate W.
  • the camera 70 outputs the acquired image data to the control unit 9.
  • an illumination unit 71 is provided in the chamber 10 above the partition plate 15. Since the interior of the chamber 10 is normally a dark room, the illumination unit 71 irradiates the discharge nozzles 31, 61, 66 of the processing liquid supply units 30, 60, 65 near the processing positions with light when the camera 70 performs imaging.
  • the user interface 80 includes a display unit 81 and an input unit 82.
  • the display unit 81 is, for example, a liquid crystal display or an organic EL (Electro Luminescence) display.
  • the input unit 82 is, for example, a touch panel, a mouse, or a keyboard.
  • This user interface 80 is connected to the control unit 9.
  • the display unit 81 displays a display image based on a display signal from the control unit 9. This display image includes, for example, image data from the camera 70.
  • the input unit 82 outputs input information input by the worker to the control unit 9.
  • the control unit 9 can control various components according to the input information.
  • the control unit 9 controls various components of the substrate processing apparatus 100 to perform processing on the substrate W.
  • the control unit 9 performs image processing on the image data acquired by the camera 70.
  • the control unit 9 determines the discharge state of the processing liquid from each discharge nozzle by this image processing. This image processing will be described later in detail.
  • the configuration of the control unit 9 as hardware is the same as that of a general computer. That is, the control unit 9 includes a CPU that performs various arithmetic processes, a ROM (read-only memory) that stores a basic program, a RAM (readable and writable memory) that stores various information, and a magnetic disk that stores control software and data.
  • when the CPU of the control unit 9 executes a predetermined processing program, each operation mechanism of the substrate processing apparatus 100 is controlled by the control unit 9, and the processing in the substrate processing apparatus 100 proceeds.
  • the CPU of the control unit 9 executes a predetermined processing program to perform image processing.
  • Some or all of the functions of the control unit 9 may be realized by dedicated hardware.
  • FIG. 4 is a functional block diagram schematically showing an example of the internal configuration of the control unit 9.
  • the control unit 9 includes a classifier 91, a machine learning unit 92, a candidate data extraction unit 93, a display control unit 94, and a teacher data generation unit 95.
  • Image data from the camera 70 is sequentially input to the classifier 91.
  • the classifier 91 classifies the input image data into one of a plurality of categories. Categories can also be called classes.
  • the category is a category according to the discharge state of the processing liquid from the discharge nozzle 61.
  • FIG. 5 is a diagram schematically illustrating an example of a discharge state.
  • the discharge state includes a state relating to the flow-down shape of the processing liquid discharged from the tip of the discharge nozzle 61.
  • specifically, as illustrated in FIG. 5, the discharge state includes a normal discharge state in which the processing liquid flows down from the discharge nozzle 61 as a continuous flow, a dripping state in which the processing liquid drops as droplets when the discharge of the processing liquid is stopped, a discharge stop state in which the processing liquid is not discharged, and the like. It is not desirable for dripping to occur, so the dripping state is a kind of defective discharge state.
  • other examples of the defective discharge state include a liquid splash state in which the processing liquid splashes at its landing point on the substrate W.
  • in the following, a description will be given employing the dripping state as the defective discharge state.
  • Each category is set according to each ejection state.
  • the category C1 is a category indicating a normal ejection state
  • the category C2 is a category indicating a certain defective ejection state (here, the dripping state)
  • the category C3 is a category indicating a discharge stop state.
  • the classifier 91 classifies the image data into categories, and the control unit 9 thereby substantially determines the ejection state. For example, when the classifier 91 classifies the image data into the category C2, it can be determined that dripping has occurred. The control unit 9 may therefore notify the operator of the dripping by displaying on the display unit 81 that the dripping has occurred.
  • This classifier 91 is generated by the machine learning unit 92 using a plurality of teacher data. That is, it can be said that the classifier 91 is a machine-learned classifier.
  • the machine learning unit 92 uses, for example, a nearest neighbor method, a support vector machine, a random forest, or a neural network (including deep learning) as a machine learning algorithm.
  • the teacher data includes image data and a label indicating which category the image data should be classified into.
  • the image data may be acquired by the camera 70.
  • the application of the label can be performed by, for example, an operation performed on the user interface 80 (input unit 82) by the operator. This operation will be described later in detail.
  • the machine learning unit 92 generates a classifier 91 by performing machine learning based on the teacher data.
  • the classifier 91 includes a feature vector extraction unit 911, a determination unit 912, and a storage medium in which a determination database 913 is stored.
  • Image data from the camera 70 is sequentially input to the feature vector extraction unit 911.
  • the feature vector extraction unit 911 extracts a feature vector of image data according to a predetermined algorithm.
  • This feature vector is a vector that can indicate a feature according to the ejection state of the ejection nozzle.
  • a known algorithm can be adopted as the algorithm.
  • the feature vector extraction unit 911 outputs the feature vector to the determination unit 912.
  • the determination database 913 stores a plurality of feature vectors (hereinafter referred to as reference vectors) generated from the plurality of teacher data by the machine learning unit 92, with each reference vector classified into one of the categories C1 to C3. Specifically, the machine learning unit 92 generates a plurality of reference vectors by applying the same algorithm as the feature vector extraction unit 911 to the plurality of teacher data. Then, the machine learning unit 92 assigns the label (correct category) of the teacher data to each reference vector.
  • the determination unit 912 classifies image data (frames) based on the feature vector input from the feature vector extraction unit 911 and a plurality of reference vectors stored in the determination database 913.
  • the determination unit 912 may identify the reference vector closest to the feature vector and classify the frame into the category of that reference vector (nearest neighbor method). Accordingly, the determination unit 912 can classify each frame input to the classifier 91 (the feature vector extraction unit 911) into one of the categories C1 to C3.
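As an illustration of the relationship among the feature vector extraction unit 911, the determination database 913, and the determination unit 912, the following Python sketch builds reference vectors from teacher data and classifies a frame by the nearest neighbor method. The feature extraction used here (mean and variance of the discharge-region pixel values) is only a hypothetical stand-in, since the text does not specify the actual algorithm:

```python
import math

def extract_feature_vector(roi_pixels):
    # Hypothetical stand-in for the feature vector extraction unit 911:
    # mean and variance of the pixel values in the discharge region R1.
    n = len(roi_pixels)
    mean = sum(roi_pixels) / n
    var = sum((p - mean) ** 2 for p in roi_pixels) / n
    return (mean, var)

def build_reference_database(teacher_data):
    # Apply the same algorithm to each teacher image and attach its label,
    # yielding (reference vector, category) pairs (cf. machine learning unit 92).
    return [(extract_feature_vector(img), label) for img, label in teacher_data]

def classify(roi_pixels, reference_db):
    # Nearest neighbor method: the frame takes the category of the closest
    # reference vector (cf. determination unit 912).
    fv = extract_feature_vector(roi_pixels)
    _, label = min(reference_db, key=lambda ref: math.dist(fv, ref[0]))
    return label
```

With bright, varied discharge-region pixels labeled C1 (normal discharge) and dark, flat pixels labeled C3 (discharge stop), a new bright frame falls nearest the C1 reference vector.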
  • the candidate data extraction unit 93, the display control unit 94, and the teacher data generation unit 95 are functional units related to generation of teacher data.
  • Image data obtained by the camera 70 is input to the candidate data extraction unit 93.
  • among the input image data, the candidate data extracting unit 93 stores the image data acquired during a candidate period, described below, in the storage medium as teacher data candidates (hereinafter referred to as candidate data).
  • the candidate period is a period including at least a part of the period during which the processing liquid is discharged from the discharge nozzle 61.
  • the candidate period includes a timing at which the discharge state of the processing liquid is changed. More specifically, the period includes a discharge stop timing for stopping the discharge of the processing liquid from the discharge nozzle 61, for example.
  • This candidate period can be set shorter than the discharge period in which the processing liquid is discharged from the discharge nozzle 61 to the substrate W. In this candidate period, the processing liquid is initially discharged from the discharge nozzle 61, and the discharge of the processing liquid is stopped in the middle of the candidate period.
  • accordingly, the image data acquired in the candidate period includes image data showing the processing liquid flowing down as a continuous flow and image data showing the state after the discharge of the processing liquid from the discharge nozzle 61 has stopped. Further, if dripping occurs when the discharge stops, the image data also includes the state in which the dripping has occurred. That is, by storing the image data within the candidate period as candidate data, image data corresponding to each of the categories C1 to C3 can be stored as candidate data.
  • the storage medium on which the candidate data is stored may be the same as the storage medium on which the determination database 913 is stored, or may be another storage medium.
  • the plurality of candidate data stored in the storage medium is also referred to as a candidate database 931.
  • the display control unit 94 reads candidate data from the storage medium in response to the input of the input unit 82 by the operator, and causes the display unit 81 to display the candidate data. Thereby, the worker can visually recognize the candidate data on the display unit 81.
  • the worker visually recognizes the candidate data displayed on the display unit 81 and selects candidate data to be adopted as teacher data.
  • the operator performs an input to the input unit 82 designating the candidate data to be adopted as teacher data, and inputs the label to be assigned to that candidate data to the input unit 82. For example, a label “normal ejection state” is given to image data showing a normal ejection state.
  • the teacher data generation unit 95 receives the candidate data being displayed from the display control unit 94 in response to the label input to the input unit 82, and receives the information of the label input by the operator from the input unit 82.
  • the teacher data generation unit 95 stores the candidate data and the label in the storage medium as teacher data in association with each other.
  • the worker visually recognizes other candidate data in order, and determines whether or not to adopt the data as teacher data. Then, the worker inputs a label for candidate data to be adopted as teacher data. Thereby, a plurality of teacher data is stored in the storage medium.
  • the storage medium in which the teacher data is stored may be the same as the storage medium in which the determination database 913 or the candidate database 931 is stored, or may be another storage medium.
  • the plurality of teacher data stored in the storage medium is also referred to as a teacher database 921.
  • FIG. 6 is a flowchart showing an example of the generation processing of the classifier 91.
  • here, a case where the classifier 91 is generated before the installation of the substrate processing apparatus 100 (that is, before shipment) will be described.
  • in step S1, teacher data is generated.
  • FIG. 7 is a flowchart showing an example of the teacher data generation process in step S1 of FIG.
  • the substrate W is held by the substrate holding unit 20. Specifically, the substrate W is transferred onto the substrate holding unit 20 by the main transfer robot 103. The substrate holding unit 20 holds the transported substrate W substantially horizontally.
  • in step S12, the control unit 9 controls, for example, the rotating mechanism of the processing liquid supply unit 60 to move the discharge nozzle 61 to the processing position.
  • here, the processing liquid supply unit 60 will be described, but the processing liquid supply units 30 and 65 may be used.
  • in step S13, the control unit 9 causes the camera 70 to start imaging, for example, triggered by the movement of the discharge nozzle 61.
  • the camera 70 captures an image of the imaging region at a predetermined frame rate (for example, 60 frames / second), and sequentially outputs the obtained image data to the control unit 9.
  • in step S14, the control unit 9 controls the spin motor 22 to rotate the substrate W in a horizontal plane, and in step S15 performs the processing liquid discharge process. Specifically, the control unit 9 outputs an open signal to the on-off valve connected to the discharge nozzle 61 to discharge the processing liquid from the discharge nozzle 61 onto the upper surface of the substrate W. Then, for example, when a predetermined processing period has elapsed, the control unit 9 outputs a close signal to the on-off valve to stop the discharge of the processing liquid from the discharge nozzle 61. If a suck-back valve is provided, a suction signal is also output to the suck-back valve.
  • in step S16, the control unit 9 causes the camera 70 to end the imaging, and in step S17 controls the spin motor 22 to stop the rotation of the substrate W.
  • the candidate data extraction unit 93 stores the image data acquired by the camera 70 during the candidate period in the storage medium as candidate data.
  • the candidate period is, for example, about several seconds.
  • the candidate data extracting unit 93 can determine the candidate period based on the timing at which the discharge nozzle 61 stops discharging. Specifically, the candidate data extracting unit 93 may determine, as the candidate period, a period from a timing earlier than the ejection stop timing by a first predetermined amount to a timing later than the ejection stop timing by a second predetermined amount.
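In terms of frame indices, the candidate period just described can be sketched as follows; the frame rate and the first and second predetermined amounts are illustrative values, not ones given in the text:

```python
def candidate_period(stop_frame, frames_before, frames_after, total_frames):
    """Frame indices from `frames_before` frames before the discharge stop
    timing to `frames_after` frames after it, clipped to the recording."""
    start = max(0, stop_frame - frames_before)
    end = min(total_frames, stop_frame + frames_after + 1)
    return list(range(start, end))

# At 60 frames/second, a window from 1 s before to 2 s after the stop timing:
window = candidate_period(stop_frame=300, frames_before=60,
                          frames_after=120, total_frames=600)
```

Only the frames in this window are retained as candidate data, rather than every frame of the processing period.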
  • the opening / closing valve for controlling the discharge / stop of the processing liquid from the discharge nozzle 61 is controlled by the control unit 9 as described above. After the control unit 9 outputs the close signal to the on-off valve, the discharge of the processing liquid from the discharge nozzle 61 stops after a delay time corresponding to the length of the pipe and the like. Since this delay time is usually short, the candidate data extraction unit 93 may determine the candidate period based on the output timing of the close signal to the on-off valve. That is, this output timing may be regarded as the discharge stop timing. Alternatively, when a suck-back valve is provided, the candidate period may be determined based on the output timing of the suction signal to the suck-back valve. Alternatively, when another ejection stop signal is used, that ejection stop signal may be employed.
  • the candidate data extracting unit 93 may specify the ejection stop timing based on the image data from the camera 70.
  • the processing liquid from the discharge nozzle 61 is captured in a discharge region R1 (see also FIG. 15) extending from the tip of the discharge nozzle 61 in the processing liquid discharge direction.
  • when the processing liquid is being discharged, the liquid column-shaped processing liquid appears in the discharge region R1, and when it is not discharged, the processing liquid does not appear in the discharge region R1.
  • when the processing liquid is imaged in the ejection region R1, the brightness values (or grayscale pixel values) of the pixels in the ejection region R1 become relatively high and their distribution varies.
  • when the processing liquid is not imaged in the ejection region R1, the upper surface of the substrate W is imaged instead, so the luminance values of the pixels in the ejection region R1 are relatively low and their distribution is more uniform.
  • the candidate data extraction unit 93 therefore determines that the processing liquid is being discharged when a statistic of the pixel values in the ejection region R1 is larger than a reference value, and that the discharge is stopped when the statistic is smaller than the reference value.
  • the reference value can be set, for example, by simulation or experiment.
  • the candidate data extraction unit 93 may specify the ejection stop timing based on the result of this determination. This allows the discharge stop timing to be specified with high accuracy, so the discharge stop timing can be included in the candidate period even if the candidate period is set short. As a result, image data corresponding to the categories C1 to C3 can be obtained within the candidate period.
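The image-based determination described above can be sketched as follows; the use of the variance of the discharge-region pixel values as the statistic is an assumption, since the text only states that a statistic is compared with a reference value:

```python
def discharging(roi_pixels, reference_value):
    # The liquid column makes the discharge-region pixel values high and
    # varied; here their variance serves as the statistic.
    n = len(roi_pixels)
    mean = sum(roi_pixels) / n
    var = sum((p - mean) ** 2 for p in roi_pixels) / n
    return var > reference_value

def find_stop_frame(roi_frames, reference_value):
    # First frame at which the discharge is judged stopped after having
    # been judged present; None if no such transition is observed.
    seen = False
    for i, roi in enumerate(roi_frames):
        if discharging(roi, reference_value):
            seen = True
        elif seen:
            return i
    return None
```

The returned frame index can then serve as the ejection stop timing around which the candidate period is placed.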
  • the candidate data extracting unit 93 stores a plurality of candidate data acquired in the candidate period as one candidate data group in the storage medium. Further, the candidate data extracting unit 93 may store information accompanying the candidate data (hereinafter also referred to as accompanying information) in the storage medium in association with the candidate data group. Examples of the accompanying information include information for identifying the processing unit 1 to which the camera 70 that acquired the candidate data belongs, date and time information of when the candidate data was acquired, and information for identifying the processing liquid supply unit shown in the candidate data.
  • a candidate data group can be stored in the storage medium for each processing unit 1. Further, by performing the processing while changing the processing liquid supply unit in each processing unit 1, a candidate data group can be stored in the storage medium for each processing liquid supply unit.
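One possible way to organize candidate data groups with their accompanying information, allowing lookup by processing unit and date as the input screen IS1 does, is a simple record per group; all field names here are illustrative, not an actual schema of the apparatus:

```python
import datetime

def store_candidate_group(candidate_db, frames, unit_id, supply_unit_id,
                          acquired_at=None):
    # Append one candidate data group tagged with its accompanying
    # information: processing unit, supply unit, and acquisition time.
    candidate_db.append({
        "unit": unit_id,
        "supply_unit": supply_unit_id,
        "acquired": acquired_at
            or datetime.datetime.now().isoformat(timespec="seconds"),
        "frames": frames,
    })
    return candidate_db

def groups_for(candidate_db, unit_id, date_prefix):
    # Look up groups by processing unit and date, as done when listing
    # candidate data groups on the input screen IS1.
    return [g for g in candidate_db
            if g["unit"] == unit_id and g["acquired"].startswith(date_prefix)]
```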
  • FIG. 8 is a diagram schematically showing an example of an input screen IS1 for selecting candidate data.
  • the input screen IS1 is provided with an image display area IR1, a list display area TR1, and a button area BR1.
  • the image display area IR1 is a rectangular area and is an area where candidate data is displayed. In the example of FIG. 8, the image display area IR1 is located at the upper left of the input screen IS1.
  • the button area BR1 includes a plurality of buttons for designating candidate data to be displayed in the image display area IR1.
  • the button area BR1 is located to the right of the image display area IR1.
  • An operation on a button in the button area BR1 is realized by an input to the input unit 82.
  • the button area BR1 includes a unit selection area U1, a date selection area D1, an image selection area Ca1, and a label selection area L1.
  • the unit selection area U1 includes a button for selecting one of the twelve processing units 1.
  • the unit selection area U1 includes buttons “1” to “12”, and these numbers indicate the identification numbers of the processing units 1.
  • the date selection area D1 includes a button for designating the date on which the candidate data was obtained.
  • the date selection area D1 includes a "previous day” button and a "next day” button. By operating these buttons, a date can be selected.
  • the date selection area D1 may include a frame in which a date is displayed, such as “2016/3/25”. The date may be directly input by operating this frame via the input unit 82.
  • the image selection area Ca1 includes a button for designating candidate data from the candidate data group.
  • the image selection area Ca1 includes a “previous image” button and a “next image” button, and by operating these, candidate data can be selected.
  • the image selection area Ca1 may include a frame displaying the number of the current candidate data relative to the total number of candidate data in the candidate data group, such as “1/64”. Candidate data may be directly specified by operating this frame via the input unit 82.
  • the label selection area L1 includes a button for designating a label.
  • the label selection area L1 includes buttons for a “normal ejection state”, a “dripping state”, and a “discharge stop state”.
  • the operator operates the buttons in the unit selection area U1 to select one of the processing units 1 and operates the buttons and the like in the date selection area D1 to specify the date by inputting to the input unit 82.
  • the display control unit 94 then reads the candidate data groups that satisfy the conditions (the processing unit 1 and the date) from the candidate database 931 (or from the history database 961; see FIG. 10), and displays the list in the list display area TR1 in a table format. In the example of FIG. 8, the list display area TR1 displays the acquisition time of each candidate data group and the processing liquid supply unit appearing in the candidate data group.
  • the operator can select one from the list in this table. This selection is also realized by input to the input unit 82.
  • the display control unit 94 then displays one candidate data item included in the selected candidate data group in the image display area IR1. For example, the first candidate data item of the candidate data group is displayed in the image display area IR1.
  • the operator operates buttons in the image selection area Ca1 via the input unit 82 to sequentially display the candidate data included in the candidate data group in the image display area IR1. For example, when the operator operates the “next image” button, the display control unit 94 ends the display of the current candidate data and displays the next candidate data in the time series in the image display area IR1.
  • the worker visually checks the candidate data displayed in the image display area IR1 and determines whether or not to adopt it as teacher data. That is, the worker selects the image data to be adopted as teacher data from the candidate data. If the worker decides to adopt the candidate data as teacher data, the correct category is given to the image data as a label. Specifically, the operator operates a button in the label selection area L1 via the input unit 82 to input the correct label for the candidate data. In response to this input, the teacher data generation unit 95 receives the candidate data currently displayed in the image display area IR1 from the display control unit 94, associates the candidate data with the label, and stores them in the storage medium as teacher data.
  • the worker can generate a plurality of teacher data (teacher database 921) by repeatedly performing the above operation.
  • the plurality of teacher data is stored in a storage medium.
  • in other words, the teacher data generation unit 95 generates teacher data by assigning, based on the worker's input to the input unit 82, a label corresponding to the discharge state of the processing liquid shown in at least one piece of the image data to that image data.
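The labeling flow (select candidate data, enter a label, store the pair) amounts to building a list of image–label records; a minimal sketch, with illustrative label strings:

```python
TEACHER_DB = []

ALLOWED_LABELS = ("normal discharge state", "dripping state",
                  "discharge stop state")

def add_teacher_data(image, label):
    # Reject labels that do not correspond to one of the defined categories.
    if label not in ALLOWED_LABELS:
        raise ValueError(f"unknown label: {label}")
    TEACHER_DB.append({"image": image, "label": label})

add_teacher_data("frame_012", "normal discharge state")
add_teacher_data("frame_040", "dripping state")
```

The resulting list plays the role of the teacher database 921 from which the machine learning unit 92 learns.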
  • the machine learning unit 92 reads the teacher data from the teacher database 921, and generates a classifier 91 by a learning process based on the teacher data.
  • the classifier 91 can also be called a trained model.
  • the machine learning unit 92 may generate the classifier 91 for each processing unit 1 or for each processing liquid supply unit. Alternatively, the machine learning unit 92 may generate the classifier 91 according to the combination of the processing unit 1 and the processing liquid supply unit.
  • in step S3, the accuracy of the classifier 91 is evaluated.
  • the image data is input to the classifier 91, and the classifier 91 classifies the image data.
  • the classifier 91 calculates the degree to which the input image data falls into each category (the degree of matching), and classifies the image data into the category having the highest degree of matching.
  • the “normal discharge state” is classified into the category C1,
  • the “dripping state” is classified into the category C2,
  • and the “discharge stop state” is classified into the category C3.
  • if the degree of matching with the correct category is sufficiently higher than the degrees of matching with the other categories, it can be said that the classifier 91 classified correctly. Conversely, if that degree of matching is low, the classifier 91 still has room for improvement.
  • the control unit 9 determines whether the degree of matching with the correct category is higher or lower than a reference value. When it is lower than the reference value, the machine learning unit 92 performs the learning process again in step S2 to update the classifier 91. Note that the machine learning unit 92 may perform the learning process and update the classifier 91 after newly adding teacher data.
  • when the degree of matching with the correct category is higher than the reference value, the classifier 91 can be determined to be appropriate. However, an accuracy evaluation based on the classification result of only one image data item is insufficient, so it is desirable to perform the accuracy evaluation on the classification results of a plurality of image data items. More specifically, the classifier 91 may be determined to be appropriate when the degree of matching with the correct category is higher than the reference value in a predetermined number of classification results. If the classifier 91 is determined to be appropriate, the process ends.
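The acceptance criterion described above (a predetermined number of classification results whose degree of matching with the correct category exceeds the reference value) can be sketched as:

```python
def classifier_is_appropriate(match_scores, reference_value, required_count):
    # match_scores: degree of matching with the correct category for each
    # evaluated image. Accept the classifier when at least `required_count`
    # of the scores exceed the reference value.
    passed = sum(1 for score in match_scores if score > reference_value)
    return passed >= required_count
```

If the criterion is not met, the flow returns to step S2 and the classifier is re-learned.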
  • image data obtained by the camera 70 and showing various ejection states is adopted as teacher data. Therefore, a classifier 91 that can classify the ejection state can be generated.
  • the control unit 9 stores the image data acquired in the short candidate period including the timing at which the discharge state of the processing liquid changes (for example, the discharge stop timing) in the storage medium as teacher data candidates (candidate data). Therefore, the number of candidate data can be reduced compared with the case where all the image data in the processing period are stored as candidate data. The operator can thus reduce the labor required for checking the candidate data, making image data selection easier, and the required capacity of the storage medium can also be reduced.
  • FIG. 9 is a diagram schematically illustrating an example of a part of image data that has been misclassified.
  • FIG. 9 shows a part of the image data in an enlarged manner.
  • the processing liquid is not discharged from the discharge nozzle 61.
  • a substantially circular pattern P1 is discretely included on the upper surface of the substrate W of the image data.
  • the pattern P1 is, for example, a pattern formed on the upper surface of the substrate W.
  • the patterns P1 are arranged at intervals in a substantially vertical direction immediately below the tip of the discharge nozzle 61.
  • the pixel values of the pixels in the circular patterns P1 may be larger than the pixel values of the other pixels on the upper surface of the substrate W.
  • in this case, the distribution of pixel values in the ejection region R1 immediately below the tip of the ejection nozzle 61 can be similar to the distribution of pixel values in the ejection region when dripping occurs. Therefore, the classifier 91 can classify the image data into the category C2 (dripping state) even though dripping has not actually occurred.
  • the pattern P1 is formed in a relatively large area of the substrate W, but may be formed in a predetermined area of a part of the upper surface of the substrate W.
  • the pattern P1 can be formed by reflecting light from the illumination unit 71 on the upper surface of the substrate W. For example, during the rotation of the substrate W, light irregularly reflected by the metal pattern in the predetermined region on the upper surface of the substrate W forms the pattern P1. Then, when the predetermined region enters immediately below the nozzle 61 with the rotation of the substrate W, the pattern P1 is also included in the ejection region R1 immediately below the nozzle 61 in the image data captured by the camera 70.
  • here too, the distribution of the pixel values in the ejection region R1 can be similar to the distribution of the pixel values in the ejection region when dripping occurs. Therefore, the classifier 91 can classify the image data into the category C2 (dripping state) even though dripping has not actually occurred.
  • it is therefore desirable to adopt such misclassified image data as teacher data and to update the classifier 91 by re-learning based on that teacher data.
  • FIG. 10 is a diagram schematically showing an example of the internal configuration of control unit 9A.
  • the control unit 9A has the same configuration as the control unit 9 except for the presence or absence of the history data storage control unit 96.
  • the image data acquired by the camera 70 and the classification result information by the classifier 91 are input to the history data storage control unit 96.
  • the history data storage control unit 96 associates each image data with its classification result, and stores these in a storage medium as history data.
  • the history data storage control unit 96 may store a history data group for each process on the substrate W in a storage medium. That is, each time a process is performed on the substrate W, a plurality of pieces of history data acquired by the process may be stored in the storage medium as a group of history data.
  • the history data storage control unit 96 may associate accompanying information similar to that of the candidate data (for example, the identification information of the processing unit 1, the date and time information, and the identification information of the processing liquid supply unit) with the history data and store them in the storage medium.
  • the storage medium in which the history data is stored may be the same as or different from the storage medium in which at least one of the determination database 913, the candidate database 931 and the teacher database 921 is stored.
  • the plurality of pieces of history data stored in the storage medium will also be referred to as a history database 961.
  • the input unit 82 can receive an input for instructing display of history data.
  • the display control unit 94 causes the display unit 81 to display the history data (image data and the classification result) in response to the input.
  • the operator visually checks whether the classification result of the image data displayed on the display unit 81 is correct. When this classification is incorrect, that is, when the image data is incorrectly classified, the operator inputs a correct category different from the misclassified category as a label by inputting to the input unit 82.
  • the teacher data generation unit 95 associates the image data displayed on the display unit 81 with the label input to the input unit 82, and stores the new data as new teacher data in the storage medium. That is, the teacher database 921 is updated.
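Updating the teacher database 921 with a misclassified image and the operator-corrected label can be sketched as follows; the field names and the idea of also recording the original prediction are illustrative:

```python
def relabel_misclassified(teacher_db, image, predicted_label, correct_label):
    # Store a misclassified image as new teacher data under the correct
    # label entered by the operator; the original prediction is kept so
    # the classifier's weaknesses can be reviewed later.
    teacher_db.append({
        "image": image,
        "label": correct_label,
        "misclassified_as": predicted_label,
    })
    return teacher_db

teacher_db = []
# FIG. 9 case: classified as "dripping state" although no dripping occurred.
relabel_misclassified(teacher_db, "frame_091",
                      predicted_label="dripping state",
                      correct_label="discharge stop state")
```

Re-learning on the updated database then reflects the correction in the classifier.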
  • FIG. 11 is a flowchart showing an example of the updating process of the classifier 91 by relearning.
  • the operator selects the misclassified image data as teacher data. Specifically, first, the operator causes the display unit 81 to display the browsing screen PS1 by inputting to the input unit 82.
  • FIG. 12 is a diagram schematically showing an example of the browsing screen PS1.
  • the browsing screen PS1 is similar to the input screen IS1 except for the following points. That is, history data, not candidate data, is displayed in the image display area IR1.
  • in addition, a list of the history data groups for the designated processing unit 1 and date and time is displayed in the list display area TR1.
  • an item of “pass / fail” is provided in the table of the list display area TR1.
  • the “pass / fail” item indicates the quality of the discharge state of the processing liquid.
  • when the classification results of the history data included in a history data group include the category C2 (dripping state), the discharge state of the processing liquid is poor, so a defect indication is shown for that history data group. If nothing is displayed in the “pass / fail” item, it indicates that the processing liquid was discharged and stopped normally.
  • when a history data group is selected, the display control unit 94 reads it from the history database 961 in response to the input, and the image data of the history data group is displayed in the image display area IR1. The operator then operates buttons in the image selection area Ca1 via the input unit 82 to sequentially display the image data included in the history data group in the image display area IR1. In particular, the operator visually determines whether or not dripping has occurred in the image data at the time the discharge of the processing liquid was stopped.
  • when dripping has actually occurred, the classification result by the classifier 91 was appropriate, and the operator ends the processing.
  • when dripping has not actually occurred, the classification result by the classifier 91 is inappropriate, and the image data is adopted as teacher data. That is, a correct label is given to the image data.
  • specifically, the operator operates a button in the label selection area L1 via the input unit 82 to input the correct label for the image data. For example, for the image data shown in FIG. 9, a label of "discharge stop state" is input.
  • in response, the teacher data generation unit 95 receives the image data currently displayed in the image display area IR1 from the display control unit 94, associates the label corresponding to the input with the image data, and stores them in the storage medium as teacher data. As a result, teacher data for re-learning can be generated.
  • in step S22, the machine learning unit 92 updates the classifier 91 by re-learning based on the teacher data including the new teacher data.
  • steps S23 and S24 are executed. Steps S23 and S24 are the same as steps S3 and S4, respectively.
• In this way, the correct category for misclassified image data can be reflected in the classifier 91, which reduces the possibility that the updated classifier 91 will misclassify.
• In the above example, the label "discharge stop state" is given to the image data that was incorrectly classified.
• However, the feature vector of that misclassified image data is close to the category C2 ("drop-off state"), which is why it was misclassified in the first place.
• If such data is re-learned under the label "discharge stop state", the updated classifier 91 may tend to misclassify image data that should be classified into the category C2 ("drop-off state") into the category C3 ("discharge stop state").
• Therefore, a new label may be assigned to the misclassified image data; that is, another category C4 may be provided as a classification type in addition to the categories C1 to C3.
• For example, the category C4 may be a category indicating misclassification.
• Alternatively, a category (label) indicating that a pattern P1 similar to drop-off is formed on the upper surface of the substrate W may be employed.
  • FIG. 13 is a diagram schematically showing an example of the browsing screen PS1A.
• The browsing screen PS1A in FIG. 13 is the same as that in FIG. 12 except for the types of buttons in the label selection area L1.
• The label selection area L1 includes a "pattern similar to drop-off" button as the label corresponding to the category C4.
• In response to the input, the teacher data generation unit 95 assigns the input label to the history data currently displayed in the image display area IR1 and stores these in the storage medium as new teacher data.
• In this way, a dedicated label (the label corresponding to the category C4) can be given to image data that would otherwise be misclassified.
• The classifier 91 may classify all of the image data sequentially input from the camera 70 into the respective categories during the period in which the substrate W is processed with the processing liquid, thereby determining whether the discharge state of the processing liquid is good.
• For example, the control unit 9 (or the control unit 9A) may determine that the ejection state is defective when image data is classified into the category C2. However, the configuration is not necessarily limited to this.
• The control unit 9 (or the control unit 9A) may first provisionally determine a defective ejection state by performing simpler processing on the image data.
  • FIG. 14 is a functional block diagram schematically showing an example of the internal configuration of the control unit 9B.
• The control unit 9B is the same as the control unit 9 except that it includes the provisional determination unit 97.
• Image data obtained by the camera 70 is input to the provisional determination unit 97.
• The provisional determination unit 97 performs image processing on the image data and provisionally determines whether the ejection state is defective. Specifically, the provisional determination unit 97 determines whether or not the ejection state is the drop-off state.
  • FIG. 15 is a diagram schematically illustrating an example of image data.
• The ejection region R1 is a rectangular region in the image data that extends from the tip of the ejection nozzle 61 in the discharge direction of the processing liquid and is wider than the processing liquid.
• The ejection region R1 is set to a length that does not include, for example, the landing point of the processing liquid on the substrate W.
• The provisional determination unit 97 calculates a statistic of the pixel values in the ejection region R1.
• This statistic is a value corresponding to the area occupied by the image of the processing liquid within the ejection region R1, and is, for example, a measure of dispersion (such as the variance or standard deviation) of the pixel values in the ejection region.
• When the processing liquid is normally discharged from the discharge nozzle 61, the processing liquid extends from one end of the discharge region R1 to the other in the vertical direction, and the upper surface of the substrate W appears on both sides in the left-right direction. Since the luminance values of pixels in which the processing liquid is captured are increased by light reflected from the processing liquid, the variation of luminance values within the ejection region R1 becomes relatively high.
• The camera 70 may be a camera that acquires grayscale image data or one that acquires color image data. In the former case, the pixel values of the image data indicate luminance values. In the following, a camera that acquires grayscale image data is described as an example; in the color case, a luminance value may be calculated from each pixel value and used instead.
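For the color case, the patent only says that a luminance value "may be calculated from a pixel value" without specifying a formula. One common conversion, sketched here as an assumption, uses the ITU-R BT.601 luma weights:

```python
# Hypothetical luminance conversion for color pixels (BT.601 weights).
# The patent does not specify which formula the system uses.
def luminance(r, g, b):
    """Approximate luminance of an RGB pixel using BT.601 weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(round(luminance(255, 255, 255), 3))  # 255.0 (white)
print(luminance(0, 0, 0))                  # 0.0 (black)
```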
• In the drop-off state, the processing liquid appears as discrete segments arranged in the vertical direction within the discharge region R1 (see also FIG. 5). Therefore, the variation of the pixel values in the ejection region R1 is higher than in the normal ejection state.
• When the processing liquid is not discharged from the discharge nozzle 61, only the upper surface of the substrate W is captured in the discharge region R1, so the pixel values in the discharge region R1 are relatively small and their variation is relatively small. Therefore, the drop-off state (defective ejection state) can be detected from the dispersion (for example, the standard deviation) of the pixel values in the ejection region R1.
• Specifically, the provisional determination unit 97 calculates the dispersion in the ejection region R1 and determines whether this statistic is larger than a reference value. When it is larger than the reference value, the provisional determination unit 97 provisionally determines that the drop-off state has occurred and notifies the classifier 91 of this.
• The classifier 91 performs the classification process on the image data only when triggered by the notification from the provisional determination unit 97; if no notification is received, the classification process is not performed.
• In this way, the provisional determination is made by simpler image processing, and the classification process by the classifier 91 is performed only when a defective ejection state is provisionally determined. Therefore, the processing load on the control unit 9B can be reduced compared with the case where the classifier 91 always performs the classification process.
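The provisional determination described above can be sketched as follows: compute the dispersion (here the standard deviation) of pixel values inside the ejection region R1 and run the expensive classifier only when it exceeds a reference value. The region coordinates, the reference value, and the toy images are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the provisional determination unit 97: gate the
# classifier on pixel-value dispersion in the rectangular ejection region R1.
import statistics

def pixels_in_region(image, top, bottom, left, right):
    """Collect pixel values inside the rectangular ejection region R1."""
    return [image[r][c] for r in range(top, bottom) for c in range(left, right)]

def provisionally_defective(image, region, reference=30.0):
    """True if pixel-value dispersion in R1 suggests a drop-off state."""
    values = pixels_in_region(image, *region)
    return statistics.pstdev(values) > reference

# 4x4 toy images: a broken (discrete) bright liquid column over a dark
# substrate, versus no discharge at all (only the substrate is visible).
drop_off = [[200, 10, 10, 10],
            [ 10, 10, 10, 10],
            [200, 10, 10, 10],
            [ 10, 10, 10, 10]]
no_discharge = [[10] * 4 for _ in range(4)]

region = (0, 4, 0, 2)  # (top, bottom, left, right) of R1 -- assumed values
print(provisionally_defective(drop_off, region))      # True
print(provisionally_defective(no_discharge, region))  # False
```

Only frames for which this cheap check fires would then be passed to the classifier 91 for full categorization.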
• Although a semiconductor substrate has been described above as the substrate W, the present invention is not limited to this.
• Alternatively, a substrate such as a glass substrate for a photomask, a glass substrate for a liquid crystal display, a glass substrate for a plasma display, a substrate for an FED (Field Emission Display), a substrate for an optical disk, a substrate for a magnetic disk, or a substrate for a magneto-optical disk may be used.
  • the present embodiment can be applied to any apparatus that performs a predetermined process by discharging a processing liquid from a movable nozzle to a substrate.
• For example, the technology according to the present embodiment may be applied to a rotary coating apparatus (spin coater) that discharges a photoresist liquid from a nozzle onto a rotating substrate to apply a resist, to an apparatus that discharges a film removing liquid from a nozzle onto the edge of a substrate on whose surface a film is formed, or to an apparatus that discharges an etching liquid from a nozzle onto the surface of a substrate.
• In the above description, the processing liquid is discharged from the discharge nozzle as a continuous flow; however, the invention is not necessarily limited to this.
  • the processing liquid may be discharged from the discharge nozzle in a mist state.
• The history data storage control unit 96 may store all of the image data during the processing period, together with the classification results for that image data, on a recording medium as history data. Thereby, all of the image data during the processing period can be checked. Alternatively, the history data storage control unit 96 may store only the image data in the candidate period and the classification results for that image data on the recording medium as history data.
  • the teacher data may be stored in an external server or the like.
  • the teacher data stored in the external server may be transmitted to the control unit of another substrate processing apparatus, and the control unit may generate a classifier.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Cleaning Or Drying Semiconductors (AREA)

Abstract

Provided is a teacher data generation method that can generate teacher data according to a discharge state of a discharge nozzle. The teacher data generation method includes a holding step, a processing liquid discharge step, an imaging step, and a teacher data generation step. In the holding step, a substrate is held substantially horizontally by a substrate holding mechanism. In the processing liquid discharge step, a processing liquid is discharged from the discharge nozzle toward a main surface of the substrate. In the imaging step, an imaging region, which is set so as to include the discharge nozzle and at least part of the processing liquid discharged in the processing liquid discharge step, is imaged by a camera over a period including at least part of the processing liquid discharge step, and a plurality of pieces of image data are acquired. In the teacher data generation step, for at least some of the plurality of pieces of image data, a label relating to the discharge state of the processing liquid captured in the image data is assigned to the image data, and teacher data is generated.
PCT/JP2019/032550 2018-08-27 2019-08-21 Teacher data generation method and discharge state determination method WO2020045176A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020217004602A KR102524704B1 (ko) 2018-08-27 2019-08-21 Teacher data generation method and discharge state determination method
CN201980055179.1A CN112601617A (zh) 2018-08-27 2019-08-21 Teacher data generation method and discharge state determination method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018-158343 2018-08-27
JP2018158343 2018-08-27
JP2019-149428 2019-08-16
JP2019149428A JP7219190B2 (ja) 2018-08-27 2019-08-16 Teacher data generation method and discharge state determination method

Publications (1)

Publication Number Publication Date
WO2020045176A1 true WO2020045176A1 (fr) 2020-03-05

Family

ID=69644398

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/032550 WO2020045176A1 (fr) Teacher data generation method and discharge state determination method 2018-08-27 2019-08-21

Country Status (1)

Country Link
WO (1) WO2020045176A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11176734A (ja) * 1997-12-16 1999-07-02 Sony Corp Resist coating apparatus
WO2016098495A1 (fr) * 2014-12-15 2016-06-23 SCREEN Holdings Co., Ltd. Image classification device and image classification method
JP2016122681A (ja) * 2014-12-24 2016-07-07 SCREEN Holdings Co., Ltd. Substrate processing apparatus and substrate processing method
JP2016178238A (ja) * 2015-03-20 2016-10-06 Tokyo Electron Ltd. Method of adjusting chemical liquid supply device, storage medium, and chemical liquid supply device
JP2017173098A (ja) * 2016-03-23 2017-09-28 SCREEN Holdings Co., Ltd. Image processing apparatus and image processing method


Similar Documents

Publication Publication Date Title
JP7177628B2 (ja) Substrate processing method, substrate processing apparatus, and substrate processing system
KR102522968B1 (ko) Substrate processing apparatus and substrate processing method
TWI778290B (zh) Teacher data generation method and discharge state determination method
TWI743522B (zh) Substrate processing method, substrate processing apparatus, and substrate processing system
TWI777463B (zh) Substrate processing method and substrate processing apparatus
WO2020071206A1 (fr) Substrate processing apparatus and substrate processing method
JP2016122681A (ja) Substrate processing apparatus and substrate processing method
JP2021190511A (ja) Substrate processing method and substrate processing apparatus
JP2015173204A (ja) Substrate processing apparatus and substrate processing method
TW201929112A (zh) Substrate processing apparatus and substrate processing method
WO2020045176A1 (fr) Teacher data generation method and discharge state determination method
JP7157629B2 (ja) Substrate processing apparatus and substrate processing method
JP7202106B2 (ja) Substrate processing method and substrate processing apparatus
JP2021044467A (ja) Detection device and detection method
CN112490167A (zh) Substrate processing apparatus and substrate processing method
CN112509940B (zh) Substrate processing apparatus and substrate processing method
TWI831236B (zh) State detection device and state detection method
WO2024084853A1 (fr) Position determination method and position determination device
JP2023137511A (ja) Substrate processing apparatus and monitoring method
JP2020061405A (ja) Substrate processing method and substrate processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19853368

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20217004602

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19853368

Country of ref document: EP

Kind code of ref document: A1