WO2016088260A1 - Charged particle beam device, chamber scope, and target object detection method - Google Patents


Info

Publication number
WO2016088260A1
WO2016088260A1 (PCT/JP2014/082292, JP2014082292W)
Authority
WO
WIPO (PCT)
Prior art keywords
light
sample
stage
light source
image
Prior art date
Application number
PCT/JP2014/082292
Other languages
English (en)
Japanese (ja)
Inventor
千葉 寛幸
達也 平戸
中村 光宏
Original Assignee
株式会社日立ハイテクノロジーズ (Hitachi High-Technologies Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi High-Technologies Corporation (株式会社日立ハイテクノロジーズ)
Priority to PCT/JP2014/082292 (WO2016088260A1)
Priority to JP2016562177A (JP6335328B2)
Publication of WO2016088260A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01JELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J37/00Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
    • H01J37/02Details
    • H01J37/20Means for supporting or positioning the objects or the material; Means for adjusting diaphragms or lenses associated with the support
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01JELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J37/00Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
    • H01J37/26Electron or ion microscopes; Electron or ion diffraction tubes
    • H01J37/28Electron or ion microscopes; Electron or ion diffraction tubes with scanning beams
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/02Manufacture or treatment of semiconductor devices or of parts thereof
    • H01L21/027Making masks on semiconductor bodies for further photolithographic processing not provided for in group H01L21/18 or H01L21/34

Definitions

  • The present invention relates to a charged particle beam apparatus, a chamber scope, and an object detection method, and in particular to an observation technique in a charged particle beam apparatus.
  • A charged particle beam apparatus performs observation using a charged particle beam; the detection target (sample stage) must be placed in a vacuum to prevent scattering of the charged particles. The sample therefore cannot be confirmed visually, and methods are known that use an optical camera to check the inside of the sample chamber.
  • In such a method, an optical camera is attached as a means of identifying the state and location of a sample installed in the sealed, evacuated sample chamber, and the camera image is displayed on an external monitor in real time for confirmation.
  • Displaying the inside of the sample chamber on an external monitor and confirming it visually, however, forces the operator to take their eyes off the charged-particle image that is the actual object of observation. With this approach, the observation image and the external monitor showing the sample chamber cannot be checked at the same time.
  • Furthermore, because both the detection target and the sample chamber forming its background are made of metal, the entire image shines brightly, and it is difficult to automatically extract only the detection target from the acquired image alone.
  • Patent Documents 1 and 2 disclose methods of automatically detecting a detection target by irradiating its edge portion with light and photographing the irradiation light with a camera installed on the side of the detection target opposite the light source.
  • Although Patent Document 1 attempts to automatically detect an edge portion of a sample, it is intended for a thin edge, and light must pass through part of the sample.
  • Patent Document 2 deals with the case where light does not pass through the sample, but it requires installing the detection target (sample stage) between the sample and the LED light source. The usable cameras are therefore limited to special ones, and the viewing angle of the camera with respect to the edge is restricted. The method consequently cannot be applied to the sample chamber of a charged particle beam apparatus, where installation locations are limited, nor to objects without an edge portion (for example, an observation location inside the sample), and so lacks generality.
  • The present invention has been made in view of this situation, and provides a technology that enables a detection target to be automatically distinguished and identified without restricting the position and orientation of the camera and the detection target or the type of camera, and regardless of whether light passes through the sample.
  • In the present invention, light is irradiated from a light source onto a detection target placed on a stage provided in the sample chamber of the charged particle beam apparatus, and the illuminated detection target is imaged with a camera against the background of the sample chamber.
  • a processor processes the image of the detection target imaged with the camera.
  • Here, the detection target and the sample chamber are made of different materials, and the wavelength of the light from the light source is set so that the reflectance of the detection target differs from that of the sample chamber.
  • With this arrangement, the position and state of the detection target can be detected automatically without restricting the position and orientation of the camera and the detection target, or the type of camera.
  • FIG. 1 is a diagram showing a schematic configuration example of the charged particle beam apparatus according to an embodiment of the present invention. FIG. 2 is a diagram showing an example arrangement of light sources in the chamber scope according to the embodiment. FIG. 3 is a diagram showing a configuration example for automatically detecting the position of an object in the sample chamber of the charged particle beam apparatus. FIG. 4 is a diagram showing the relationship between wavelength and reflectance for aluminum and iron. FIG. 5 is a diagram showing an example line profile in the Y direction at the X-direction center coordinate. FIG. 6 is a diagram showing the coordinate directions in the acquired image. FIG. 7 is a flowchart explaining the process of obtaining the coordinate value of the sample stage from the acquired profile.
  • FIG. 2 is a diagram showing a structure of a sample holder 1001 placed on a stage 105 and a sample stage 104 placed on the sample holder 1001.
  • FIG. 15 is a diagram illustrating a configuration example of a UI (User Interface) operation screen displayed in order to avoid a collision between the upper part of the sample stage 104 and the detector or objective lens in the charged particle beam apparatus (scanning electron microscope). FIG. 16 is a diagram showing the relationship between the wavelength of the laser (light source) and the reflectance for each material.
  • the embodiment of the present invention may be implemented by software running on a general-purpose computer, or may be implemented by dedicated hardware or a combination of software and hardware.
  • FIG. 1 is a diagram illustrating a schematic configuration example of a charged particle beam apparatus according to an embodiment of the present invention.
  • The charged particle beam apparatus includes a sample chamber 101 maintained in a vacuum so that the charged particle beam is not scattered, an objective lens 102 that finally converges the charged particle beam onto the sample, a detector 103 that detects a signal obtained from the sample, a sample stage 104 on which the sample is placed, a stage 105 for mounting the sample stage 104 and moving the observation position, a chamber scope 106 for grasping the position of the sample stage 104, a control device 109 (a processor such as a microcomputer) for controlling the light source (LED) of the chamber scope, and an arithmetic device (computer) 110 that executes various calculations.
  • the chamber scope 106 has an illumination 107 for illuminating the inside of the sample chamber 101 and an optical camera 108 for taking an image in the sample chamber 101 as a configuration for detecting the sample position.
  • the control device 109 has an LED control unit 1091 for performing lighting control or light amount control of the LED illumination.
  • the LED control unit 1091 may be configured by a program.
  • The arithmetic device 110 is configured as a computer, and includes a CPU (processor) 1101 that executes various programs, a memory 1102 that stores the programs, an input/output device 1103 such as a mouse, keyboard, display, and printer, and a storage device 1104 that stores various parameters and the like.
  • the memory 1102 includes, as programs, a camera image acquisition unit 11021 that acquires an image of an optical camera, an image processing unit 11022 that processes the captured image, an arithmetic processing unit 11023 that specifies a position from the processed image, and a processing result And a result output unit 11024 for outputting.
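The cooperation of the program units held in the memory 1102 (camera image acquisition unit 11021, image processing unit 11022, arithmetic processing unit 11023, result output unit 11024) can be sketched as the following minimal pipeline. This is an illustrative assumption, not the patent's implementation: the function names and the synthetic test image are invented for the sketch.

```python
import numpy as np

def acquire_camera_image(height=64, width=64):
    """Stand-in for the camera image acquisition unit 11021: returns a
    synthetic grayscale frame with a dim chamber wall and a bright band
    standing in for the sample stage."""
    img = np.full((height, width), 30, dtype=np.uint8)  # dim chamber wall
    img[40:, :] = 200                                   # bright sample stage
    return img

def process_image(img, threshold=128):
    """Stand-in for the image processing unit 11022: simple binarization."""
    return (img >= threshold).astype(np.uint8)

def locate_target(binary):
    """Stand-in for the arithmetic processing unit 11023: returns the first
    image row that is mostly bright (the stage's upper edge)."""
    row_means = binary.mean(axis=1)
    ys = np.nonzero(row_means > 0.5)[0]
    return int(ys[0]) if ys.size else None

frame = acquire_camera_image()
top_y = locate_target(process_image(frame))
print(top_y)  # row index of the stage's upper edge: 40
```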
  • FIG. 2 is a diagram illustrating an arrangement example of light sources of the chamber scope according to the embodiment of the present invention.
  • FIG. 2 shows the chamber scope 106 viewed from the direction of the arrow 111 (FIG. 1).
  • The camera (a CMOS or CCD camera) 201 and the plurality of light sources 202 to 204 are installed such that the imaging direction of the camera 201 and the light irradiation direction of the light sources are the same.
  • As a combination of a plurality of light sources, a combination of a white LED 202, a blue LED 203, and an infrared LED 204 can be given.
  • the light sources of the white LED 202, the blue LED 203, and the infrared LED 204 can be switched according to an operator's instruction.
  • the operator inputs a light source switching instruction using an input device (not shown) provided in the control device 109, and the LED control unit 1091 emits any LED in response to the instruction.
  • The infrared LED is used because its emission wavelength lies outside the detection sensitivity of the SE (secondary electron) detector, which makes simultaneous observation of the SE image and the chamber scope easy.
  • a red LED may be used instead of the infrared LED.
  • FIG. 3 is a diagram explaining the principle by which the state and position of the detection target 301 (corresponding to the sample stage 104 in FIG. 1) are automatically grasped (detected) in the charged particle beam apparatus having the above configuration.
  • The control device 109 turns on the illumination 107 of the chamber scope 106 in response to an instruction from the operator, irradiating the detection target 301 and the background material 302 with light. For the illumination 107, light with a wavelength chosen so that the reflectance differs between the detection target 301 and the background material 302 is selected. In this state, an image is acquired with the optical camera 108, and the CPU 1101 of the arithmetic device 110 captures the image using the camera image acquisition unit 11021. The CPU 1101 then processes the captured image using the image processing unit 11022.
  • Processing such as applying an edge enhancement filter, binarizing the acquired image, converting the image to grayscale, or extracting colors may be performed.
  • image processing such as pattern matching may be used to improve the accuracy.
  • The CPU 1101 then performs an operation to obtain the coordinates of the detection target 301 from the processed image. For example, the CPU 1101 may extract the boundary between the detection target 301 and the background material 302 by finding the coordinate value with the highest luminance in the edge-enhanced image produced by the image processing unit 11022, and output that boundary as the detection target coordinates.
  • Alternatively, the edge-enhanced image may be integrated in the X direction or the Y direction (note that this is the coordinate system of the image processing, not of the sample stage in the charged particle beam apparatus), and the position with the highest luminance in the resulting profile may be used as the detection target coordinates. This may be done only in the X direction, only in the Y direction, or in both. Such processing need not be applied to the edge-enhanced image; it may also be applied to grayscale or color-extracted images.
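The profile method above can be sketched in a few lines: sum an edge-enhanced image along one axis and take the peak of the resulting profile as the boundary coordinate. The image here is synthetic, for illustration only.

```python
import numpy as np

# Synthetic edge-enhanced image with a horizontal edge at row y = 5.
edge = np.zeros((8, 10))
edge[5, :] = 1.0

profile_y = edge.sum(axis=1)      # integrate along X -> profile over Y
boundary_y = int(np.argmax(profile_y))
print(boundary_y)  # 5
```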
  • region division may be performed on the binarized image, the area of the region may be obtained, and the coordinates of the region closest to the area of the target to be obtained may be output.
  • the center of gravity of a white area obtained simply by binarization may be output as a coordinate value without dividing the area by appropriately setting a threshold value.
  • coordinates obtained by pattern matching processing may be output.
  • the CPU 1101 may simply output a coordinate value by using the result output unit 11024, or may display it superimposed on the acquired image.
  • The evaluation value used may also be output together, as information on the reliability of the result.
  • As the illumination 107, illumination with a wavelength such that the reflectance differs between the wall of the sample chamber 101 and the sample stage 104 is selected.
  • Specifically, the color of the illumination 107 is selected based on the relationship between wavelength and reflectance for aluminum (Al) and iron (Fe) shown in FIG. 4. FIG. 4 shows that, for the combination of aluminum and iron, selecting short-wavelength blue illumination 107 gives different reflectances for the two metals, so a luminance difference arises between the materials. In this example, therefore, adopting blue light as the light source yields an image in which only the sample stage 104 appears bright.
  • The illumination 107 need not be blue light; the wavelength may be selected so that the reflectance of the sample stage (detection target) 104 differs from that of the sample chamber (background material) 101. Other combinations of detection target, background material, and light source (illumination color) that brighten only the detection target are described later.
  • the image acquired in this way is converted into a grayscale image by the image processing unit 11022.
  • This conversion is not limited to grayscale conversion, and an image obtained by extracting each color may be used, or an image obtained by extracting an edge may be used.
  • the arithmetic processing unit 11023 acquires a line profile in the Y direction of the center coordinate in the X direction.
  • FIG. 5 is a diagram illustrating an example of the acquired luminance profile. Using this profile, the position of the sample stage 104 can easily be specified: the location where the luminance exceeds a predetermined threshold is identified as the upper end of the sample stage 104, from which the distance from the lens aperture to the sample stage 104 can be determined.
  • the X direction and the Y direction are coordinate directions in the acquired image as shown in FIG. 6, and are different from the coordinates of the charged particle beam apparatus.
  • the Y direction (Y coordinate) in FIG. 6 corresponds to the Z coordinate of the charged particle beam apparatus (the same applies to the Z coordinate in FIG. 5). Further, this profile may be displayed on a display included in the input / output device 1103.
  • FIG. 7 is a flowchart for explaining processing for obtaining the coordinate value of the sample stage from the acquired profile.
  • Step 701 In response to the operator's instruction, the control device 109 irradiates the sample chamber 101 with blue light using the LED control unit 1091.
  • Step 702 The CPU 1101 of the arithmetic device 110 acquires an image of the target (sample stage 104 and sample chamber 101) irradiated with blue light using the camera image acquisition unit 11021, and acquires the image acquired using the image processing unit 11022, for example. Convert to a grayscale image. Then, the CPU 1101 uses the arithmetic processing unit 11023 to obtain a profile at the X-axis center (fixed Y coordinate) for the image converted to grayscale.
  • Step 703 The CPU 1101 uses the arithmetic processing unit 11023 to calculate the differential value of the profile acquired in step 702.
  • Step 704 The CPU 1101 uses the arithmetic processing unit 11023 to determine whether the differential value obtained in step 703 is equal to or greater than a preset (predetermined) threshold value. If the differential value of the profile is less than the predetermined threshold (No in step 704), the process proceeds to step 705. If it is equal to or greater than the predetermined threshold (Yes in step 704), the process proceeds to step 706.
  • Step 705 The CPU 1101 moves the profile acquisition position to the next Y coordinate. Then, the processing in steps 702 to 704 is repeated.
  • Step 706 Using the arithmetic processing unit 11023, the CPU 1101 sets the current Y coordinate as the upper end of the sample stage 104, and ends the sample stage height automatic detection process.
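Steps 701 to 706 can be sketched as a loop that walks down the luminance profile taken at the image's X-center and reports the first Y coordinate where the derivative reaches the threshold (the upper end of the sample stage). The profile values and threshold below are synthetic, purely for illustration.

```python
import numpy as np

# Synthetic luminance profile along Y at the X-center of the image:
# dim chamber wall, then a jump to the bright sample stage.
profile = np.array([10, 11, 10, 12, 11, 90, 95, 96], dtype=float)
threshold = 30.0  # illustrative "predetermined threshold"

def find_stage_top(profile, threshold):
    """Scan Y coordinates (steps 702-705) until the profile's differential
    value meets the threshold (step 704); that Y is the stage top (step 706)."""
    for y in range(len(profile) - 1):
        if profile[y + 1] - profile[y] >= threshold:
            return y + 1
    return None  # edge never found

print(find_stage_top(profile, threshold))  # 5
```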
  • The coordinate value obtained in this way can be used, for example, to limit the operation of the stage. Specifically, when there is a possibility that the sample stage 104 will collide with the detector or the objective lens, it can serve as a safety function that automatically stops the movement of the stage 105. In addition, since the distance between the objective lens and the sample can be determined from the height of the sample stage 104, the focus position of the charged particle beam apparatus (scanning electron microscope) can be roughly set, which speeds up automatic focusing.
  • FIG. 8 is a diagram showing a charged particle beam apparatus (scanning electron microscope) including a configuration example for automatically detecting the orientation of the sample stage.
  • a detection marker 801 made of a material having a reflectance different from that of the sample stage 104 is attached to the side surface of the sample stage 104.
  • the sample stage 104 is made of aluminum, and a screw hole is cut in the side surface of the sample stage 104. Then, a screw made of iron is attached to this screw hole as the detection marker 801.
  • the sample stage 104 may be made of iron and the screw may be made of aluminum.
  • the detection marker attached to the side surface does not need to be a screw.
  • Alternatively, a hole may be made in the sample stage 104 and a pin or the like inserted. As a result, only the detection marker 801 appears black within the brightly displayed sample stage 104, so it can easily be determined whether the marker 801 is at the front.
  • FIG. 9 is a flowchart for explaining a process of automatically detecting the direction (front) of the sample stage 104 (aligning the detection marker 801 with the front of the chamber scope 106).
  • Step 901 In response to the operator's instruction, the control device 109 irradiates the sample chamber 101 with blue light using the LED control unit 1091.
  • Step 902 The CPU 1101 of the arithmetic device 110 uses the camera image acquisition unit 11021 to acquire an image of the target (the sample stage 104 and the detection marker 801) irradiated with blue light.
  • Step 903 The CPU 1101 uses the arithmetic processing unit 11023 to determine whether the image acquired in step 902 contains the marker 801. For example, the image of the detection marker 801 appears in the captured image when the sample stage 104 faces the front, but when the detection marker 801 is on the side opposite the chamber scope 106, it does not appear in the captured image. When the detection marker 801 is not directly at the front, its image is distorted compared with the frontal view; for example, a circular detection marker appears as an ellipse.
  • In this case, the detection marker 801 is judged to be detected when the aspect ratio of the ellipse is equal to or greater than a predetermined value.
  • Alternatively, an image obtained when the detection marker 801 is at the front may be stored in the storage device 1104 in advance, and the presence of the detection marker 801 confirmed by comparing the stored image with the image acquired in step 902.
  • The presence of the detection marker 801 may also be confirmed from the luminance of the image captured in step 902. In this case, the average luminance of each region is taken and a histogram is created. Since the luminance of the region containing the detection marker 801 is lower than that of the other regions, it takes a low value in the histogram; a region whose histogram value is lower than a predetermined value may be determined to be the position of the detection marker 801.
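The luminance-based marker check described above can be sketched as follows: split the image into vertical strips, take the mean luminance of each, and treat a strip whose mean falls below a preset value as the position of the dark detection marker 801. The image, strip count, and threshold are synthetic illustrations, not values from the patent.

```python
import numpy as np

img = np.full((20, 40), 200, dtype=float)  # bright aluminum stage
img[:, 16:24] = 40                         # dark iron marker in the middle

n_strips = 5
strips = np.split(img, n_strips, axis=1)   # 5 vertical strips of width 8
means = [s.mean() for s in strips]         # per-region average luminance

marker_strip = int(np.argmin(means))
found = means[marker_strip] < 100          # illustrative predetermined value
print(marker_strip, found)  # 2 True
```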
  • Step 904 The CPU 1101 notifies the control device 109 that the detection marker 801 has not been detected. Then, the control device 109 rotates the stage 105 using the stage control unit 802 and moves to the next rotation angle. Then, the processing of steps 902 and 903 is repeated.
  • Step 905 The CPU 1101 ends the profile acquisition, and ends the automatic detection process for the sample stage with the current rotation angle as the front.
  • FIG. 10 is a diagram showing the structure of the sample holder 1001 placed on the stage 105 and the sample stage 104 placed on the sample holder 1001.
  • the sample holder 1001 has a stage attachment portion 10011 for attaching the sample holder 1001 to the stage 105 and a height adjusting screw 10012 for adjusting the height of the sample holder 1001.
  • The height of the sample holder 1001 is adjusted with the height adjusting screw 10012. Therefore, when the sample stage 104 is removed from the sample holder 1001, the height is readjusted, and the sample is observed again, the previous observation position (stage rotation angle) and the current observation position (stage rotation angle) may differ, as shown in FIG. 11.
  • In FIG. 11, a dotted-line rectangle 11002 indicates the previous observation position. By storing stage coordinates and magnification information in the storage device 1104, the previous observation position can be reproduced and the same field of view as in the previous observation can be acquired again. The sample 11001 at the previous observation position can thus be observed again at a higher magnification, or under different optical conditions.
  • FIG. 14 is a flowchart for explaining processing for automatically detecting the outer diameter of the sample stage 104 in the charged particle beam apparatus (scanning electron microscope).
  • Step 1401 In response to the operator's instruction, the control device 109 irradiates the sample chamber 101 with blue light using the LED control unit 1091.
  • Step 1402 The CPU 1101 executes the automatic detection processing (steps 902 to 905) for the sample stage in FIG. 9, and moves the position of the detection marker 801 to the front.
  • Step 1403 The control device 109 uses the stage control unit 802 to rotate the stage 105 by an arbitrary angle.
  • the rotation angle may be determined in advance.
  • Step 1404 The CPU 1101 measures the amount of movement of the detection marker 801 using the arithmetic processing unit 11023. This is because the amount of movement of the detection marker 801 varies depending on the outer diameter size of the sample stage 104.
  • Step 1405 The CPU 1101 uses the image processing unit 11022 to calculate the outer diameter of the sample stage 104 by image processing from the movement amount obtained in step 1404, and ends the automatic detection process of the sample stage outer diameter.
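One simplifying model of the geometry in steps 1403 to 1405 (an assumption of this sketch, not spelled out in the text) is that the marker sits on the stage's outer circumference, so the measured movement is the chord traced when the stage rotates by a known angle: chord = 2r·sin(θ/2), hence outer diameter = chord / sin(θ/2). All numeric values below are illustrative.

```python
import math

def outer_diameter(chord_mm, rotation_deg):
    """Recover the stage's outer diameter from the marker's movement (chord)
    and the known rotation angle, under the circumference-marker assumption."""
    half = math.radians(rotation_deg) / 2.0
    return chord_mm / math.sin(half)

# A stage of diameter 30 mm rotated by 20 degrees moves the marker by
# 30 * sin(10 deg) mm; inverting recovers the diameter.
chord = 30.0 * math.sin(math.radians(10.0))
print(round(outer_diameter(chord, 20.0), 3))  # 30.0
```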
  • FIG. 15 is a diagram showing a configuration example of a UI (User Interface) operation screen displayed to avoid a collision between the upper part of the sample stage 104 and the detector or objective lens in the charged particle beam apparatus (scanning electron microscope).
  • On the UI operation screen of FIG. 15, displayed on a display (not shown), the operator can adjust the position of the upper limit line 1502 by moving the upper limit line adjustment unit 1503 up and down with a mouse or the like. This limits the upper limit of the sample stage 104.
  • a scale display 1501 that can intuitively display the distance from the detector or the objective lens may be displayed. Furthermore, the area 1504 where the sample stage 104 cannot enter may be masked and displayed.
  • FIG. 16 is a diagram showing the relationship between the wavelength of the laser (light source) and the reflectance for each material. According to this graph and / or experiments conducted by the inventors, it was found that the following combinations are possible.
  • As described above, the charged particle beam apparatus according to the embodiment acquires an image of the detection target (sample stage) placed on the stage and performs predetermined image processing on it, thereby detecting the state and position (including height) of the detection target.
  • the detection target and the sample chamber are made of different materials.
  • the wavelength of the light emitted from the light source that irradiates the detection target with light is set so that the reflectance of the detection target is different from the reflectance of the sample chamber.
  • the wavelength of the light of the light source is set so that the reflectance of the detection target is higher than the reflectance of the sample chamber.
  • the luminance of the detection target can be made higher than that of the background, so that the state and position of the detection target can be easily and automatically detected.
  • a position where the luminance of the image of the sample table is higher than a predetermined threshold is detected as the height of the sample table.
  • the position where the luminance of the image of the sample table is lower than the predetermined threshold is detected as the height of the sample table. Since the height can be detected by the image processing in this way, it is possible to automatically grasp the height position of the detection target (sample stage).
  • The orientation (angle) of the sample stage can also be detected automatically. That is, a marker whose reflectance under the light of the light source differs from that of the sample stage is provided at a predetermined position on the sample stage. An image of the sample stage and the marker is acquired with the camera, and the orientation of the sample stage is detected from the luminance difference between them.
  • The relationship between the orientation of the sample stage (angle from a reference position) and the shape of the marker image may be stored in advance in a table or the like, so that not only the frontal state of the sample stage but also its state at an arbitrary angle can be detected.
  • the amount of movement of the marker when the stage is rotated is measured, and the size of the sample stage (outer diameter and area of the sample stage) is calculated from the amount of movement.
  • the shape of the pedestal portion of the sample stage is not limited to a circle, and may be a square, a rectangle, or other polygons. However, it is necessary to recognize the relative positional relationship between the rotation angle and the sample stage.
  • the detection target is made of aluminum or silver
  • the sample chamber is made of iron, copper, or gold.
  • When the detection target/sample chamber combination is aluminum-iron, aluminum-copper, or silver-copper, a light source that emits blue, green, or ultraviolet light is used; when the combination is aluminum-gold, a light source that emits ultraviolet light is used.
  • When the detection target/sample chamber combination is iron-silver, a light source that emits light with a wavelength of 400 nm or more (ultraviolet, blue, green, yellow, orange, red, or infrared light) can be used.
  • When the detection target/sample chamber combination is iron-aluminum, a light source that emits light with a wavelength of 400 nm to 800 nm (blue, green, yellow, orange, or red light) can be used.
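The selection rule behind these combinations can be sketched as picking the color whose reflectance differs most between the detection target and the chamber material. The reflectance numbers below are rough illustrative values invented for this sketch, not the measured curves of FIG. 16.

```python
# Illustrative reflectance table: material -> {color: approximate reflectance}.
REFLECTANCE = {
    "aluminum": {"blue": 0.92, "green": 0.91, "red": 0.88, "infrared": 0.95},
    "iron":     {"blue": 0.52, "green": 0.55, "red": 0.58, "infrared": 0.65},
}

def best_color(target, background):
    """Return the illumination color maximizing |R_target - R_background|,
    i.e. the color giving the largest luminance difference in the image."""
    colors = REFLECTANCE[target].keys()
    return max(colors,
               key=lambda c: abs(REFLECTANCE[target][c] - REFLECTANCE[background][c]))

print(best_color("aluminum", "iron"))  # blue
```

With these illustrative numbers the rule picks blue for an aluminum stage in an iron chamber, matching the choice described for FIG. 4.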
  • the chamber scope according to the embodiment of the present invention has a plurality of light sources for irradiating the inspection target with light and a control device for switching the light sources.
  • a plurality of light sources for example, a light source that emits blue light, a light source that emits white light, and a light source that emits infrared light are employed.
  • Blue light is used to acquire position information (height and position coordinates) of the detection target, white light is used to acquire color information, and infrared light is used to observe the sample. Since the emission wavelength of the infrared light lies outside the detection sensitivity of the SE (secondary electron) detector, the SE image and the chamber scope can be observed simultaneously. Red light may be used instead of infrared light for observation.
  • the direction of the camera and the direction of the plurality of light sources are the same.
  • the functions according to the embodiment of the present invention can also be realized by software program codes.
  • a storage medium in which the program code is recorded is provided to the system or apparatus, and the computer (or CPU or MPU) of the system or apparatus reads the program code stored in the storage medium.
  • the program code itself read from the storage medium realizes the functions of the above-described embodiments, and the program code itself and the storage medium storing the program code constitute the present invention.
  • As the storage medium for supplying such program code, for example, a flexible disk, CD-ROM, DVD-ROM, hard disk, optical disk, magneto-optical disk, CD-R, magnetic tape, nonvolatile memory card, or ROM is used.
  • An OS (operating system) or the like running on the computer may also perform part or all of the actual processing based on the instructions of the program code, and the functions of the above-described embodiments may be realized by that processing.
  • the program code may be stored in storage means such as a hard disk or memory of the system or apparatus, or on a storage medium such as a CD-RW or CD-R, and the computer (or CPU or MPU) of the system or apparatus may read and execute the stored program code when it is used.
  • the control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines of the product are necessarily shown. In practice, almost all of the components may be mutually connected.
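As a toy illustration of the light-source switching described in the bullets above, the task-to-illumination mapping could be sketched as a small dispatch table. The class and function names below are hypothetical and not taken from the patent; this is only a sketch of the described control scheme (blue for position, white for color, infrared for observation).

```python
from enum import Enum

class Task(Enum):
    """Measurement tasks the chamber scope performs."""
    POSITION = "position"        # height and position coordinates
    COLOR = "color"              # color information of the sample
    OBSERVATION = "observation"  # live viewing alongside the SE image

# Task-to-illumination mapping following the roles listed above.
LIGHT_FOR_TASK = {
    Task.POSITION: "blue",
    Task.COLOR: "white",
    Task.OBSERVATION: "infrared",
}

class LightSourceController:
    """Toy controller that switches among the mounted light sources."""

    def __init__(self, mapping=LIGHT_FOR_TASK):
        self.mapping = mapping
        self.active = None

    def select(self, task):
        """Activate the light source appropriate for the given task."""
        self.active = self.mapping[task]
        return self.active

ctrl = LightSourceController()
light = ctrl.select(Task.POSITION)  # activates the blue light source
```

Because observation uses infrared (outside the SE detector's sensitivity), switching to `Task.OBSERVATION` would not disturb simultaneous SE imaging.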

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)

Abstract

This invention provides a charged particle beam device in which an object to be detected can be automatically distinguished and identified without restricting the type of camera or the position and orientation of the camera and the object to be detected, and regardless of whether light is transmitted through a sample. Above a stage (105) disposed inside a sample chamber (101) of the charged particle beam device, a detection target such as a sample stage (104) is irradiated with light from a light source (107), and the detection target is captured by a camera (108) with the sample chamber (101) as the background. A processor (1101) then processes the image of the detection target captured by the camera (108). The detection target and the sample chamber (101) are made of different materials, and the wavelength of the light from the light source (107) is set so that the reflectance of the detection target differs from the reflectance of the sample chamber (101).
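The detection principle summarized in the abstract — choose an illumination wavelength at which the target's reflectance differs from the chamber wall's, then locate the target in the camera image — can be sketched as a simple intensity-threshold segmentation. This is a minimal illustration on synthetic data; the patent does not specify this particular algorithm, and the function name and threshold are assumptions.

```python
import numpy as np

def segment_target(image, threshold=128):
    """Separate a detection target from the chamber background.

    Assumes the illumination wavelength was chosen so that the target's
    reflectance differs clearly from the chamber wall's, making a plain
    intensity threshold sufficient. Returns a boolean mask of target
    pixels and their centroid as (row, col), or None if none are found.
    """
    mask = image > threshold
    if not mask.any():
        return mask, None
    rows, cols = np.nonzero(mask)
    return mask, (rows.mean(), cols.mean())

# Synthetic frame: dark (low-reflectance) chamber, bright sample stage.
frame = np.full((100, 100), 40, dtype=np.uint8)
frame[30:50, 60:80] = 200
mask, centroid = segment_target(frame)
```

In a real system the centroid would feed into stage-coordinate calibration; here it simply marks where the high-reflectance region sits in the camera frame.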
PCT/JP2014/082292 2014-12-05 2014-12-05 Charged particle beam device, and chamber scope and target object detection method WO2016088260A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2014/082292 WO2016088260A1 (fr) Charged particle beam device, and chamber scope and target object detection method
JP2016562177A JP6335328B2 (ja) Charged particle beam device, chamber scope, and object detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/082292 WO2016088260A1 (fr) Charged particle beam device, and chamber scope and target object detection method

Publications (1)

Publication Number Publication Date
WO2016088260A1 true WO2016088260A1 (fr) 2016-06-09

Family

ID=56091232

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/082292 WO2016088260A1 (fr) Charged particle beam device, and chamber scope and target object detection method

Country Status (2)

Country Link
JP (1) JP6335328B2 (fr)
WO (1) WO2016088260A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019020156A (ja) * 2017-07-12 2019-02-07 HORIBA, Ltd. Radiation detection device, radiation detection method, and computer program
WO2020183596A1 (fr) * 2019-03-12 2020-09-17 Hitachi High-Tech Corporation Charged particle beam device
JP2022068150A (ja) * 2018-03-14 2022-05-09 Fuji Corporation Imaging unit
US11545334B2 (en) 2018-08-02 2023-01-03 Hitachi High-Tech Corporation Charged particle beam device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6954943B2 (ja) 2019-03-15 2021-10-27 JEOL Ltd. Charged particle beam device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012018811A (ja) * 2010-07-08 2012-01-26 Keyence Corp Magnification observation device, magnification observation method, magnification observation program, and computer-readable recording medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011003480A (ja) * 2009-06-22 2011-01-06 Hitachi High-Technologies Corp SEM-type visual inspection device and image signal processing method therefor
JP5047318B2 (ja) * 2010-03-05 2012-10-10 Hitachi High-Technologies Corp Method for superimposing and displaying an electron microscope image and an optical image

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012018811A (ja) * 2010-07-08 2012-01-26 Keyence Corp Magnification observation device, magnification observation method, magnification observation program, and computer-readable recording medium

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019020156A (ja) * 2017-07-12 2019-02-07 HORIBA, Ltd. Radiation detection device, radiation detection method, and computer program
JP2022068150A (ja) * 2018-03-14 2022-05-09 Fuji Corporation Imaging unit
JP7271738B2 (ja) 2018-03-14 2023-05-11 Fuji Corporation Imaging unit
US11545334B2 (en) 2018-08-02 2023-01-03 Hitachi High-Tech Corporation Charged particle beam device
WO2020183596A1 (fr) * 2019-03-12 2020-09-17 Hitachi High-Tech Corporation Charged particle beam device
JPWO2020183596A1 (ja) * 2019-03-12 2021-10-21 Hitachi High-Tech Corporation Charged particle beam device
JP7065253B2 (ja) 2019-03-12 2022-05-11 Hitachi High-Tech Corporation Charged particle beam device
US11387072B2 (en) 2019-03-12 2022-07-12 Hitachi High-Tech Corporation Charged particle beam device

Also Published As

Publication number Publication date
JPWO2016088260A1 (ja) 2017-08-17
JP6335328B2 (ja) 2018-05-30

Similar Documents

Publication Publication Date Title
JP6335328B2 (ja) Charged particle beam device, chamber scope, and object detection method
TWI558997B Defect observation method and device therefor
US10746763B2 (en) Apparatus and method for diagnosing electric power equipment using thermal imaging camera
JP6425755B2 (ja) Method for inspecting foreign matter on a substrate
JP4691453B2 (ja) Defect display method and device therefor
US9759662B2 (en) Examination device and examination method
JP2010112941A (ja) Surface inspection device
US20120326033A1 (en) Method for superimposing and displaying electron microscope image and optical image
JP2018039028A (ja) Laser processing machine and laser processing method
US9658175B2 (en) X-ray analyzer
JP6697285B2 (ja) Wafer defect inspection device
JP2014523545A (ja) Microscope system and method for biological imaging
JP2015040796A (ja) Defect detection device
CN111801545B Wire shape inspection device and wire shape inspection method
JP2009128303A (ja) Substrate visual inspection device
US10446359B2 (en) Charged particle beam device
JP5401005B2 (ja) Template matching method and scanning electron microscope
JP2014224686A (ja) Fine particle detection device and fine particle detection method
JP2009236760A (ja) Image detection device and inspection device
JP5379571B2 (ja) Pattern inspection device and pattern inspection method
US9964500B2 (en) Defect inspection device, display device, and defect classification device
JP2013015389A (ja) Method and device for inspecting welding position
JP7198360B2 (ja) Charged particle beam device
JP2016157720A (ja) Bonding wire detection method and bonding wire detection device
JP4389761B2 (ja) Solder inspection method and substrate inspection device using the method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14907300

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016562177

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14907300

Country of ref document: EP

Kind code of ref document: A1