NL1039086A - Particle beam microscope and method for operating the particle beam microscope. - Google Patents
- Publication number
- NL1039086A
- Authority
- NL
- Netherlands
- Prior art keywords
- surface model
- microscope
- image
- dependence
- particle beam
- Prior art date
Classifications
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01J—ELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
- H01J37/00—Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
- H01J37/20—Means for supporting or positioning the object or the material; Means for adjusting diaphragms or lenses associated with the support
- H01J37/22—Optical, image processing or photographic arrangements associated with the tube
- H01J37/226—Optical arrangements for illuminating the object; optical arrangements for collecting light from the object
- H01J37/26—Electron or ion microscopes; Electron or ion diffraction tubes
- H01J37/28—Electron or ion microscopes; Electron or ion diffraction tubes with scanning beams
- H01J37/265—Controlling the tube; circuit arrangements adapted to a particular application not otherwise provided, e.g. bright-field-dark-field illumination
- H01J2237/00—Discharge tubes exposing object to beam, e.g. for analysis treatment, etching, imaging
- H01J2237/184—Vacuum locks
- H01J2237/20285—Motorised movement computer-controlled
- H01J2237/20292—Means for position and/or orientation registration
- H01J2237/216—Automatic focusing methods
- H01J2237/221—Image processing
- H01J2237/2482—Optical means
- H01J2237/26—Electron or ion microscopes
Description
PARTICLE BEAM MICROSCOPE AND METHOD FOR OPERATING THE PARTICLE BEAM MICROSCOPE
Cross-References to Related Applications
The present application claims priority of German Patent Application No. 10 2010 046 902.5, filed September 29, 2010 in Germany, entitled "Partikelstrahlmikroskop und Verfahren zum Betreiben hierzu", and of German Patent Application No. 10 2011 103 997.3, filed June 10, 2011 in Germany, entitled "Partikelstrahlmikroskop und Verfahren zum Betreiben hierzu", and of U.S. Patent Application No. 13/029,998, entitled "Method of Operating a Scanning Electron Microscope"; the contents of these documents are hereby incorporated by reference in their entirety.
Technical Field
The present invention relates to a particle beam microscope and a method for operating a particle beam microscope. More specifically, the present invention relates to an electron microscope, such as a scanning electron microscope, and a method for operating a scanning electron microscope.
Background
When samples are imaged or processed with a particle beam microscope, such as an electron microscope, they are usually kept in a vacuum environment in the specimen chamber. The specimen chamber is evacuated by a vacuum pump. Typically, measurements with scanning electron microscopes are conducted at a vacuum level in the specimen chamber in a range between high vacuum and about 22.5 Torr. The specimen chamber is therefore designed as a vacuum vessel, having solid walls and flanges, such that the leak rates of atmospheric leaks can be kept as low as possible. Hence, the vacuum vessel usually does not have windows which are large enough to allow a user to control the positioning of the object in front of the objective lens by visual observation.
Typically, the positioning of the sample is monitored by a CCD camera, which is arranged within the specimen chamber. The camera acquires a video image of the sample and the objective lens, which is displayed on a display. By looking at the video image, the user can observe the positioning process in real time and control the positioning of the sample via control signals, which are transmitted to a positioning device.
However, the displayed video image provides the user only with a two-dimensional image of the interior of the specimen chamber, such that it is complicated to accurately position the object relative to the objective lens. Furthermore, the viewing angle of the CCD camera for observing the object surface is typically obstructed by the objective lens and detectors, especially when the object is located close to the objective lens. Hence, the user quite often is not able to determine which part of the sample is irradiated by the electron beam.
Besides the objective lens, there are typically also further components arranged in the interior of the specimen chamber, which may obstruct the view of the sample during a positioning process. Examples of such components are detectors, gas injection systems and manipulators. These components may also collide with the sample during a positioning process.
Conducting the positioning is even more complicated when a number of objects, in particular objects having a complex geometry, are attached to the object holder for being positioned in front of the objective lens.
If the positioning process is conducted inaccurately, collisions may occur, which may result in damage to either the object or to components of the electron microscope.
It has been recognized that the positioning of a sample inside a particle beam microscope is complicated to conduct. Hence, handling the particle beam microscope so as to carry out a positioning process within a reasonable amount of time requires a lot of experience.
Summary
Embodiments provide a method for operating a particle beam microscope, which comprises an objective lens having an object region, wherein the method comprises: detecting light rays and/or particles, which emanate from a structure, wherein the structure comprises at least a portion of a surface of an object and/or at least a portion of a surface of an object holder of the particle beam microscope; generating a surface model of the structure depending on the detected light rays and/or particles; determining a position and an orientation of the surface model of the structure relative to the object region; determining a measurement location relative to the surface model of the structure; and positioning the object depending on the generated surface model of the structure, depending on the determined position and orientation of the surface model of the structure, and depending on the determined measurement location.
Accordingly, a method for operating a particle beam microscope is provided which allows a sample to be positioned relative to a component of a particle beam microscope, in particular an objective lens, with high accuracy. In particular, it is possible to position a location on the object surface, at which a measurement is to be taken, in an object region of the objective lens with high accuracy and within a short time. Thereby, it is possible, even for an inexperienced user, to conduct a measurement within a short time.
By way of example, the particle beam microscope may be a scanning electron microscope. Further examples of particle beam microscopes are focused ion beam systems, in particular helium ion microscopes.
The generating of the surface model of the structure is performed depending on the detected light rays and/or particles. The surface model may be generated depending exclusively on the detected light rays. In other words, the surface model is generated exclusively from the information which is obtained by the detected light rays.
However, it is also conceivable that additional information is used for generating the surface model. For example, the generating of the surface model may be performed depending on values which are obtained by measurements which are carried out in addition to a detecting of the light rays and/or particles. Thereby, it is possible to increase the speed of generating the surface model. A surface model may, for example, be determined depending on a measurement conducted by a coordinate measuring device. Furthermore, a surface model of at least a portion of the structure, in particular of at least a portion of the surface of the object holder, may be generated based on a CAD drawing.
The detecting of the light rays may be carried out with a light-sensitive sensor, in particular a semiconductor sensor. The generating of the surface model may be carried out by a computer. The positioning of the object may comprise an automatic positioning, which is controlled by the computer.
Furthermore, the detecting of the light rays may be performed by a light-sensitive image capturing device. The image capturing device may comprise an image sensor, such as a CCD image sensor. A light-sensitive image capturing device may for example comprise a camera, in particular a CCD camera. The light-sensitive image capturing device may be configured and arranged such that a digital image is acquirable, wherein the digital image represents or shows at least a portion of the structure. Furthermore, it is conceivable that the detected light rays are laser beams which are scattered or reflected at the structure. The laser beams may be generated by a laser scanner which scans the structure. Based on the detected laser beams, at least one of the following may be performed: a time-of-flight measurement by timing the round-trip time of a pulse of light, phase comparison and/or triangulation. The image sensor of the light-sensitive image capturing device may for example comprise a CCD image sensor and/or a photodiode.
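The time-of-flight measurement mentioned above can be sketched in a few lines (a minimal illustration, not the patent's implementation; the function name is a hypothetical choice): the distance to a surface point follows from half the round-trip time of the light pulse multiplied by the speed of light.

```python
# Illustrative time-of-flight distance calculation: d = c * t / 2,
# where t is the measured round-trip time of the laser pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the scanned surface point from the pulse round-trip time."""
    return C * round_trip_time_s / 2.0

# A pulse returning after 2 ns corresponds to roughly 0.3 m:
d = tof_distance(2e-9)
```

In a laser scanner, this calculation is repeated for every scan direction, yielding the point cloud from which the surface model is built.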
The light rays may have wavelengths within a range from 400 nanometers to 700 nanometers. The light rays may be emitted from a light source and may be scattered or reflected at the structure. By way of example, in the specimen chamber of the particle beam microscope, a light source may be arranged which illuminates the interior of the specimen chamber. The light rays may be light rays of a laser beam which is emitted by a laser scanner, wherein the laser scanner is configured such that it scans the surface of the structure with the laser beam. Alternatively or additionally, it is also conceivable that the light rays are emitted from light sources which are arranged at the structure. Such light sources may for example be light-emitting diodes (LEDs).
The detecting of the light rays may be performed when the object and/or object holder is in the specimen chamber. Alternatively or additionally, the detecting of the light rays may be performed when the object and/or object holder is outside of the specimen chamber. For example, the detecting of the light rays may be performed in a load-lock chamber of the particle beam microscope. The load-lock chamber may be configured such that objects are first loaded into the load-lock chamber. After an evacuation of the load-lock chamber, the objects are transferred into the specimen chamber. Thereby, the specimen chamber does not have to be vented for inserting new specimens. Thereby, the time in which the load-lock chamber is evacuated may be used to detect the light rays and to generate the surface model. It is also conceivable that the detecting of the light rays is performed outside of the vacuum system, which comprises the load-lock chamber and the specimen chamber. For example, the detecting of the light rays may be performed under atmospheric pressure.
The detected particles may be charged particles. The particles may be electrons. The electrons may be secondary electrons and/or backscattered electrons. Furthermore, the particles may be ions, such as helium ions or secondary ions.
The particles emanate from the structure. The particles may be emitted from a portion of the object which is irradiated by the primary beam of the particle beam microscope. In other words, the particles may be emitted from an impingement location or an impingement region of the primary beam. The primary beam may be a scannable primary beam.
The detecting of the particles may be performed by one or more particle detectors. The particle detectors are configured such that particles are detected which are emitted from an impingement location of the particle beam.
The object region may be defined as a spatial region relative to the particle beam microscope, wherein the particle beam microscope is configured such that an image is acquirable from a portion of an object which is arranged in this spatial region. In other words, the object region may represent a spatial region which is scannable by the primary beam of the particle beam microscope.
By way of example, the object is a wafer or a work piece. The scanning electron microscope may be used to acquire an image of a surface of the wafer or the work piece.
The structure may be a surface. The surface may be three-dimensional. The structure may be a surface which comprises at least a portion of the surface of the object and/or at least a portion of the surface of the object holder. The structure may consist of a surface which is movable relative to the object region by the positioning device. It is further conceivable that the structure comprises at least a portion of a surface of a further component of the particle beam microscope. It is further conceivable that the structure does not comprise the total surface or the total exposed surface of the object. The structure does not have to comprise a surface of the object. For example, in the case of objects being relatively small compared to a size of the object holder, it might be sufficient that the structure comprises a portion of the surface of the object holder without any portion of the surface of the object. The object holder may be defined as a component of the particle beam microscope which is configured to retain an object on which measurements are to be taken. By way of example, the object holder may comprise a surface at which the object is attached. The object may be attached to the object holder by adhesive and/or by screws of the object holder. The object may be attached to the object holder and the object holder may be attached to the positioning device. The object holder may be configured to provide a mechanical connection between the object and the positioning device. In other words, the object and the object holder may be positioned simultaneously within the particle beam microscope by the positioning device.
The surface model may be a model which represents the form or shape of the structure. In other words, the surface model of the structure may be a mathematical representation of the structure. For example, a maximum distance of the surface model from the structure may be less than 10 millimeters or less than 1 millimeter or less than 0.1 millimeter or less than 10 micrometers or less than 1 micrometer or less than 100 nanometers or less than 10 nanometers. The distances may be measured along a surface normal of the surface model, wherein the surface model is positioned relative to the structure such that the sum or integral of the squared distances yields a minimum.
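The least-squares placement criterion just described can be sketched as follows (illustrative only; restricting the model to a flat horizontal plane and searching only over its z-offset are simplifications chosen for brevity, not part of the patent): the model position that minimizes the sum of squared normal distances to the measured points is the least-squares optimum.

```python
# Illustrative least-squares placement of a flat horizontal surface model:
# for a plane z = model_z, the distance along the surface normal to a
# measured point (x, y, z) is simply |z - model_z|.
def sum_squared_distances(model_z, measured_points):
    return sum((z - model_z) ** 2 for (_, _, z) in measured_points)

# Measured structure points (toy data, coordinates in mm):
points = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.2), (0.0, 1.0, 0.3)]

# Brute-force search over candidate z-offsets between 0 and 1 mm:
best = min((sum_squared_distances(z / 100.0, points), z / 100.0)
           for z in range(0, 101))
best_offset = best[1]  # close to the mean z of the points, the least-squares optimum
```

For a least-squares fit of a plane to points, the optimal offset is the mean of the measured z-values, which the search recovers.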
Hence, the surface model may represent the structure to a predetermined accuracy. The accuracy of the surface model may be chosen such that a positioning of the structure relative to the objective lens may be carried out with a predetermined positioning accuracy. For example, the positioning accuracy may be lower than 100 nanometers, lower than 1 micrometer, lower than 10 micrometers, lower than 0.1 millimeter, lower than 0.5 millimeter, lower than 1 millimeter or lower than 5 millimeters.
The surface model may represent a flat two-dimensional structure. For example, the surface model of a wafer may be a circular disc, wherein the edge of the circular disc represents the outer edge of the wafer. The surface model may be a three-dimensional surface model. A three-dimensional surface model may be defined such that it comprises an uneven surface. By way of example, a three-dimensional surface model may represent a lateral surface and a top surface of a cylinder or a cuboid (i.e. without its base).
By way of example, the surface model may comprise or consist of a plurality of points. In other words, the surface model may comprise or may consist of a point cloud. The number of points may, for example, be more than 10, more than 100, more than 1,000 or more than 10,000. Furthermore, the number of points may, for example, be less than 10^10 points or less than 10^9 points. Each of the points may be defined by three coordinate values, which represent a position of the point in space relative to a coordinate system.
At least a portion of the points may be connected by geometric objects like line segments, polygons, plane segments, arcuate surface segments and/or arcuate line segments. The plane segments may comprise triangular and/or trapezoidal plane segments. For each point, the distance between the point and its closest neighboring point may be less than 5 millimeters or less than 1 millimeter or less than 0.1 millimeter or less than 10 micrometers or less than 1 micrometer or less than 100 nanometers or less than 10 nanometers.
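The nearest-neighbor spacing bound above can be checked directly on a point cloud (a brute-force sketch, not the patent's algorithm; the function name and the 0.1 mm threshold are illustrative assumptions):

```python
import math

# Illustrative check that every point of a point-cloud surface model has a
# closest neighbour within a given spacing (brute force, O(n^2)).
def max_nearest_neighbor_spacing(points):
    worst = 0.0
    for i, p in enumerate(points):
        # distance from p to its closest other point in the cloud
        nearest = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        worst = max(worst, nearest)
    return worst

# Toy cloud: four corners of a 0.05 mm square in the z = 0 plane
cloud = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (0.05, 0.05, 0.0), (0.0, 0.05, 0.0)]
spacing_ok = max_nearest_neighbor_spacing(cloud) <= 0.1  # 0.1 mm bound satisfied
```

A production implementation would use a spatial index (e.g. a k-d tree) instead of the quadratic loop, since surface models may contain millions of points.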
Additionally or alternatively, the surface model may at least partly be based on splines. In other words, the surface model may be based on a set of polynomial surface functions, wherein a polynomial surface function describes at least a portion of the surface model. A plurality of polynomial surface functions of a degree less than or equal to four may be sufficient for achieving a predetermined accuracy of the surface model.
The surface model may further comprise marks, wherein the marks correspond to marks on the structure. For example, the structure may comprise marks which are detectable by the detecting of the light rays and/or particles. Such marks may, for example, be color-coded marks or portions on the structure which have a reflectivity which is different from a reflectivity of portions of the structure which surround the marks.
The objective lens may be an electron beam objective lens or an objective lens for focused ion beams. Furthermore, also other components of the particle beam microscope may comprise object regions, such as a particle detector or a component for object preparation. Examples of particle detectors are secondary electron detectors (also denoted as SE detectors), energy dispersive detectors for X-rays (also denoted as EDX detectors) and electron backscatter diffraction detectors (also denoted as EBSD detectors). Examples of components for object preparation are gas injection systems, focused ion beam systems (FIB) and micromanipulators.
Furthermore, the position and orientation of the surface model relative to the object region is determined. The determining of the position and orientation of the surface model may comprise interpolating of points of the surface model.
A rigid body has six degrees of freedom of movement. The six degrees of freedom of movement are, for example, expressed by three coordinate values of translation and three rotation angle values. Under translation, all points of the rigid body move by the same translation vector. The three coordinate values of translation together define the position of the rigid body. Under rotation, all points of the rigid body are rotated by an angle about a rotation axis. The three rotation angles define the orientation of the rigid body. The orientation of the surface model may be expressed by yaw, pitch and roll or by Eulerian angles.
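The six-degrees-of-freedom pose described above can be sketched as a translation vector plus a yaw-pitch-roll rotation (an illustrative sketch; the Rz·Ry·Rx angle convention and the function names are assumptions, not taken from the patent):

```python
import math

def rotation_matrix(yaw, pitch, roll):
    """3x3 rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply_pose(point, translation, yaw, pitch, roll):
    """Rotate a point by the three angles, then shift it by the translation."""
    R = rotation_matrix(yaw, pitch, roll)
    rotated = [sum(R[i][k] * point[k] for k in range(3)) for i in range(3)]
    return [rotated[i] + translation[i] for i in range(3)]

# A 90-degree yaw takes the x-axis onto the y-axis:
p = apply_pose([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], math.pi / 2, 0.0, 0.0)
```

Applying such a pose to every point of the surface model places the model at the determined position and orientation relative to the object region.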
The determining of the position and orientation of the surface model relative to the object region may be performed such that the position and orientation of the surface model is aligned to a position and orientation of the structure relative to the object region.
The determining of the position and orientation of the surface model of the structure may be performed depending on the surface model of the structure. For example, an extent of the structure and/or distances between marks of the structure may be known from the determined surface model of the structure. Furthermore, the determining of the position and orientation of the surface model relative to the object region may be performed depending on the detected light rays. In particular, the position and orientation may be determined depending on a digital image of a light-sensitive image capturing device, wherein the digital image depicts at least a portion of the structure. Additionally or alternatively, the determining of the position and orientation may be performed depending on the signals which are transmitted between the computer and the positioning device. For example, the positioning device may comprise a measuring unit which is configured to measure the position and/or orientation of the structure. Additionally or alternatively, the position and/or orientation of the structure may be determined depending on control signals which are transmitted from a controller to the positioning device. The controller may, for example, be a computer. Additionally or alternatively, the determining of the position and the orientation of the surface model of the structure may be performed depending on detected particles which emanate from the structure. Particle detectors may detect the particles at different focus distances of the primary beam. Additionally or alternatively, the determining of the position and orientation of the surface model of the structure may be performed depending on particle microscopic images which depict at least a portion of the structure.
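One of the options above, locating the structure from marks seen in a digital image, can be sketched as follows (a purely hypothetical illustration: the function, its inputs and the pinhole-free top-down imaging geometry are all assumptions for the sketch, not the patent's method). Two marks whose separation is known from the surface model are found in the image; their pixel separation gives the image scale, and their midpoint gives the in-plane position of the structure.

```python
import math

# Hypothetical mark-based localisation: two marks with a known separation
# in the surface model are located at pixel coordinates img_a and img_b.
def locate_from_marks(img_a, img_b, model_distance_mm):
    pixel_distance = math.dist(img_a, img_b)
    mm_per_pixel = model_distance_mm / pixel_distance        # image scale
    # midpoint of the two marks, converted from pixels to millimetres
    center = tuple((a + b) / 2 * mm_per_pixel for a, b in zip(img_a, img_b))
    return mm_per_pixel, center

# Marks 200 pixels apart in the image, 10 mm apart in the surface model:
scale, center = locate_from_marks((100.0, 100.0), (300.0, 100.0), 10.0)
```

Recovering the full orientation would additionally require at least a third, non-collinear mark.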
The positioning of the structure may be performed by a positioning device of the particle beam microscope. The positioning device may comprise one or more actuators. The object holder may be arranged at the positioning device. Thereby, the positioning device may be configured such that, by controlling one or more actuators, the object is positionable in the particle beam microscope relative to the objective lens, relative to a detector and/or relative to a component for object preparation. The positioning may, in particular, comprise a positioning of the measurement location in the object region of the objective lens. Furthermore, the positioning may also comprise an adjusting of a measurement orientation. The measurement orientation may be defined as an orientation of the object at which a measurement is carried out. A measurement orientation, for example, may be defined by three angles of rotation.
The measurement location may represent a portion on the surface of the object at which a measurement is to be taken or at which a particle microscopic image is to be acquired. The measurement location may be located outside of the surface model of the structure. The determining of the measurement location relative to the surface model may be performed depending on a user input via the computer. For example, the user may select a portion of the surface model in which he wants to perform a measurement or acquire an image, based on a two-dimensional representation of the surface model on a display of the computer. Depending on the user input, the computer may determine or calculate a measurement location relative to the surface model.
The positioning is performed depending on the determined surface model. The positioning may comprise interpolating points of the surface model of the structure. Depending on the surface model and the measurement location relative to the surface model, a positioning direction may be determined for arranging the measurement location in the object region. Furthermore, based on the surface model, the user or the computer may determine in which measurement orientation the measurement is to be taken or the image is to be acquired. The positioning of the object may be controlled by the computer. However, it is also conceivable that the user manually controls the positioning of the object, wherein, for example, the surface model of the structure, the position and orientation of the surface model of the structure and the measurement location are displayed on a display of the computer. Depending on the user input, the computer positions the object.
According to a further embodiment, the positioning of the object further comprises a determining of a positioning path. The positioning path may be determined by a computer depending on the surface model, on the determined position and orientation of the surface model relative to the object region, the measurement location and/or the measurement orientation. The positioning path may be determined such that the measurement location is located in the object region. Furthermore, the positioning path may be determined such that the positioning is carried out without collision.
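A collision-checked positioning path of the kind described above can be sketched as a sampled straight-line path whose intermediate positions are each tested against a clearance function (a hypothetical sketch; the linear path shape, the clearance function and all names are assumptions, since the patent does not prescribe a particular planning algorithm):

```python
def plan_linear_path(start, target, steps, clearance_fn, margin):
    """Sample the straight line from start to target into steps+1 positions.

    Returns the list of positions, or None if any sampled position would
    bring the structure closer to a component than the safety margin.
    """
    path = []
    for i in range(steps + 1):
        t = i / steps
        pos = tuple(s + t * (g - s) for s, g in zip(start, target))
        if clearance_fn(pos) < margin:
            return None  # path would collide; a different route is needed
        path.append(pos)
    return path

# Toy clearance: distance of the current position from an obstacle plane z = 0
clearance = lambda pos: abs(pos[2])

# Move from (0, 0, 5) down to (2, 0, 1) in 4 steps with a 0.5 mm margin:
path = plan_linear_path((0.0, 0.0, 5.0), (2.0, 0.0, 1.0), 4, clearance, 0.5)
```

If the straight line is rejected, a planner would fall back to a detour, e.g. retracting the stage before the lateral move.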
According to an embodiment, the positioning of the object comprises arranging the measurement location in the object region.
According to a further embodiment, the method further comprises adjusting of a focus of the objective lens after having arranged the measurement location in the object region.
By arranging the measurement location in the object region according to the method, the position and orientation of the structure relative to the objective lens is known to a comparatively higher accuracy. The focus of the scanning electron microscope is typically adjusted with an accuracy which is in a range between a few nanometers (nm) and a few micrometers (µm), depending on the set magnification of the scanning electron microscope. The adjusting of the focus may be performed automatically by setting operation parameters of the particle beam optical system depending on acquired particle microscopic images. As a result of the determining of the position and orientation of the structure with high accuracy, an automatic adjustment of the focus is facilitated. Thereby, in particular, an adjusting of the focus may be performed within a short time.
According to an embodiment, the method further comprises: generating a surface model of a microscope portion of the particle beam microscope; combining the surface model of the structure and the surface model of the microscope portion to generate a combined surface model; and calculating a distance between the surface model of the structure and the surface model of the microscope portion depending on the combined surface model; wherein the positioning of the object comprises monitoring the distance.
Accordingly, it is possible to quickly move the object within the particle beam microscope without risking collisions, which may damage the object or the particle beam microscope. In particular, a secure positioning is enabled for objects having a complex geometry or for a plurality of objects which are mounted together on the object holder.
The microscope portion may be at least a portion of a surface of a component of the particle beam microscope. Examples for such a component are: the specimen chamber, a detector, a manipulator, a gas supply and/or an objective lens.
The combined surface model may be defined as a surface model in which the surface model of the structure and the surface model of the microscope portion are arranged relative to each other corresponding to the relative arrangement of the structure and the microscope portion in the specimen chamber. The combining of the surface models may be performed by the computer. The surface model of the microscope portion may comprise points and/or geometric objects, such as has been described with respect to the surface model of the structure.
The combining to generate the combined surface model may comprise: determining a position and an orientation of the surface model of the structure relative to the surface model of the microscope portion.
The determining of the position and orientation of the surface model of the structure relative to the surface model of the microscope portion may comprise acquiring a digital image, which represents or shows at least a portion of the structure, wherein the digital image is acquired from a viewpoint position relative to the microscope portion. The digital image may be generated by a light-sensitive image capturing device, and/or the digital image may be a particle microscopic image. Additionally, the digital image may show at least a portion of the microscope portion.
The acquired digital image may then be compared with the surface model of the structure. Depending on the comparing, a position and orientation of the surface model of the structure relative to the surface model of the microscope portion may be determined. The comparing may comprise segmenting of the digital image. The segmenting may comprise one or a combination of the following methods: a pixel-oriented method, an edge-oriented method, a region-oriented method, a model-based method, a texture-based method and/or a color-oriented method. In particular, the comparing may comprise a model-based segmentation method depending on the surface model of the structure.
Additionally or alternatively, the method may comprise extracting features from the digital image, wherein the extracted features correspond to features of the surface model of the structure. Examples of such features are: edges, surface topography, and/or detectable marks. The comparing may comprise applying a routine for edge detection, for frequency filtering and/or for pattern recognition. Furthermore, the comparing may comprise interpolating points of the surface model.
Additionally or alternatively, the combining to a combined surface model may be performed depending on signals which are transmitted between the computer and the positioning device. For example, the positioning device may comprise a measuring unit, which is configured to determine the position and orientation of the structure relative to the microscope portion. Furthermore, the position and/or orientation of the surface model of the structure relative to the surface model of the microscope portion may be determined depending on control signals which are transmitted from a controller to the positioning device. The controller may, for example, be the computer. Alternatively or additionally, the combining to the combined surface model may be performed depending on detected particles which emanate from the structure. Particle detectors may detect particles at different focus distances of the primary beam. Alternatively or additionally, the determining of the position and orientation of the surface model of the structure relative to the surface model of the microscope portion may be performed depending on particle microscopic images which represent or show at least a portion of the structure.
Depending on such a combined surface model, a distance between the structure and the microscope portion is determinable. The detecting of an imminent collision between the microscope portion and the structure may be performed depending on the determined distance.
According to an embodiment, the method comprises determining of a positioning path depending on the combined surface model. The positioning path may be calculated by the computer.
The distance may represent a minimum distance between the structure and the microscope portion. The minimum distance between two bodies may be determined by determining a smallest distance between any two points of the two bodies, wherein the line between the two points connects the two bodies.
For example, the determining of the distance may comprise comparing distances between pairs of points, wherein each pair comprises a point of the surface model of the microscope portion and a point of the surface model of the structure. Depending on the comparing, a pair of points may be determined which has a smallest distance of all pairs of points. The distance may be calculated by the computer. Furthermore, the determining of the distance may comprise interpolating points of the surface model of the structure and/or interpolating points of the surface model of the microscope portion.
The determining of a distance may comprise determining or calculating distances between pairs of points, wherein each pair of points comprises a point of the structure and a point of the microscope portion; and determining a pair of points which has the smallest distance among all pairs of points.
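The pairwise comparison described above can be sketched as a brute-force minimum-distance search over two point sets. This is a minimal illustration only (function name and array layout are assumptions, not taken from the patent); a practical collision-detection implementation would use spatial data structures such as bounding-volume hierarchies to avoid the O(n·m) cost.

```python
import numpy as np

def minimum_distance(structure_pts, microscope_pts):
    """Return the smallest distance between any point of the structure
    surface model and any point of the microscope-portion surface model,
    together with the closest pair of points (brute force, O(n*m))."""
    s = np.asarray(structure_pts, dtype=float)   # shape (n, 3)
    m = np.asarray(microscope_pts, dtype=float)  # shape (k, 3)
    # All pairwise Euclidean distances between the two point sets.
    d = np.linalg.norm(s[:, None, :] - m[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    return d[i, j], s[i], m[j]
```

The returned closest pair corresponds to the "pair of points which has the smallest distance among all pairs of points" mentioned in the text.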
Algorithms for determining collisions on the basis of surface models are disclosed in the Ph.D. thesis "Virtual Reality in Assembly Simulation - Collision Detection, Simulation Algorithms and Interaction Techniques" of Gabriel Zachmann (Technische Universitaet Darmstadt), published by Fraunhofer IRB Verlag; the contents of which are incorporated herein in their entirety. Furthermore, algorithms for collision detection are disclosed in the article "Schnelle Kollisionserkennung durch parallele Abstandsberechnung" of Dominik Henrich et al., published in 13. Fachgespraech Autonome Mobile Systeme (AMS '97), Stuttgart, October 6th and 7th, 1997, published by Springer Verlag, series "Informatik Aktuell"; the contents of which are incorporated herein in their entirety.
The monitoring of the distance may comprise issuing a notification or a warning signal by the particle beam microscope system when the distance has fallen below a predetermined or predeterminable permissible distance. Alternatively or additionally, it is conceivable that the positioning of the object holder by the positioning device is automatically stopped when the distance is smaller than the permissible distance.
The permissible distance may be predetermined. The permissible distance may be determined such that a collision between a structure and the microscope portion is prevented. Furthermore, the permissible distance may be determined taking into account an accuracy with which the structure and the microscope portion are approximated by the combined surface model.
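The monitoring logic above can be sketched as a simple decision function. The specific values, the widening of the permissible distance by a model-accuracy tolerance, and the "warn" band at twice the enlarged threshold are illustrative assumptions, not specifics from the text:

```python
def monitor_distance(distance, permissible, model_tolerance=0.5):
    """Decide on a positioning action from the current minimum distance.
    `permissible` is enlarged by `model_tolerance` to account for the
    accuracy with which the combined surface model approximates the real
    geometry (all values in the same length unit, e.g. mm)."""
    threshold = permissible + model_tolerance
    if distance < threshold:
        return "stop"   # automatically stop the positioning device
    elif distance < 2 * threshold:
        return "warn"   # issue a notification or warning signal
    return "ok"
```

In an actual system the "stop" branch would halt the positioning device, while "warn" would raise the notification mentioned in the text.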
According to a further embodiment, the positioning of the object comprises determining of a positioning path depending on the combined surface model. The determining of the positioning path may comprise a determining of distances between the surface model of the structure and the surface model of the microscope portion along the positioning path. The positioning of the object may be performed depending on the determined positioning path.
By automatically determining the positioning path by the computer, a fast and automatic positioning may be performed without collision. However, it is also conceivable that a user may perform a manual positioning, wherein positioning movements which may lead to a collision are prevented by notifications, warning signals, and/or a stopping of the positioning process.
According to a further embodiment, the determining of the position and orientation of the surface model of the structure relative to the object region comprises: generating a digital image of at least a portion of the structure; and comparing the surface model of the structure with the digital image.
The digital image may be acquired with a light-sensitive image capturing device. Alternatively or additionally, the digital image may be acquired by scanning a portion of the structure with a primary beam of the particle beam microscope. The digital image may be a particle microscopic image.
The comparing may comprise identifying features of the digital image, wherein the features of the digital image correspond to features of the surface model of the structure or features of the combined surface model. In other words, the comparing may comprise identifying features of the surface model which are represented or shown in the digital image. Such features may, for example, comprise edges, marks and/or surface topography of the structure and/or microscope portions. The comparing may comprise applying a routine for edge detection, for frequency filtering and/or for pattern recognition. Furthermore, the comparing may comprise interpolating points of the surface model. The comparing may comprise segmenting the digital image. The segmenting may comprise one or a combination of the following methods: a pixel-oriented method, an edge-oriented method, a region-oriented method, a model-based method, a texture-based method. In particular, the comparing may comprise a model-based method for segmentation depending on the surface model of the structure.
The digital image may be compared with a two-dimensional representation of the surface model of the structure. The two-dimensional representation may be generated by projecting the surface model at a given position and orientation onto a plane. The two-dimensional representation may be compared with the digital image to decide whether the given position and orientation corresponds to the position and orientation of the structure.
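The projection onto a plane can be sketched, for example, as an orthographic projection of the model's points at a candidate orientation. The parametrization by yaw and pitch angles and an in-plane translation is an assumption for illustration; a real system might use a full perspective camera model instead:

```python
import numpy as np

def project_model(points, yaw=0.0, pitch=0.0, translation=(0.0, 0.0)):
    """Orthographically project surface-model points (n, 3) onto the
    image plane for a candidate pose: rotate by yaw (about z) and pitch
    (about x), drop the z coordinate, then shift by an in-plane
    translation. Returns (n, 2) image-plane coordinates."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cx, sx = np.cos(pitch), np.sin(pitch)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    p = np.asarray(points, dtype=float) @ (Rx @ Rz).T
    return p[:, :2] + np.asarray(translation, dtype=float)
```

The resulting 2-D point set could then be scored against features extracted from the digital image to decide whether the candidate pose matches.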
According to a further embodiment, the determining of the position and orientation of the surface model of the structure relative to the object region is performed depending on the digital image, depending on the viewpoint position of the image capturing device and depending on the surface model of the structure.
According to a further embodiment, the method further comprises: determining a second measurement location relative to the surface model of the structure and relative to the measurement location; and repositioning the object depending on the measurement location and the second measurement location.
The repositioning may further be performed depending on the surface model of the structure. The measurement location relative to the surface model of the structure may be stored, in particular in a storage device of the computer. The storing of the measurement location relative to the surface model may comprise a storing of coordinates of a point relative to the surface model. Alternatively or additionally, a measurement orientation relative to the surface model may be stored. The measurement orientation may be defined such that it represents the orientation of the structure when a measurement is taken.
The second measurement location may be the same measurement location as the stored measurement location. Thereby, it is possible to find again a location at which a measurement has been taken.
Thereby, it is possible to readjust a measurement orientation and/or to find a measurement location again after the object has been moved by operating the positioning device. The object may have been moved, for example, to perform a preparation outside of the particle beam microscope. This makes it possible to obtain measurements of exactly the same location and/or exactly the same orientation. Furthermore, it is possible to assign stored images, which have been acquired with the particle beam microscope, to stored measurement locations and/or measurement orientations.
According to a further embodiment, the method further comprises: generating a particle microscopic image, which represents at least a portion of the measurement location; identifying a region of the particle microscopic image; and adjusting a position and/or an orientation of the object depending on the identified region.
The adjusting in dependence on the identified region of the particle microscopic image may be performed at an accuracy which is higher than the accuracy for the positioning in dependence on the surface model of the structure. In other words, the positioning in dependence on the surface model of the structure may provide a coarse positioning, which is followed by a fine positioning, which is performed in dependence on the identified region of the particle microscopic image. In particular, it is possible to reproducibly find a measurement location again with an accuracy which corresponds to the resolution of the particle microscopic image.
The identifying of the region of the particle microscopic image may comprise comparing the particle microscopic image with stored particle microscopic images. The stored particle microscopic images may have been acquired during a preceding positioning process. Thereby, it is possible to identify a portion of the object where a particle microscopic image has already been acquired. Furthermore, the identifying of the region of the particle microscopic image may comprise a segmenting of the particle microscopic image, an edge detection and/or a frequency filtering of the particle microscopic image. Thereby, features may be determined in the particle microscopic image which are to be examined by the particle beam microscope. Based on the identified region of the particle microscopic image, it is possible to determine a positioning path for acquiring an image of the identified region at a higher magnification. The computer may be configured to perform the positioning depending on the identified region.
According to an embodiment, the detecting of the light rays and/or particles comprises detecting the light rays and/or particles at a plurality of different focus distances.
The focus distances may be focus distances of a light-sensitive image capturing device and/or focus distances of the primary beam.
The focus distance of the primary beam may be a distance of a beam waist of the primary beam of the particle microscope from a reference point of the particle optical system of the particle beam microscope. The reference point may, for example, be a principal plane of the objective lens or a component of the particle optical system of the particle beam microscope. The focus distance of the light-sensitive image capturing device may be a focus distance of a light optical system of the light-sensitive image capturing device, such as a lens assembly.
According to an embodiment, the generating of the surface model of the structure further comprises: generating a plurality of stacks of image regions depending on the detected light rays and/or the detected particles at the plurality of focus distances, wherein image regions which are part of a same stack of the plurality of stacks represent a same portion of the structure; and determining for each stack of the plurality of stacks an in-focus region depending on the image regions of the respective stack.
Each of the image regions may be a group of pixels of the digital image. The digital image may be acquired at a focus distance of the light-sensitive image capturing device and/or of the primary beam. Each of the image regions may be generated by selecting pixels from the digital image. All pixels of an image region may be generated at the same focus distance.
Image regions which form part of a same stack show a same portion of the structure. Image regions which form part of a different stack may show different portions of the structure. The different portions of the structure may be adjacent. The adjacent portions may be non-overlapping. Alternatively, the different portions may partly overlap each other. Furthermore, the different portions may be spaced apart from each other.
The in-focus region is determined by determining the image region, from all image regions of a stack, which has the highest resolution. The determining of the in-focus region may comprise comparing all image regions of a stack. The determining of the in-focus region may comprise determining frequencies, in particular spatial frequencies, of image data values for each image region of a stack. The frequencies may be frequencies of a row and/or a column of the image region. For example, the determining of a frequency may comprise determining a Fourier transform, in particular a discrete Fourier transform, of at least a portion of the image data of an image region. For example, the image region which has a highest frequency in its power spectrum is the in-focus region. Furthermore, the in-focus region may be the image region having the greatest power values in the power spectrum at a predetermined frequency or within a predetermined frequency range. Additionally or alternatively, the determining of the in-focus region may comprise determining of differences and/or gradients of image data values of the image regions of the stack. For example, the image region having the highest absolute values of differences of neighboring image data values is determined as the in-focus region. Additionally or alternatively, determining of the in-focus region may comprise applying an edge detection filter to each image region of a stack.
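The gradient-based variant can be sketched as follows, assuming each image region is a small 2-D array of gray values. The sharpness score (mean squared difference of neighboring pixel values) and the function names are illustrative choices, not taken from the patent:

```python
import numpy as np

def in_focus_index(stack):
    """Given a stack of image regions (list of 2-D arrays, one per focus
    distance, all showing the same portion of the structure), return the
    index of the sharpest one. Sharpness is scored as the mean squared
    difference of neighboring pixel values, a simple gradient-based
    proxy for high spatial-frequency content."""
    def sharpness(region):
        r = np.asarray(region, dtype=float)
        dx = np.diff(r, axis=1)   # horizontal neighbor differences
        dy = np.diff(r, axis=0)   # vertical neighbor differences
        return (dx ** 2).mean() + (dy ** 2).mean()
    return max(range(len(stack)), key=lambda i: sharpness(stack[i]))
```

The index returned for each stack identifies the in-focus region, from whose focus distance a height value for that portion of the structure can be derived.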
The determining of the in-focus region may be performed depending on pixel data values of the image regions of the respective stack. Alternatively or additionally, the determining of the in-focus region may be performed depending on pixels outside of the image region. For example, the determining of the in-focus region may be performed depending on pixels which are adjacent to or spaced apart from the pixels of the image region of the respective stack. Thereby, it is in particular possible that an image region consists of a single pixel.
According to a further embodiment, each image region of at least a portion of the generated image regions is an isolated pixel cluster.
A pixel cluster may be defined as a group of pixels, wherein each of the pixels is located adjacent (i.e. not spaced apart) to at least one other pixel of the pixel cluster. An isolated pixel cluster may be defined as a pixel cluster wherein each pixel of the isolated pixel cluster is spaced apart from a pixel of another image region of a different stack. In other words, the portion of the structure which is represented or shown by the pixel cluster is neither adjacent nor overlapping, but spaced apart from portions of the structure which are represented by other image regions which form part of a different stack.
Each of the isolated pixel clusters may consist of between 1 and 8 pixels, between 1 and 50 pixels, between 1 and 500 pixels, between 1 and 1,000 pixels, or between 1 and 10,000 pixels. In particular, a pixel cluster may consist of an individual pixel.
A minimum distance between a first and a second pixel cluster may be defined as a smallest distance of all distances between pixels of the first pixel cluster and pixels of the second pixel cluster.
The minimum distance between pixel clusters of different stacks may be more than 10 times, more than 100 times or more than 1,000 times the diameter of the pixel. In other words, a distance between regions of the structure which are represented by isolated pixel clusters of different stacks may be many times more than a sampling distance between pixels of the image region. A sampling distance may be defined as a diameter of a portion of the structure which is represented by a pixel.
The acquiring of the image data of the image region may comprise scanning, with the primary beam, structure regions which connect the isolated pixel clusters. The isolated pixel clusters may then be cut out from the acquired image. Thereby, it is possible that only a small number of pixel data values have to be processed by the computer to generate the surface model of the structure.
Alternatively, the generating of the image regions may comprise skipping a scanning of structure portions which connect the isolated pixel clusters. In other words, the structure portions which connect the isolated pixel clusters are not scanned by the primary beam. This makes it possible to generate a surface model of a comparatively large structure within a short time.
According to a further embodiment, the method further comprises: generating digital image data, which represent at least a portion of the structure, depending on the detected light rays and/or the detected particles; wherein the generating of the surface model of the structure is performed depending on the digital image data.
The digital image data may be pixel data values of a group of pixels, in particular of a digital image. The pixel data values may represent color and/or gray scale values. The digital image data values may represent at least a portion of the structure. The digital image data may be acquired by a light-sensitive image acquisition device and/or by a scanning of the primary beam.
Based on the digital image data, the surface model may be calculated by a computer. Such algorithms are, for example, described in the article "3D Reconstruction from Multiple Images: Part 1 Principles" of Theo Moons, Luc van Gool and Maarten Vergauwen, published in "Foundations and Trends in Computer Graphics and Vision", Volume 4, Issue 4, pages 287 to 404; the contents of which are incorporated herein in their entirety. Furthermore, such algorithms are described in the article "DLP-Based 3D Metrology by Structured Light or Projected Fringe Technology for Life Sciences and Industrial Metrology" of G. Frankowski and R. Hainich, published in "Proceedings of SPIE Photonics West 2009"; the contents of which are incorporated herein in their entirety. Furthermore, such algorithms are described in the article "ProFORMA: Probabilistic Feature-based On-line Rapid Model Acquisition" of Qi Pan et al., published in the Proceedings of the "BMVC 2009" of the British Machine Vision Association, London (obtainable on the webpage http://www.bmva.org/bmvc/2009/index.htm); the contents of which are incorporated herein in their entirety.
Alternatively or additionally, it is conceivable that, for example, based on further measurements at the structure, a coarse model is available which is adapted depending on the digital image data. For example, a surface model of at least a portion of the surface of the object holder may be stored in the storage device. Depending on the digital image data, the stored surface model of the portion of the surface of the object holder is supplemented to yield a surface model of the structure.
Thereby, the surface model of the structure may be obtained from the digital image data within a short time.
The position and orientation of the surface model relative to the object region or relative to the surface model of the microscope portion may be determined depending on the digital image data. By way of example, a viewpoint position relative to the object region from which the digital image data are acquired, an imaging direction and/or a magnification of the digital image data may be known. Thereby, it is possible to determine the position and orientation of the surface model.
The acquiring of the digital image of the structure may be performed by a light-sensitive image capturing device, such as for example a camera.
According to a further embodiment, the generating of the surface model depending on the digital image data comprises a segmenting of the digital image data. The segmenting may further comprise one or a combination of the following segmenting methods: a pixel-oriented method, an edge-oriented method, a region-oriented method, a model-based method and a texture-oriented method. An example of a pixel-oriented method is the threshold method. Examples for edge-oriented methods are: applying the Sobel operator, applying the Laplace operator and/or gradient detection. Examples for region-oriented methods are: region growing, region splitting, pyramid linking and split-and-merge. An example for a model-based method is the Hough transform. Examples for texture-based methods are co-occurrence matrices and the texture energy measure.
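The threshold method named above as a pixel-oriented example can be sketched in a few lines. Using the global mean gray value as the default threshold is an assumption for illustration; a fixed value or an automatically chosen one (e.g. by Otsu's method) would serve equally:

```python
import numpy as np

def threshold_segment(image, threshold=None):
    """Pixel-oriented segmentation by thresholding: pixels brighter than
    the threshold are labeled foreground (1), the rest background (0).
    If no threshold is given, the global mean gray value is used."""
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean()
    return (img > threshold).astype(np.uint8)
```

The resulting binary mask separates, for example, bright structure regions from the darker background before further model generation.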
According to a further embodiment, the generating of the digital image data comprises generating the digital image data from at least two different imaging directions.
The image data which have been acquired from the at least two different imaging directions may represent stereoscopic image data. For example, two or more images are acquired from different imaging directions relative to the structure. Depending on the stereoscopic image data, the surface model of the structure, the position and/or orientation of the surface model of the structure relative to the object region, and/or the position and orientation of the surface model of the structure relative to the surface model of the microscope portion may be determined.
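For a calibrated stereo pair, the depth of a matched feature follows from its disparity between the two images via the standard pinhole relation Z = f·B/d. A minimal sketch (the function name, units and parameter names are illustrative assumptions):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Depth of a feature from its disparity between two images taken
    from imaging directions separated by the baseline, using the pinhole
    stereo relation Z = f * B / d. Units: disparity and focal length in
    pixels, baseline in mm; the result is in mm."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```

Applying this to many matched features yields a point cloud from which the surface model of the structure can be built.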
The acquiring of the digital images from different imaging directions may, for example, comprise varying of the orientation and/or position of the structure relative to a light-sensitive image capturing device and/or relative to the primary beam. For example, the orientation and/or the position of the structure may be varied by the positioning device. Thereby, the structure may be imaged by the camera or the primary beam from different imaging directions. An imaging direction may be defined by a vector which is parallel to the optical axis of the light-sensitive image capturing device or parallel to the optical axis of the particle optical system.
Additionally or alternatively, the imaging direction may be altered by varying an impingement direction of the primary beam relative to an optical axis of the particle beam microscope. Additionally or alternatively, a variation of the position of the light-sensitive image capturing device relative to the specimen chamber may result in a variation of the imaging direction of the light-sensitive image capturing device.
Additionally or alternatively, the light-sensitive image capturing device may have more than one imaging direction. For example, the light-sensitive image capturing device may comprise a plurality of cameras which are arranged such that they have different imaging directions relative to the structure. For example, the image capturing device comprises two, three or more cameras.
Additionally or alternatively, the particle optical system may provide a first imaging direction and the light-sensitive image capturing device may provide a second imaging direction.
According to a further embodiment, the detecting of the light rays comprises: detecting of a laser beam which has been reflected at the structure.
Algorithms for generating surface models from reflected laser beams of a laser scanner are disclosed in the Ph.D. thesis "Model-based Analysis and Evaluation of Point Sets from Optical 3D Laser Scanners", written by Christian Teutsch (Otto-von-Guericke-Universitaet, Magdeburg, Germany), published by Shaker Verlag, Herzogenrath, Germany; the contents of which are incorporated herein in their entirety.
For example, the particle beam microscope comprises a laser scanner which is configured to scan at least a portion of the structure and/or the microscope portion. The laser scanner may be configured such that the reflected laser beams are detected by performing at least one of the following: measuring the time-of-flight, in particular by timing the round-trip time of a pulse of light, performing phase comparison and/or performing triangulation.
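The time-of-flight variant reduces to a one-line relation: the pulse travels to the surface and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (function name is an assumption):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_seconds):
    """Distance to the reflecting surface from the round-trip time of a
    laser pulse: the pulse travels to the surface and back, so the
    one-way distance is c * t / 2. Result in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

At the millimeter-to-centimeter working distances inside a specimen chamber, round-trip times are in the tens of picoseconds, which is why phase comparison or triangulation is often preferred at short range.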
Furthermore, the laser scanner may be configured to determine the position and orientation of the structure depending on the detected reflected laser beams. Thereby, a position and an orientation of the surface model of the structure relative to the object region may be determined.
According to a further embodiment, the generating of the surface model of the structure comprises: generating a first surface model of a first portion of the structure in a first position of the structure relative to an image acquisition device and/or to the objective lens; generating a second surface model of a second portion of the structure in a second position of the structure relative to the image acquisition device and/or the objective lens; and combining the first surface model and the second surface model into the surface model of the structure.
Accordingly, it is possible to generate a comparatively large surface model which extends beyond the field of view of the image acquisition device or the particle microscope. In particular, this makes it possible to use the particle microscope to generate a surface model of an extended object.
The first and the second surface model may be generated depending on the detected light rays and/or particles. The first surface model and the second surface model may be adjacent and non-overlapping. Alternatively, the first surface model and the second surface model may be partially overlapping. The first position and the second position are measured relative to the image acquisition device and/or relative to the objective lens.
Embodiments provide a particle beam microscope system, comprising: an objective lens having an object region; an object holder which is configured such that an object is mountable on the object holder; a positioning device which is configured to adjust a position and/or an orientation of the object holder relative to the object region; a detecting device which is configured to detect light rays which emanate from a structure and/or particles which emanate from the structure, wherein the structure comprises at least a portion of the surface of the object holder and/or at least a portion of a surface of the object; a computer which is configured for signal communication with the positioning device and the detecting device, wherein the computer is further configured to: generate a surface model of the structure depending on the detected light rays and/or the detected particles; determine a position and an orientation of the surface model of the structure relative to the object region; determine a measurement location relative to the surface model of the structure; and to position the object depending on the determined surface model of the structure, the determined position and orientation of the surface model of the structure and the determined measurement location.
Accordingly, a particle beam microscope is obtained which allows an automatic, fast and easy-to-perform positioning of the object relative to the objective lens within a short time.
The computer may be configured to automatically perform the positioning of the object. It is also conceivable that the computer displays the surface model of the structure, the position and orientation of the surface model of the structure and the measurement location on a display. The computer may further be configured to position the object depending on a user input. For example, the computer may be configured to determine a measurement location relative to the surface model of the structure depending on an input of the user.
Brief Description of the Drawings
The foregoing as well as other advantageous features of the invention will be more apparent from the following detailed description of exemplary embodiments of the invention with reference to the accompanying drawings. It is noted that not all possible embodiments of the present invention necessarily exhibit each and every, or any, of the advantages identified herein.
Figure 1 schematically illustrates an object and an object holder which are arranged close to an objective lens and a BSE-detector according to an exemplary embodiment;
Figure 2 schematically shows a particle beam system according to an exemplary embodiment;
Figure 3 schematically shows a surface model of a structure which is obtained by a method according to an exemplary embodiment;
Figure 4 schematically shows a combined surface model, which is obtained according to an exemplary embodiment;
Figure 5 schematically shows a determining of the position and the orientation of the surface model according to an exemplary method;
Figure 6 is a flow-chart, which schematically illustrates an exemplary method of operating a particle beam microscope;
Figure 7 is a flow-chart, which schematically illustrates a further exemplary method of operating a particle beam microscope;
Figure 8 schematically illustrates the acquiring of the surface model of a structure by using different focus distances of the particle optical system, as shown in Figure 2;
Figure 9 schematically illustrates the generating of the surface model of the structure from a plurality of particle microscopic images;
Figure 10 schematically shows the surface model of the structure which has been generated according to an exemplary method, as shown in Figures 8 and 9;
Figures 11a and 11b schematically show the generating of a surface model of a structure in the exemplary method as shown in Figures 8 and 9, wherein the structure is greater than a field of view of the particle beam microscope; and
Figure 12 schematically illustrates the generating of a surface model of a structure from particle microscopic images according to an exemplary embodiment.
Detailed Description of Exemplary Embodiments
It should be noted in this context that the terms "comprise", "include", "having" and "with", as well as grammatical modifications thereof used in this specification or in the claims, indicate the presence of technical features such as stated components, figures, integers, steps or the like, and by no means preclude the presence or addition of one or more alternative features, particularly other components, figures, integers, steps or groups thereof.
Figure 1 schematically shows a structure, which is arranged close to an objective lens 30 of a particle beam microscope, which is, for example, a scanning electron microscope. The objective lens 30 has an optical axis OA and an object region OR. The object region OR is a spatial region, into which the particle beam of the particle beam microscope is focused. In other words, a surface region of an object, which is arranged in the object region OR, is imageable by the particle beam microscope. The object region OR is located at a working distance WD spaced away from the objective lens 30. The working distance WD and the extent of the object region OR depend on the design of the particle optical system of the particle beam microscope as well as on operation parameters of the particle optical system, such as the magnification.
A first object 10, a second object 11 and a third object 12 are mounted on an object holder 20. The object holder 20 is attached to a positioning device, which is not illustrated in Figure 1. The positioning device is configured such that the object holder 20 is independently movable along an X-axis, a Y-axis and a Z-axis of a coordinate system. This is illustrated by the double arrows 50, 51 and 53. Thereby, the positioning device is configured to position the object holder with three degrees of freedom. The positioning device may further be configured such that the object holder 20 is rotatable about the X-axis, the Y-axis and the Z-axis. In Figure 1 this is illustrated by arrows 54, 55 and 56. Thereby, the positioning device may be configured such that the object holder 20 is positionable in six degrees of freedom. The positioning device may comprise one or more actuators. The actuators may be piezo actuators and/or step motors.
At an end face of the objective lens 30, a detector 40 is arranged, which is configured to detect back-scattered particles, which have been scattered at the object 10. In case the particle beam microscope is a scanning electron microscope, the detector 40 may be a BSE-detector (back-scattered electron detector). The particle beam microscope may comprise further particle detectors, which are not illustrated in Figure 1.
In order to acquire an electron microscopic image of a location M on the surface of the first object 10, the first object 10 has to be arranged at a position and an orientation such that the location M is located in the object region OR. The orientation may, for example, be defined by three angles.
The object holder 20 may comprise marks 21, 22. The marks 21, 22 are configured such that they are detectable in an image of a light-sensitive image capturing device, such as a CCD-camera, and/or by scanning the primary beam of the particle beam microscope across the marks.
In the exemplary embodiments, which are discussed with reference to the following figures, a surface model of the structure is generated for performing a precise positioning of the objects 10, 11, 12 relative to the objective lens 30. The structure comprises a portion of the surface of the objects 10, 11, 12 and/or a portion of the surface of the object holder 20. Additionally or alternatively, a surface model of a microscope portion (such as a portion of the objective lens 30 and/or the detector 40) is generated to ensure a collision-free positioning of the objects 10, 11, 12.
The surface models may be generated, for example, from images of cameras which are arranged in the specimen chamber and/or in the load-lock chamber of the microscope. The surface models may also be generated from particle microscopic images and/or by using a laser scanner.
Figure 2 is a schematic illustration of a particle beam microscope system 1 according to an exemplary embodiment. The particle beam microscope system 1 may comprise a scanning electron microscope. The specimen chamber 80 comprises a vacuum pumping system 83, which is configured to evacuate the specimen chamber 80 to a vacuum level which is suitable for conducting measurements with the primary beam. The vacuum pumping system 83 may comprise a fore pump and a turbo molecular pump. The vacuum level for conducting measurements may be in a range between 1 mbar and 10⁻⁷ mbar. In order to avoid venting the specimen chamber 80 for changing the samples 10, 11, 12, a load-lock chamber 85, which comprises a further vacuum pumping system 81, may be connected to the specimen chamber 80. The samples 10, 11, 12, which are attached to the object holder 20, are first introduced into the load-lock chamber 85. After having evacuated the load-lock chamber 85, the samples 10, 11, 12 and the object holder 20 are transferred from the load-lock chamber 85 to the specimen chamber 80, and the object holder 20 is attached to the positioning device 60 of the particle beam microscope.
The particle beam microscope comprises a first camera 31, such as a CCD-camera, which is arranged in the specimen chamber 80. The first camera 31 is configured to acquire digital images of at least a portion of the surface of the first object 10 and/or a portion of the surface of the object holder 20. The first camera 31 is connected to the computer 70 of the particle beam microscope system 1 via a first signal line 34. The computer 70 comprises a storage device 71. The storage device 71 is configured to store the digital images of the first camera 31. The positioning device 60 may be configured such that the first, second and third object 10, 11, 12 and the object holder 20 are imageable by the first camera 31 from different imaging directions. For example, the positioning device 60 may perform a rotation about the Z-axis by a predetermined angle, such that the first, second and third object 10, 11, 12 and/or the object holder 20 is imageable by the first camera 31 from at least two different imaging directions. Depending on the images of the first camera 31, the computer 70 calculates a surface model of the structure, which comprises at least a portion of a surface of the first, second and third object 10, 11, 12 and/or the object holder 20.
The particle beam microscope system 1 may further comprise a second camera 32, such as a CCD-camera, which is also arranged in the specimen chamber 80. The second camera 32 and the first camera 31 have different imaging directions relative to the structure. By using two cameras, it is possible to acquire digital images of the structure from different imaging directions without having to change the position or orientation of the structure via the positioning device 60.
The particle beam microscope system 1 further comprises a particle optical system 39, which has an objective lens 30. The objective lens 30 comprises an end face, which faces the object plane of the particle optical system 39. At the end face, a detector 40, such as a BSE-detector, may be arranged. It is also conceivable that the detector is attached to a wall of the specimen chamber 80 or is received within the particle optical system. The particle optical system 39 and the detector 40 are connected to the computer 70 via a third signal line 37. Through the third signal line 37, control signals are transmitted between the computer 70 and the particle optical system 39. Depending on the signals of the detector 40, the computer 70 generates particle microscopic images, which represent digital images.
Digital images which have been acquired by the first camera 31 and/or the second camera 32, and/or which have been generated depending on the signals of the detector 40, are stored in the storage device and later processed by the computer 70. Depending on the digital images, the computer 70 calculates a surface model of the structure. The surface model can be used to position the objects 10, 11, 12 relative to the objective lens to acquire particle microscopic images.
The computer 70 is further configured to calculate a surface model of a microscope portion of the particle beam microscope system 1 depending on the digital images. Alternatively, it is possible that the computer calculates the surface model of the microscope portion depending on a CAD-model. The microscope portion may, for example, be a surface of an object-side end portion of the objective lens 30 and/or a portion of the surface of the detector 40. The computer 70 is further configured to combine the surface model of the structure and the surface model of the microscope portion into a combined surface model. The combined surface model can be used to monitor a distance between the structure and the microscope portion in order to avoid collisions during the positioning process.
A third camera 33, such as a CCD-camera, may be arranged in the load-lock chamber 85. The third camera is connected to the computer 70 via a fourth signal line 36. Furthermore, the load-lock chamber 85 may comprise a positioning device, which is configured such that digital images are acquirable by the third camera 33 from different imaging directions relative to the structure. In the load-lock chamber 85, more than one camera may be arranged. The cameras in the load-lock chamber may be arranged such that they have different imaging directions relative to the structure.
The cameras in the load-lock chamber 85 may be configured to generate digital image data which show or represent at least a portion of the structure, such that the surface model of the structure is calculable depending on the digital image data. In the load-lock chamber, the field of view of the camera is not obstructed by the presence of an objective lens and/or detectors.
Depending on the generated surface model, the position and orientation of the structure in the specimen chamber 80 may be determined by comparing the surface model with the digital images which have been generated in the specimen chamber.
Figure 3 schematically shows the generated surface model 90 of the structure. In the example shown in Figure 3, the structure comprises the top surfaces and the lateral surfaces of the first, second and third object 10, 11, 12. Furthermore, the structure comprises the top surface of the object holder 20. Those surfaces of the object holder which are not represented by the surface model of the structure 90 are indicated in Figure 3 by dashed lines. The surface model of the structure 90 comprises a plurality of points 91, wherein the plurality of points 91 are connected by geometric objects such as line segments or plane segments 91A.
Furthermore, the surface model of the structure 90 comprises marks 97, 98, which represent the marks 21, 22 on the structure, as illustrated in Figure 1.
After having generated the surface model of the structure 90, the computer 70 (illustrated in Figure 2) is configured to determine the position and orientation of the surface model 90 relative to the object region OR, as will be discussed in detail with reference to Figure 5.
The computer 70 is further configured to show a two-dimensional representation 73 on a display 72 of the computer 70, as illustrated in Figure 2. This allows the user to select a location at which he wants to perform a measurement. The user may select a view of the representation 73 on the display. Based on the selected view, it is easier for the user to decide at which location he wants to perform the measurement. The representation 73 may be superimposed on a camera image showing the structure and/or the microscope portion.
Based on the user input, the computer 70 determines a measurement location P relative to the surface model 90. The measurement location P corresponds to a location M (as shown in Figure 1) at which a measurement is to be taken.
Depending on the determined position and orientation of the surface model 90 relative to the object region OR, as well as depending on the measurement location P, the computer calculates a positioning path T.
The positioning path may comprise translational movements and/or rotational movements. In Figure 4, the positioning path T is schematically indicated as a vector which connects the measurement location P with the object region OR. However, it is also conceivable that the positioning path T comprises an arcuate path of translatory movement. After having determined the positioning path T, the computer transmits control signals to the positioning device 60 for arranging a location on the structure, which corresponds to the measurement location P, within the object region OR.
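In the simplest case of a purely translational movement, the positioning path can be sketched as the vector that carries the measurement location onto the object region. The following minimal sketch (not the patent's implementation; all names and coordinate values are illustrative assumptions) shows this idea:

```python
# Illustrative sketch: the positioning path T is modeled as the
# translation vector that carries the measurement location P onto the
# centre of the object region OR. Names and numbers are assumptions.

def positioning_path(p, or_center):
    """Return the translation vector T = OR - P, component-wise."""
    return tuple(o - m for o, m in zip(or_center, p))

# Example: measurement location P and object region centre in stage
# coordinates (mm).
P = (12.0, -3.5, 4.0)
OR = (0.0, 0.0, 10.0)
T = positioning_path(P, OR)  # (-12.0, 3.5, 6.0)
```

A rotational component or an arcuate path, as mentioned above, would extend this to a sequence of such translation and rotation steps.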
Figure 4 shows, in an exemplary manner, a combined surface model 93, which has been generated by combining the surface model of the structure 90 with a surface model of a microscope portion 92. In this context, the term combining may be understood as arranging the surface model of the structure 90 and the surface model of the microscope portion 92 relative to each other such that they represent the position and orientation of the structure relative to the microscope portion in the particle beam microscope.
The surface model of the microscope portion 92 may be generated depending on the detected light rays. Alternatively or additionally, the surface model of the microscope portion 92 may be determined depending on a contact-based measurement. The contact-based measurement may be performed by a coordinate measuring machine.
The computer 70 is configured to calculate a distance D between the surface model of the structure 90 and the surface model of the microscope portion 92 depending on the combined surface model 93. For example, the computer calculates all distances between pairs of points of the combined surface model 93, wherein each pair of points consists of a point of the surface model of the structure 90 and a point of the surface model of the microscope portion 92. Depending on the determined distances of the pairs of points, the smallest distance D may be determined. The distance D, which is shown in Figure 4, is the distance between the point Q of the surface model of the microscope portion 92 and the point R of the surface model of the structure 90. In case the distance D is smaller than a predetermined permissible distance, the particle beam microscope issues a warning signal or a notification. Furthermore, the particle beam microscope system 1 may be configured to stop positioning movements which lead to a distance between the microscope portion and the structure which is smaller than the permissible distance. The particle beam microscope system 1 is configured to determine a positioning path T depending on the combined surface model 93, wherein the positioning path T is determined such that a collision between the microscope portion and the structure is avoided.
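The pairwise-distance check described above can be sketched as follows. This is a hedged illustration, not the patent's code: the brute-force loop, the point coordinates and the permissible-distance value are assumptions; a real system would typically use a spatial index such as a k-d tree instead of the O(n·m) scan.

```python
# Hedged sketch of the collision check: the smallest Euclidean distance
# between the point set of the structure model and the point set of the
# microscope-portion model is compared against a permissible distance.
import math

def min_distance(points_a, points_b):
    """Smallest distance over all pairs (a, b) with a in A and b in B."""
    return min(
        math.dist(a, b)
        for a in points_a
        for b in points_b
    )

# Illustrative point sets (stage coordinates, mm).
structure_pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # points R, ...
microscope_pts = [(1.0, 0.0, 2.0), (5.0, 5.0, 5.0)]  # points Q, ...

D = min_distance(structure_pts, microscope_pts)  # 2.0
PERMISSIBLE = 0.5
if D < PERMISSIBLE:
    print("warning: collision risk")  # not triggered for these points
```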
Figure 5 schematically shows, in an exemplary manner, the determining of the position and orientation of the surface model of the structure relative to the object region.
After the surface model 90 has been generated, the first camera 31 (illustrated in Figure 2) acquires a digital image 94, as illustrated in Figure 5. In other words, the first camera 31 is a position acquisition camera of the particle beam microscope system 1. The computer 70 is configured to compare the digital image 94 with the surface model of the structure 90. For example, the digital image 94 is compared with two-dimensional representations 90A, 90B, which represent the surface model in different orientations and positions. The comparing may, for example, comprise extracting an edge 96 of the structure 90 from the digital image 94 and comparing the extracted edge 96 with an edge or a rim 96A of the representation 90A of the surface model 90. Furthermore, the comparing may comprise comparing the mark 99, shown in the digital image 94, with a mark 99A of the representation 90A of the surface model 90. The extracting of the edge 96 and/or the mark 99 may comprise segmenting the digital image 94.
Based on the comparison, the two-dimensional representation 90A is identified as representing the position and orientation of the structure. Thereby, the position and orientation of the surface model of the structure 90 is determined.
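One way to sketch this comparison step is to score candidate two-dimensional representations (here: the model outline rotated by candidate angles) against edge points extracted from the digital image, and to take the best-scoring candidate as the orientation. The chamfer-like sum of nearest-point distances used below is an assumption for illustration, not the patent's algorithm, and all names are hypothetical.

```python
# Illustrative sketch: score rotated model outlines against extracted
# image edge points; the candidate with the lowest score wins.
import math

def rotate(points, angle):
    """Rotate 2-D points about the origin by the given angle (radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def score(model_pts, edge_pts):
    """Sum over edge points of the distance to the nearest model point."""
    return sum(min(math.dist(e, m) for m in model_pts) for e in edge_pts)

def best_orientation(model_pts, edge_pts, candidate_angles):
    return min(candidate_angles,
               key=lambda a: score(rotate(model_pts, a), edge_pts))

# Toy model outline (four points on the unit circle) and "extracted"
# edges that are the model rotated by 45 degrees.
model = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
edges = rotate(model, math.pi / 4)
angle = best_orientation(model, edges, [0.0, math.pi / 8, math.pi / 4])
# angle is math.pi / 4: that candidate reproduces the edges exactly.
```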
It is conceivable that the determining of the position and the orientation of the surface model 90 comprises acquiring digital images from at least two different imaging directions relative to the structure. The digital images may represent stereoscopic image data.
Figure 6 is a flow-chart of an exemplary method for positioning the object within a particle beam microscope system 1, as shown in Figure 2, by using the surface model of the structure 90, as shown in Figure 3. A detecting 100 of light rays which emanate from the structure is performed by the first and/or second camera 31, 32. Depending on the geometry of the structure and/or the required accuracy for the calculation of the surface model 90, one or more images of the first camera 31 may be sufficient to calculate the surface model of the structure 90. The acquired digital images, which represent digital image data, are transmitted via the first and the second signal lines 34, 35 to the computer 70 and are stored in the storage device 71. Depending on the acquired digital images, a generating 101 of the surface model 90 is performed by the computer 70. The generated surface model 90 is stored in the storage device 71 of the computer 70.
Alternatively or additionally, the computer 70 may be configured to calculate the surface model depending on signals of a particle detector, such as the detector 40, which is illustrated in Figure 2. Exemplary embodiments for calculating the surface model of the structure from the detected particles will be discussed with reference to Figures 9 to 12.
Depending on the known viewpoint positions, the known imaging directions and the known magnifications of the first and/or second camera 31, 32, and/or depending on the generated surface model 90, a determining 102 of a position and orientation of the surface model of the structure 90 relative to the object region OR is performed.
Alternatively or additionally, the determining 102 of the position and orientation of the surface model of the structure relative to the object region may be performed depending on signals between the positioning device 60 and the computer 70.
Alternatively or additionally, the determining 102 of the position and orientation of the surface model of the structure 90 is performed depending on signals of a particle detector, such as the particle detector 40, as shown in Figure 2. In particular, the determining 102 of the position and orientation of the surface model of the structure may be performed depending on particle microscopic images.
The computer 70 is configured to display a two-dimensional representation 73 of the surface model on the display 72. Based on the shown representation 73, the user can select a location at which he wants to acquire a particle microscopic image. Depending on the user input, the computer performs a determining 103 of a measurement location P relative to the surface model of the structure 90.
Depending on the position and orientation of the surface model 90 relative to the object region OR and the determined measurement location P, the computer determines 104 a positioning path. Depending on the determined positioning path T, the computer transmits signals to the positioning device 60 to control a positioning 105 of the object. After the positioning of the object, the location of the object 10 at which a measurement is to be taken is arranged in the object region OR. Then, the computer 70 may again determine 102 the position and orientation of the surface model 90 or may determine 103 a measurement location depending on an input of the user.
Figure 7 illustrates a flow-chart of a further exemplary method, which is performed by the particle beam microscope system 1, as shown in Figure 2, wherein the combined surface model 93, as shown in Figure 4, is used for collision detection. The method steps of detecting 110 the light rays and/or particles and generating 111 the surface model of the structure are performed as has been discussed with reference to Figure 6.
In the exemplary method shown in Figure 7, the computer also generates 112 a surface model of a microscope portion 92. The microscope portion 92 may, for example, comprise at least a portion of the surface of the detector 40, the objective lens 30, a manipulator, a gas injection system, and/or a wall of the specimen chamber 80. Then, the computer 70 combines the surface model of the structure 90 with the surface model of the microscope portion 92 to form a combined surface model 93. In the combined surface model 93, the surface model of the structure 90 is arranged relative to the surface model of the microscope portion 92 such that it corresponds to a relative orientation and a relative position of the structure relative to the microscope portion in the specimen chamber 80. The combining 113 may be performed depending on digital images of the first camera 31, the second camera 32, and/or signals of the detector 40. Alternatively or additionally, the combining 113 may be performed depending on control and/or sensor signals between the positioning device 60 and the computer 70.
The surface model of the structure 90 and the surface model of the microscope portion 92 may be generated consecutively. However, it is also conceivable that the surface model of the structure 90 and the surface model of the microscope portion 92 are generated simultaneously, in particular depending on the same digital images. Depending on the combined surface model 93, a distance between the surface model of the structure and the surface model of the microscope portion is determined. Depending on the combined surface model 93 and the determined distance, the computer 70 determines 115 a positioning path T. The positioning path T is determined such that a collision between the structure and the microscope portion is avoided. After the positioning 116, the computer 70 again generates a combined surface model 93. After having again determined the distance, the positioning path is again determined such that the collision between the structure and the microscope portion is avoided. Then, the computer again controls the positioning 116 along the positioning path T.
Figure 8 shows, in an exemplary manner, how the surface model of the structure is generated depending on image data which have been acquired by detecting particles. The image data are generated at different focus distances of a primary beam 201 of the particle optical system 39 (illustrated in Figure 2). The primary beam 201 is scanned across the structure 203. The primary beam 201 comprises a beam waist W. The beam waist W is the portion of the primary beam 201 in which the primary beam has its smallest beam diameter, measured perpendicular to a beam axis BA of the particle optical system. A region B of the structure 203 which is located at a distance A away from the beam waist W is irradiated with a beam diameter of the primary beam 201 which is greater than the beam diameter of the beam waist W.
During the scanning of the primary beam 201 across the structure 203, image data are generated. The image data represent a discrete sampling of the structure 203. For example, the image data may comprise 1024 times 1024 pixel data values. Therefore, each pixel data value represents a portion of the structure 203 having a diameter D. For example, M times M pixel data values are acquired from a square-shaped portion of the structure having side lengths L. The diameter of the portion of the structure 203 which is represented by a pixel data value is then L/M.
In case the diameter of the primary beam at the irradiated portion B is greater than the diameter D, this causes a lower resolution in the image data of the digital image. A depth of focus T of the primary beam 201 may be defined as a range along the beam axis BA in which the diameter of the particle beam 201 is smaller than the diameter D. The depth of focus T depends on an aperture angle a of the primary beam 201. The aperture angle a may be defined as a maximum angle which is formed by the particles of the primary beam 201 with the beam axis BA.
When the distance A of the portion B of the object surface OS from the beam waist W is smaller than or equal to half of the depth of focus T, this does not cause a reduced resolution in the image data of the digital image. However, in case the distance A is greater than half of the depth of focus T, this leads to a reduced resolution of the image data.
The focus distance may be defined as the distance of the beam waist W from a reference point of the particle optical system. The reference point may, for example, be a principal plane of the objective lens 30 (shown in Figure 2). A variation of the focus distance thereby causes a variation of the distance A. Hence, a variation of the focus distance may lead to a different resolution of the image data which represent the portion B. A comparatively high resolution of the portion B is achieved when the distance A of the portion B from the beam waist W of the primary beam 201 is smaller than half of the depth of focus T.
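The two quantitative relations above, the sampling diameter D = L/M and the full-resolution condition A ≤ T/2, can be sketched numerically. The field-of-view size, pixel count and depth-of-focus values below are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch of the sampling and depth-of-focus relations.
# All numbers and parameter names (L_um, M, A_um, T_um) are assumptions.

def sampling_diameter(L_um, M):
    """D = L/M: diameter of the object portion represented by one pixel."""
    return L_um / M

def full_resolution(A_um, T_um):
    """A portion at distance A from the beam waist is imaged at full
    resolution when A does not exceed half the depth of focus T."""
    return A_um <= T_um / 2.0

D = sampling_diameter(L_um=100.0, M=1024)  # ~0.098 um per pixel
print(full_resolution(A_um=0.4, T_um=1.0))  # True: 0.4 <= 0.5
print(full_resolution(A_um=0.8, T_um=1.0))  # False: 0.8 > 0.5
```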
The focus distance of the particle optical system 39 may be varied by varying an excitation of the objective lens 30 (shown in Figure 2).
Figure 9 schematically illustrates how the surface model of the structure is generated depending on a plurality of digital images 301, 302, 303 according to an exemplary method. Each of the digital images 301, 302, 303 has been generated by scanning the particle beam across at least a portion of the structure. The images 301, 302, 303 show a same portion of the structure. The digital images 301, 302, 303 have been acquired at different focus distances of the particle optical system 39. Therefore, portions in the images 301, 302, 303 which represent a common portion of the structure may have a different resolution. For simplicity of illustration, only three digital images are shown in Figure 9. However, the calculation of the surface model may be performed depending on more than 5, more than 10, more than 20, more than 50 or more than 100 digital images which have been acquired at mutually different focus distances. For example, the surface model may be generated depending on less than 500 or less than 200 digital images.
A plurality of image regions 310, 311, 312, 320, 321, 322 is selected from the image data of each of the digital images 301, 302, 303. For simplicity of illustration, only six image regions are shown in each of the digital images 301, 302, 303. The plurality of image regions of a digital image may cover the whole or substantially the whole digital image. The image regions 310, 311, 312, 320, 321, 322 of the digital images 301, 302, 303 are selected such that the image regions 310, 311, 312, 320, 321, 322 may be divided into stacks which show the same portion of the structure.
In the embodiment which is illustrated in Figure 9, a first stack of image regions consists of the image regions 310, 311 and 312. Each of the image regions 310, 311, 312 of the first stack shows a first common object portion. A second stack of image regions consists of the image regions 320, 321 and 322. Each of the image regions 320, 321 and 322 shows a second common object portion. The first common object portion is different from the second common object portion. In the exemplary embodiment which is shown in Figure 9, the first common object portion is adjacent to and non-overlapping with the second common object portion. However, the first common object portion may partly overlap with the second common object portion. It is also conceivable that the first common object portion and the second common object portion are neither adjacent nor overlapping, but located at a distance spaced apart from each other. For simplicity of illustration, only six stacks of image regions are shown in Figure 9. For example, more than 100, more than 10,000 or more than 10⁶ stacks of image regions may be generated from the digital images, wherein each of the stacks represents a different portion of the structure. For example, less than 10⁹ stacks of image regions may be generated from the digital images.
The stacks of image regions which represent a common object region may be determined by identifying object features which appear in each of the digital images 301, 302, 303. For example, the identifying of object features may comprise identifying edges, identifying a difference among image data and/or determining a frequency of image data of an image region. The identifying of the object features may comprise segmenting each of the digital images 301, 302, 303.
An image region consists of a group of pixels. An image region may have the form of a square. For example, an image region may consist of 4 times 4 pixels, of 8 times 8 pixels or of 10 times 10 pixels. An image region may be a pixel cluster which has an irregular or non-symmetrical shape. An image region may consist of a single pixel.
The computer 70 (illustrated in Figure 2) is configured to determine, for each of the stacks of image regions, an image region which has the highest resolution among all image regions in the respective stack and which is denoted herein as in-focus region. The in-focus region is selected from the image regions of the respective stack.
For example, from the image regions 310, 311 and 312, which form the first stack, the in-focus region is selected. Furthermore, from the image regions 320, 321 and 322, which form the second stack, a second in-focus region is selected. In Figure 9, image region 311 is the in-focus region of the first stack and image region 322 is the in-focus region of the second stack.
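The selection of the in-focus region within a stack can be sketched as follows. Local pixel variance serves here as a simple stand-in for the resolution criterion; the function names and the variance measure are illustrative and not taken from the description:

```python
def sharpness(region):
    """Local pixel variance as a simple sharpness proxy: an in-focus
    region shows stronger contrast than a defocused, blurred one."""
    pixels = [p for row in region for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def select_in_focus(stack):
    """Return the index of the sharpest image region in a stack.

    'stack' is a list of image regions showing the same object portion,
    each acquired at a different focus distance.
    """
    return max(range(len(stack)), key=lambda i: sharpness(stack[i]))

# Three 4x4 regions of the same object portion at different focus
# distances; the middle one has the highest contrast, i.e. it is in focus.
stack = [
    [[5, 5, 5, 5]] * 4,                  # defocused: uniform pixel values
    [[0, 9, 0, 9], [9, 0, 9, 0]] * 2,    # in focus: strong contrast
    [[4, 6, 4, 6], [6, 4, 6, 4]] * 2,    # slightly defocused
]
print(select_in_focus(stack))  # → 1
```

Any monotone focus measure (gradient energy, high-frequency content of the image data, as mentioned above for edge detection) could replace the variance without changing the selection logic.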
Each of the image regions represents an X-coordinate value and a Y-coordinate value in a plane perpendicular to the optical axis of the particle optical system. The X-coordinate value and the Y-coordinate value of the image region 322 are schematically illustrated in Figure 9. Furthermore, the focus distance at which the image data of an image region have been acquired represents a Z-coordinate value on a coordinate axis which is oriented parallel to the optical axis of the particle optical system.
The X-coordinate values, Y-coordinate values and Z-coordinate values of all in-focus image regions represent a surface model of the structure.
Figure 10 schematically shows a surface model 390 which has been generated according to the method described with reference to Figure 9. The surface model 390 is a two-dimensional function which assigns a function value to discrete coordinate values in the X-Y plane, wherein the function value represents a coordinate value on the Z-coordinate axis. Each of the function values of the two-dimensional function corresponds to a focus distance of one of the determined in-focus regions within a stack. The discrete coordinate values in the X-Y plane correspond to the X-coordinate values and Y-coordinate values of the in-focus regions. The X-Y plane corresponds to a plane which is oriented perpendicular to the optical axis of the particle optical system.
The computer 70 (illustrated in Figure 2) is configured to store a measurement location 340 relative to the surface model 390. For example, the computer 70 may be configured to determine which location of the surface model of the structure represents a region at which the primary beam impinges. The computer 70 is configured to assign image data of an image 341, which has been generated by scanning the primary beam at the measurement location 340, to the stored measurement location 340. The image 341 may, for example, be a secondary electron image or an image which has been generated by detecting backscattered electrons. The storing of the measurement location 340 may comprise storing of X-coordinate values, Y-coordinate values and Z-coordinate values of the measurement location 340.
This allows a user or an evaluation routine of the computer to determine, based on the surface model 390, from which portions of the structure high-resolution images have already been generated. Furthermore, it is possible to interpret the image data of the image 341 in dependence on the topography data of the surface model 390. For example, the surface portion shown in image 341 may have a surface inclination which is not recognizable in the image data of the image 341. However, by storing the measurement location 340 relative to the surface model 390, it is possible to recognize that the image data of the image 341 represent a flank surface of the groove 342. Thereby, it is possible for the user or for the evaluation routine of the computer to determine a relationship or a dependence between the surface topography, which is represented by the surface model 390, and the digital image data of the image 341. The image 341 may depend more on compositional contrast than on topographical contrast. In particular, the digital image data of the image 341 may be generated depending on detector signals of the detector for backscattered electrons. Thereby, it is possible to establish a relationship or a dependence between the compositional contrast of the image data of the image 341 and the surface topography of the surface model 390.
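Recognizing that a measurement location lies on a flank, as in the groove example above, amounts to evaluating the local slope of the stored surface model at that location. A minimal sketch, assuming a dict-based surface model `{(x, y): z}` and central differences (both are illustrative choices, not part of the description):

```python
import math

# Hypothetical surface model: Z values on a discrete (x, y) grid,
# containing one inclined flank (x < 2) and a flat bottom (x >= 2).
surface_model = {(x, y): max(0.0, 2.0 - x) for x in range(5) for y in range(3)}

def inclination_deg(model, x, y, dx=1.0):
    """Estimate the local surface inclination in degrees at grid
    position (x, y) from central differences of the stored Z values."""
    gx = (model[(x + 1, y)] - model[(x - 1, y)]) / (2 * dx)
    gy = (model[(x, y + 1)] - model[(x, y - 1)]) / (2 * dx)
    return math.degrees(math.atan(math.hypot(gx, gy)))

# A measurement location at (1, 1) lies on the flank, (3, 1) on the flat bottom.
print(round(inclination_deg(surface_model, 1, 1)))  # → 45
print(round(inclination_deg(surface_model, 3, 1)))  # → 0
```

An image acquired at the first location could then be flagged as showing a flank surface even when the inclination is not recognizable in the image data themselves.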
Figure 11a schematically illustrates the generating of a surface model of the structure depending on detected particles according to a further exemplary embodiment. By scanning the primary beam, a plurality of image groups is determined. In the embodiment illustrated in Figure 11a, 12 image groups have been generated. Each of the image groups comprises a plurality of digital images which represent a same or substantially same portion of the structure. The images of an image group are generated at mutually different focus distances. A first image group 401 comprises the digital images 401a, 401b, 401c, wherein for simplicity of illustration, only the pixel values of the image 401a are shown. Also for simplicity of illustration, only three digital images of the image group 401 are shown. Similar to the exemplary embodiment shown in Figure 9, each of the image groups may comprise a plurality of digital images, in particular more than three digital images. A second image group 411 comprises the digital images 411a, 411b and 411c. The images of all image groups represent a structure which comprises a portion of the surface of the object holder 411 and a portion of the surface of the object 410, which are schematically illustrated in Figure 11b. The arrow shown in Figure 11b schematically indicates an imaging direction VD of the digital images which are shown in Figure 11a. The imaging direction VD is oriented parallel to the optical axis of the particle optical system. Each of the digital images shown in Figure 11a has been acquired at a working distance of the particle optical system of 20 millimeters. A side length of a field of view fv along an edge of the digital images is 5 millimeters.
With a field of view of this size, it is not possible to image the complete top surface of the object 411 in a single scanning process. However, a surface model may be generated depending on the plurality of image groups of particle optical images, as shown in Figure 11a. Each of the image groups has been generated at a different position of the structure relative to the objective lens. Each of the image groups yields a surface model. The surface models of the image groups are combined to form the surface model of the structure.
In the exemplary embodiment, which is shown in Figure 11a, the digital images of adjacent image groups show neighboring portions of the structure which overlap. For example, the portion shown in image 401a overlaps with the portion shown in image 411a.
Based on the images of each image group, image regions are generated, as has been discussed with reference to Figure 9. Thereby, for each of the image groups, a surface model is obtained. The surface models of neighboring groups overlap. Depending on the data values of the surface models in the overlapping region, the surface models are combined into a surface model of the total structure.
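The combination of overlapping per-group surface models can be sketched as follows. The dict-based model representation is an assumption, and averaging the Z values in the overlap region is one plausible combination rule; the description leaves the rule open:

```python
def combine_surface_models(model_a, model_b):
    """Merge two dict-based surface models {(x, y): z} into one.

    In the overlapping region the Z values are averaged (an assumed
    combination rule); elsewhere the value of whichever model covers
    the position is taken.
    """
    combined = {}
    for key in model_a.keys() | model_b.keys():
        if key in model_a and key in model_b:
            combined[key] = (model_a[key] + model_b[key]) / 2
        else:
            combined[key] = model_a.get(key, model_b.get(key))
    return combined

# Surface models of two neighboring image groups, overlapping at x = 2.
model_a = {(0, 0): 1.0, (1, 0): 1.2, (2, 0): 1.4}
model_b = {(2, 0): 1.6, (3, 0): 1.8, (4, 0): 2.0}
merged = combine_surface_models(model_a, model_b)
print(merged[(2, 0)])  # → 1.5
```

In practice the overlap would also be used to register the models against each other (e.g. to correct stage-positioning errors) before merging.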
Thereby, it is possible to generate a surface model of a structure by detecting particles, wherein the structure has a greater extent, measured in a plane perpendicular to the optical axis, than a side length fv of a field of view of an image of the particle optical system.
Figure 12 schematically illustrates an alternative embodiment for generating a surface model depending on detected particles. A digital image 412 shows one of a plurality of images from which image regions 600 are generated. The image regions 600 show portions of the structure which are located at a distance spaced apart from each other. In other words, the image regions are neither adjacent nor overlapping. The image regions 600 may each consist of between 1 and 8, between 1 and 50, between 1 and 500, between 1 and 1,000 or between 1 and 10,000 pixels. In the exemplary embodiment shown in Figure 12, the image regions 600 are pixel clusters, each of which consists of 16 pixels. For example, the first image region 500 consists of pixels 501, ... 516.
Each of the pixel clusters is an isolated pixel cluster. In other words, each point of the structure which is represented by the first image region is located at least at a distance b from each point of the structure which is represented by a further image region. One of those further image regions is the image region 600. The distance b may be a multiple of the diameter of a portion of the structure which is represented by a pixel of the pixel cluster. This diameter may be defined as the sampling distance. The distance b may be greater than 10 times, greater than 100 times or greater than 1,000 times the sampling distance. The distance b may be less than 10,000 times the sampling distance.
Accordingly, it is possible to calculate a surface model of the structure within a comparatively short time. In particular, only a small portion of the structure has to be scanned by the primary beam and/or only image data from a comparatively small number of pixels have to be processed for generating the surface model.
It is further conceivable that one or more or all pixel clusters consist of a single pixel. The pixel represents a location at which the primary beam is positioned on the structure. At this location, a focus distance of the primary beam may be varied without scanning the surface. During the varying of the focus distance, particles are detected which are generated by an interaction of the primary beam with the structure. Depending on the detector signal, it may be determined which focus distance corresponds to the object distance, i.e. when a distance between the irradiated portion of the structure and the beam waist is less than half of the depth of focus. Thereby, it is possible to generate a surface model of a structure in a very short time.
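The single-point focus sweep described above can be sketched as follows, assuming that the detector signal response peaks when the beam waist coincides with the irradiated surface portion (the description leaves the exact evaluation of the detector signal open):

```python
def focus_sweep(focus_distances, detector_signal):
    """Return the focus distance at which the detector signal is maximal.

    'detector_signal(z)' models the detected particle intensity while
    the focus distance z is varied at a fixed beam position; a response
    peaking at the true object distance is an assumption of this sketch.
    """
    return max(focus_distances, key=detector_signal)

# Hypothetical detector response peaking at an object distance of 10 mm.
signal = lambda z: 1.0 / (1.0 + (z - 10.0) ** 2)
sweep = [8.0, 9.0, 10.0, 11.0, 12.0]
print(focus_sweep(sweep, signal))  # → 10.0
```

Repeating the sweep at a sparse set of beam positions yields one Z value per position, i.e. a coarse surface model without scanning the surface.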
While the invention has been described with respect to certain exemplary embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, the exemplary embodiments of the invention set forth herein are intended to be illustrative and not limiting in any way. Various changes may be made without departing from the spirit and scope of the present invention as defined in the following claims.
In the figures:
100 = detect light rays and/or particles
101 = generate a surface model of the structure
102 = determine position and orientation of the surface model
103 = determine a measurement location
104 = determine a positioning path
105 = position the object
110 = detect light rays and/or particles
111 = generate a surface model of the structure
112 = generate a surface model of the microscope portion
113 = combine surface models to generate a combined surface model
114 = determine a distance
115 = determine a positioning path
116 = position the object
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2013145A NL2013145C2 (en) | 2010-09-29 | 2014-07-07 | Particle beam microscope and method for operating the particle beam microscope. |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102010046902 | 2010-09-29 | ||
DE102010046902.5A DE102010046902B4 (en) | 2010-09-29 | 2010-09-29 | Particle beam microscope and method for operating this |
US13/029,998 US8227752B1 (en) | 2011-02-17 | 2011-02-17 | Method of operating a scanning electron microscope |
US201113029998 | 2011-02-17 | ||
DE102011103997 | 2011-06-10 | ||
DE102011103997 | 2011-06-10 |
Publications (2)
Publication Number | Publication Date |
---|---|
NL1039086A true NL1039086A (en) | 2012-04-02 |
NL1039086C2 NL1039086C2 (en) | 2014-07-15 |
Family
ID=44994120
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL1039086A NL1039086C2 (en) | 2010-09-29 | 2011-09-29 | Particle beam microscope and method for operating the particle beam microscope. |
NL2013145A NL2013145C2 (en) | 2010-09-29 | 2014-07-07 | Particle beam microscope and method for operating the particle beam microscope. |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL2013145A NL2013145C2 (en) | 2010-09-29 | 2014-07-07 | Particle beam microscope and method for operating the particle beam microscope. |
Country Status (4)
Country | Link |
---|---|
CN (1) | CN102543640B (en) |
CZ (1) | CZ307992B6 (en) |
GB (1) | GB2484197A (en) |
NL (2) | NL1039086C2 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2835817B1 (en) * | 2013-08-09 | 2017-12-20 | Carl Zeiss Microscopy Ltd. | Method for semi-automated particle analysis using a charged particle beam |
JP7008650B2 (en) * | 2019-02-01 | 2022-01-25 | 日本電子株式会社 | Sample measurement method using a charged particle beam system and a scanning electron microscope |
US11821860B2 (en) * | 2019-10-16 | 2023-11-21 | Carl Zeiss X-Ray Microscopy Inc. | Optical three-dimensional scanning for collision avoidance in microscopy system |
JP7054711B2 (en) * | 2020-01-23 | 2022-04-14 | 日本電子株式会社 | How to adjust charged particle beam device and charged particle beam device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1061358A2 (en) * | 1999-06-15 | 2000-12-20 | Applied Materials, Inc. | Apparatus and method for reviewing defects on an object |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5594245A (en) * | 1990-10-12 | 1997-01-14 | Hitachi, Ltd. | Scanning electron microscope and method for dimension measuring by using the same |
US5481111A (en) * | 1994-01-03 | 1996-01-02 | Philips Electronics North America Corporation | Electron microscope having a goniometer controlled from the image frame of reference |
US5548694A (en) * | 1995-01-31 | 1996-08-20 | Mitsubishi Electric Information Technology Center America, Inc. | Collision avoidance system for voxel-based object representation |
DE102005026022A1 (en) * | 2005-06-03 | 2006-12-07 | Werth Messtechnik Gmbh | Coordinate measuring device and method for measuring an object with a coordinate measuring machine |
JP4426519B2 (en) * | 2005-11-11 | 2010-03-03 | 株式会社日立ハイテクノロジーズ | Optical height detection method, electron beam measuring device, and electron beam inspection device |
WO2007090537A2 (en) * | 2006-02-03 | 2007-08-16 | Carl Zeiss Nts Gmbh | Focusing and positioning auxiliary device for a particle-optical scanning microscope |
JP5075393B2 (en) * | 2006-10-30 | 2012-11-21 | 株式会社日立ハイテクノロジーズ | Scanning electron microscope |
DE102008001812B4 (en) * | 2008-05-15 | 2013-05-29 | Carl Zeiss Microscopy Gmbh | Positioning device for a particle beam device |
US7745804B1 (en) * | 2009-02-13 | 2010-06-29 | Advanced Ion Beam Technology, Inc. | Ion implantation method and application thereof |
2011
- 2011-09-28 GB GB1116720.2A patent/GB2484197A/en not_active Withdrawn
- 2011-09-29 CZ CZ2011-607A patent/CZ307992B6/en unknown
- 2011-09-29 CN CN201110418980.3A patent/CN102543640B/en active Active
- 2011-09-29 NL NL1039086A patent/NL1039086C2/en not_active IP Right Cessation

2014
- 2014-07-07 NL NL2013145A patent/NL2013145C2/en not_active IP Right Cessation
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1061358A2 (en) * | 1999-06-15 | 2000-12-20 | Applied Materials, Inc. | Apparatus and method for reviewing defects on an object |
Also Published As
Publication number | Publication date |
---|---|
NL1039086C2 (en) | 2014-07-15 |
GB201116720D0 (en) | 2011-11-09 |
GB2484197A (en) | 2012-04-04 |
CN102543640A (en) | 2012-07-04 |
NL2013145A (en) | 2014-08-07 |
CZ307992B6 (en) | 2019-10-09 |
CZ2011607A3 (en) | 2012-12-05 |
CN102543640B (en) | 2016-09-28 |
NL2013145C2 (en) | 2015-05-11 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20221001 | MM | Lapsed because of non-payment of the annual fee | |