KR102055307B1 - Apparatus for generating three-dimensional shape information of an object to be measured - Google Patents

Info

Publication number
KR102055307B1
Authority
KR
South Korea
Prior art keywords
light
hologram
wavelength
information
measured
Prior art date
Application number
KR1020180119751A
Other languages
Korean (ko)
Inventor
김병목
김지훈
성맑음
이상진
Original Assignee
주식회사 내일해
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 내일해 filed Critical 주식회사 내일해
Priority to KR1020180119751A priority Critical patent/KR102055307B1/en
Application granted granted Critical
Publication of KR102055307B1 publication Critical patent/KR102055307B1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B 30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images

Abstract

One embodiment of the present invention provides an apparatus for generating three-dimensional shape information of an object to be measured, comprising: a light source unit that emits light of a single wavelength; a collimator that collimates the single-wavelength light emitted from the light source unit; a light splitter that splits the single-wavelength light passing through the collimator into object light and reference light; an object-light objective lens that passes the object light split by the light splitter; a reference-light objective lens that passes the reference light split by the light splitter; an optical mirror that reflects the reference light passing through the reference-light objective lens; an image sensor that records the interference fringe formed when the object light reflected from the object to be measured and the reference light reflected by the optical mirror pass back through the object-light objective lens and the reference-light objective lens, respectively, and are transmitted to the light splitter; and a processor that receives and stores an image containing intensity information of an object hologram generated by converting the interference fringe in the image sensor, and generates three-dimensional shape information of the object to be measured. In this apparatus, the light source unit determines the wavelength value of the single-wavelength light using previously known step information of the object to be measured in a first direction, the measurement accuracy in the first direction, and the bit depth of the image sensor.

Description

Apparatus for generating three-dimensional shape information of an object to be measured

The present invention relates to an apparatus for generating three-dimensional shape information of an object to be measured. More specifically, the present invention relates to an apparatus for generating three-dimensional shape information of an object to be measured from an image containing intensity information of an object hologram generated by interference between reference light reflected from an optical mirror and object light reflected from, or transmitted through, the object to be measured.

A digital holography microscope refers to a microscope that acquires the shape of an object using digital holography technology.

Whereas a conventional microscope acquires the shape of an object from the reflected light returned by that object, a digital holography microscope acquires the interference light and/or diffracted light produced by the object and reconstructs the object's shape from it.

Digital holography microscopy uses a laser that generates light of a single wavelength as a light source, and splits the light generated by the laser into two beams using a light splitter. One beam (hereinafter referred to as reference light) is directed toward the image sensor, and the other beam (hereinafter referred to as object light) is reflected from the target object and directed toward the same image sensor, so that interference between the reference light and the object light occurs.

The image sensor may record the interference fringe produced by this interference phenomenon as a digital image, and the three-dimensional shape of the object to be measured can be restored from the recorded interference fringe. The interference fringe recorded by the image sensor is usually referred to as a hologram.
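The recording step can be summarized by the standard hologram-intensity relation of digital holography (a textbook identity, not quoted from this patent), where R and O denote the complex reference and object waves at the sensor plane:

```latex
I(x,y) = |R(x,y)|^2 + |O(x,y)|^2 + R^*(x,y)\,O(x,y) + R(x,y)\,O^*(x,y)
```

The cross term R*O carries the complex object wave; isolating it, for example by spatial-frequency filtering in an off-axis geometry, is what allows the three-dimensional shape to be restored numerically rather than optically.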

Conventional optical holography microscopes record the interference fringe produced by interference between the reference light and the object light on a special film. When the reference light is then irradiated onto the special film on which the interference fringe has been recorded, a virtual image of the measurement object is restored at the position where the measurement object was located.

Compared with the conventional optical holography microscope, the digital holography microscope differs in that it digitizes the interference fringes of the light through an image sensor and restores the shape of the object to be measured through electronic computation rather than optical means.

On the other hand, a conventional digital holography microscope using a single-wavelength laser light source has the problem that the minimum measurable unit length of the object is limited by the wavelength of the laser. Other conventional digital holography microscopes that use laser light sources of two or more wavelengths to compensate for this not only have high production costs, but also cannot obtain the three-dimensional shape of the object in real time.
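For context on why two wavelengths extend the measurable range (standard two-wavelength holography background, not stated in this document): combining wavelengths λ₁ and λ₂ yields a synthetic wavelength much longer than either one,

```latex
\Lambda = \frac{\lambda_1 \lambda_2}{|\lambda_1 - \lambda_2|}
```

so the unambiguous axial measurement range grows from a fraction of λ to a fraction of Λ, at the cost of the added optics and processing the following paragraphs describe.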

In addition, the conventional digital holography microscopes described above generate a computer-generated hologram (CGH) to restore the shape of the object to be measured, display it on a spatial light modulator (SLM), and obtain a 3D hologram image of the object by illuminating the displayed shape with the reference light. However, this method not only requires an expensive spatial light modulator (SLM), but also merely digitizes the special film of the optical holography microscope described above, so its technical limitations are clear.

In order to solve the problems of such conventional digital holography microscopes, Korean Patent Publication No. 10-2016-0029606 (hereinafter referred to as the "disclosed prior art"), for example, proposes a digital holography microscope and a digital holographic image generating method. The disclosed prior art is briefly described below.

FIG. 1 is a block diagram showing in detail a two-wavelength digital holographic microscope apparatus according to the disclosed prior art.

Referring to FIG. 1, the conventional two-wavelength digital holographic microscope apparatus includes a mixed light source unit 10, a wavelength division unit 20, an interference fringe acquisition unit 30, an objective part 40, an image sensor unit 50, an image storage unit 60, a control unit 70, and an object shape restoration unit 80.

The mixed light source unit 10 includes a mixed light source light emitting unit 11 and a light source unit lens 12. The mixed light source light emitting unit 11 emits mixed light whose wavelengths are distributed over several bands rather than a single band. The light source unit lens 12 optically adjusts the mixed light generated by the mixed light source light emitting unit 11 and directs it into the wavelength division unit 20.

The wavelength division unit 20 includes a first light splitter 21, a first filter plate 22, a second filter plate 23, and a first reflector 24. The first light splitter 21 receives the mixed light incident from the mixed light source unit 10 and splits it into two beams traveling in different directions. The first filter plate 22 receives one of the beams split by the first light splitter 21 and obtains a first light beam having a predetermined single wavelength: the light entering the first filter plate 22 is filtered while passing through it, yielding a first light beam whose single wavelength is determined by the characteristics of the first filter plate 22. The second filter plate 23 receives the other beam split by the first light splitter 21 and, in the same manner as the first filter plate 22, obtains a second light beam having a single wavelength different from that of the first light beam. The second light beam is then sent to the interference fringe acquisition unit 30. The first reflector 24 reflects the first light beam obtained from the first filter plate 22 toward the interference fringe acquisition unit 30.

The interference fringe acquisition unit 30 includes a second light splitter 31, a third light splitter 32, a second reflector 33, a third filter plate 34, and a third reflector 35. The second light splitter 31 receives the first light beam from the wavelength division unit 20 and splits it into a first object light and a first reference light, directing the split beams in different directions. The third light splitter 32 likewise receives the second light beam and splits it into a second object light and a second reference light. The second reflector 33 receives the first reference light and reflects it back toward the second light splitter 31. The third filter plate 34 passes the first reference light split by the second light splitter 31 on to the second reflector 33, and passes the reflected first reflection reference light back to the second light splitter 31. In addition, when the second object light reaches the second light splitter 31 and part of it is split toward the second reflector 33, the third filter plate 34 blocks the second object light from reaching the second reflector 33. To this end, the third filter plate 34 has the same light-transmission characteristics as the first filter plate 22. The third reflector 35 receives the second reference light and reflects it back toward the third light splitter 32. The angles of the second reflector 33 and the third reflector 35 can be adjusted under the control of the control unit 70, so that an off-axis hologram can be realized.

Meanwhile, the first object light and the second object light obtained as described above are converted into a first reflecting object light and a second reflecting object light, respectively, through the following process and sent to the image sensor unit 50. The second light splitter 31 directs the split first object light onto the object to be measured, which is mounted on the objective part 40, and the second object light split and sent from the third light splitter 32 is likewise incident on the object to be measured. The light produced when the first object light reflects from the object to be measured is called the first reflecting object light, and the light produced when the second object light reflects from the object is called the second reflecting object light. The second light splitter 31 receives the first and second reflecting object lights reflected in this way and sends them to the third light splitter 32, which in turn transmits them to the image sensor unit 50.

In addition, the first reflection reference light and the second reflection reference light obtained as described above are sent to the image sensor unit 50 through the following process. Specifically, the second light splitter 31 receives the first reflection reference light reflected from the second reflector 33 and sends it to the third light splitter 32. The third light splitter 32 receives the first reflection reference light sent from the second light splitter 31 together with the second reflection reference light reflected from the third reflector 35, and sends them on to the image sensor unit 50. Accordingly, the first reflecting object light, the first reflection reference light, the second reflecting object light, and the second reflection reference light are all sent from the third light splitter 32 in the same direction toward the image sensor unit 50, where they interfere with one another and form an interference fringe.

Meanwhile, to form an off-axis system in which light beams of different wavelengths form different interference fringes, the angles of the second reflector 33 and the third reflector 35 can be adjusted in multiple directions under the control of the control unit 70. That is, as the angles of the second reflector 33 and the third reflector 35 are made to differ from each other, the first reflection reference light reflected from the second reflector 33 and the second reflection reference light reflected from the third reflector 35 become separated in direction; when these reference lights combine with the first and second reflecting object lights reaching the image sensor unit 50, two clearly distinct interference fringes are formed.

The objective part 40 includes an object holder 41 and an objective lens 42. The object holder 41 fixes the object to be measured in the holder for measurement, and the objective lens 42 optically adjusts the first and second object lights incident on the object to be measured.

The image sensor unit 50 projects the interference fringes acquired by the interference fringe acquisition unit 30 onto a digital image sensor, measures the projected interference fringes using the digital image sensor, and converts the measured values into discrete signals. The recording of the interference fringe is usually called a hologram. Various image sensors, such as a CCD, may be used as the digital image sensor.

The image storage unit 60 stores the interference fringe information converted into discrete signals by the image sensor unit 50 in various storage media such as a memory or a disk device.

The control unit 70 controls the second reflector 33 and the third reflector 35 to adjust their positions and angles so as to implement the off-axis system described above and obtain the interference fringes; controls the objective part 40 to adjust the objective lens 42, thereby adjusting the first and second object lights incident on the object to be measured; controls the image sensor unit 50 to measure the interference fringes and convert the information into discrete signals; and controls the image storage unit 60 to store the interference fringe information converted into discrete signals.

The object shape restoration unit 80 includes a phase information acquisition unit 81, a thickness information acquisition unit 82, and a shape restoration unit 83. The phase information acquisition unit 81 obtains the phase information of the interference fringe for the first light beam and the phase information of the interference fringe for the second light beam using the interference fringe information; the thickness information acquisition unit 82 acquires thickness information of the measurement target object using the phase information; and the shape restoration unit 83 restores the real-time three-dimensional shape of the measurement target object using the thickness information. In this case, the thickness information of the object to be measured includes information on the difference between the paths of the object light and the reference light; it is because of this optical path difference that the interference fringe is formed when the object light and the reference light overlap.
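As a concrete illustration of the phase-to-thickness step above, the sketch below uses the standard reflection-geometry relation from digital holography; the formula and the function name are illustrative textbook material, not given explicitly in this patent.

```python
import math

def height_from_phase(delta_phi_rad: float, wavelength_nm: float) -> float:
    """Convert an unwrapped phase difference to a surface height step.

    In reflection geometry the object light traverses the step twice,
    so a height h produces a phase shift of 4*pi*h/lambda. (Illustrative
    textbook relation; the patent does not state this formula.)
    """
    return wavelength_nm * delta_phi_rad / (4 * math.pi)

# A full 2*pi phase wrap corresponds to a half-wavelength height step.
step = height_from_phase(2 * math.pi, 532.0)  # green laser, 532 nm -> ~266 nm
```

This also makes the single-wavelength limitation of the prior art concrete: any step taller than λ/2 wraps the phase and becomes ambiguous.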

According to the disclosed prior art described above, it is possible to improve the measurement resolution and to secure real-time image acquisition, but the following problems still remain.

First, since the disclosed prior art uses a mixed light source whose wavelengths are distributed over several bands, the wavelength division unit 20 must use the first filter plate 22, the second filter plate 23, and the first reflector 24 in order to split the light into first and second light beams of different wavelengths and thereby obtain at least two single wavelengths.

In addition, the third light splitter 32 for splitting the second light beam, the third reflector 35 for reflecting the second light beam, and the third filter plate 34 for blocking the second light beam from reaching the second reflector 33 must be used as well.

Therefore, the structure of the microscope becomes complicated, which brings various problems such as increased manufacturing cost and increased design complexity. Accordingly, a new method is required that solves the above problems while using a single-wavelength light source.

Korean Patent Publication No. 10-2016-0029606
Korean Patent Publication No. 10-2010-0095302
Korean Patent Publication No. 10-2012-0014355
Korean Patent No. 10-1139178
Korean Patent No. 10-1441245
U.S. Patent No. 7,649,160

The present invention aims to solve the problems of the prior art described above, and to accurately generate the three-dimensional shape information of an object to be measured by obtaining only one hologram.

In particular, the present invention aims to generate three-dimensional shape information of the measurement target object with improved accuracy by generating information about the reference light and curvature aberration information of the object-light objective lens from a single hologram, and correcting the obtained object hologram accordingly.

The present invention also seeks to solve the problems of complex optical device structures and the resulting significant costs.

Furthermore, the present invention seeks to detect defects in ultra-fine structures such as TFTs and semiconductors with high probability by accurately obtaining the three-dimensional shapes of such structures.

One embodiment of the present invention provides an apparatus for generating three-dimensional shape information of an object to be measured, comprising: a light source unit that emits light of a single wavelength; a collimator that collimates the single-wavelength light emitted from the light source unit; a light splitter that splits the single-wavelength light passing through the collimator into object light and reference light; an object-light objective lens that passes the object light split by the light splitter; a reference-light objective lens that passes the reference light split by the light splitter; an optical mirror that reflects the reference light passing through the reference-light objective lens; an image sensor that records the interference fringe formed when the object light reflected from the object to be measured and the reference light reflected by the optical mirror pass back through the object-light objective lens and the reference-light objective lens, respectively, and are transmitted to the light splitter; and a processor that receives and stores an image containing intensity information of an object hologram generated by converting the interference fringe in the image sensor, and generates three-dimensional shape information of the object to be measured. In this apparatus, the light source unit determines the wavelength value of the single-wavelength light using previously known step information of the object to be measured in a first direction, the measurement accuracy in the first direction, and the bit depth of the image sensor.

Another embodiment of the present invention provides an apparatus for generating three-dimensional shape information of an object to be measured, comprising: a light source unit that emits light of a single wavelength; a collimator that collimates the single-wavelength light emitted from the light source unit; a light splitter that splits the single-wavelength light passing through the collimator into object light and reference light; an object-light objective lens that passes object transmitted light, which contains information of the object to be measured after the object light split by the light splitter passes through the object; a second optical mirror that reflects the object transmitted light passing through the object-light objective lens; a reference-light objective lens that passes the reference light split by the light splitter; a first optical mirror that reflects the reference light passing through the reference-light objective lens; a second light splitter through which the reference light reflected by the first optical mirror and the object transmitted light reflected by the second optical mirror pass; an image sensor that records the interference fringe formed by the reference light and the object transmitted light transmitted through the second light splitter; and a processor that receives and stores an image containing intensity information of an object hologram generated by converting the interference fringe in the image sensor, and generates three-dimensional shape information of the object to be measured. In this apparatus, the light source unit determines the wavelength value of the single-wavelength light using previously known step information of the object to be measured in a first direction, the measurement accuracy in the first direction, and the bit depth of the image sensor.

In one embodiment of the present invention, the wavelength value of the single-wavelength light may be determined as any one selected from a candidate group that satisfies both a first range, covering at least the maximum height value in the step information of the object to be measured, and a second range determined by the arbitrarily selected measurement accuracy and the bit depth of the image sensor.

In one embodiment of the present invention, the processor extracts real components corresponding to a real image from at least one frequency component included in the image; generates, based on the real components, an actual hologram including a correction light having a pair relation with the reference light and mounting information of the object to be measured; generates, based on the correction light, an intermediate hologram in which the information of the reference light is removed from the actual hologram; generates curvature aberration correction information from the intermediate hologram; generates, based on the curvature aberration correction information, a correction hologram in which the error due to curvature aberration in the intermediate hologram is improved; and generates the three-dimensional shape information of the object to be measured from the correction hologram.
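The first step, extracting the real-image frequency components, can be sketched with a standard off-axis sideband filter. This is an illustrative numpy sketch, not the patent's implementation; the sideband location (`center`) and `radius` are assumed inputs.

```python
import numpy as np

def extract_real_component(hologram, center, radius):
    """Isolate the real-image (+1 order) sideband of an off-axis hologram.

    Illustrative frequency-filtering step: transform to the spatial-
    frequency domain, keep a disk around the real-image sideband,
    re-center it to remove the carrier, and transform back.
    """
    F = np.fft.fftshift(np.fft.fft2(hologram))
    rows, cols = np.indices(F.shape)
    mask = (rows - center[0])**2 + (cols - center[1])**2 <= radius**2
    sideband = np.where(mask, F, 0)
    # Shift the sideband to the spectrum center to remove the carrier tilt.
    sideband = np.roll(sideband,
                       shift=(F.shape[0]//2 - center[0],
                              F.shape[1]//2 - center[1]),
                       axis=(0, 1))
    return np.fft.ifft2(np.fft.ifftshift(sideband))  # complex object wave

# Synthetic off-axis hologram: tilted plane-wave reference over a flat object.
N = 128
y, x = np.indices((N, N))
carrier = np.exp(1j * 2 * np.pi * 8 * x / N)  # tilt -> sideband 8 bins right of DC
holo = np.abs(1 + carrier)**2                  # recorded intensity
obj = extract_real_component(holo, center=(N//2, N//2 + 8), radius=4)
```

For the flat synthetic object the recovered complex field has uniform amplitude, as expected when only the R*O cross term survives the filter.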

In one embodiment of the present invention, the processor may generate three-dimensional shape information of the measurement object from the intermediate hologram, determine at least one parameter of the curvature aberration correction information based on the three-dimensional shape information generated from the intermediate hologram, and generate the curvature aberration correction information based on that parameter.
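One common way to parameterize such a correction, shown here only as an illustrative stand-in for the patent's curvature-aberration step, is to model the objective's curvature as a parabolic background phase and fit its coefficients by least squares; the quadratic model and parameter names are assumptions.

```python
import numpy as np

def curvature_correction(phase):
    """Fit a quadratic (spherical-like) phase background and return its conjugate.

    Least-squares fit of phi ~ a*(x^2 + y^2) + b*x + c*y + d over the field;
    multiplying the hologram by the returned factor flattens that background.
    """
    y, x = np.indices(phase.shape)
    A = np.stack([(x**2 + y**2).ravel(), x.ravel(), y.ravel(),
                  np.ones(phase.size)], axis=1)
    coef, *_ = np.linalg.lstsq(A, phase.ravel(), rcond=None)
    fitted = (A @ coef).reshape(phase.shape)
    return np.exp(-1j * fitted)

# Synthetic parabolic background phase, as an objective lens might introduce.
N = 64
y, x = np.indices((N, N))
background = 1e-3 * ((x - N/2)**2 + (y - N/2)**2)
corr = curvature_correction(background)
flattened = np.angle(np.exp(1j * background) * corr)  # residual phase ~ 0
```

Because the synthetic background lies exactly in the model's span, the residual phase after correction is numerically zero, which is the behavior the correction hologram step relies on.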

Other aspects, features, and advantages other than those described above will become apparent from the following detailed description, claims, and drawings.

According to an embodiment of the present invention as described above, it is possible to accurately generate the three-dimensional shape information of the object to be measured by obtaining only one hologram.

In particular, by generating information about the reference light and curvature aberration information of the object-light objective lens from one hologram and correcting the obtained object hologram accordingly, three-dimensional shape information of the measurement object can be generated with improved accuracy.

It can also resolve the problems of complex optical device structures and the resulting significant costs.

Furthermore, by accurately acquiring three-dimensional shapes of ultra-fine structures such as TFTs and semiconductors, defects of these structures can be detected with high probability.

Of course, the scope of the present invention is not limited by these effects.

FIG. 1 is a block diagram illustrating in detail a two-wavelength digital holographic microscope apparatus according to the disclosed prior art.
FIG. 2A is a block diagram showing a schematic configuration of a holographic restoration apparatus according to the first embodiment of the present invention.
FIG. 2B is a block diagram illustrating a schematic configuration of a holographic restoration apparatus according to a second embodiment of the present invention.
FIGS. 2C and 2D are block diagrams schematically showing other examples of the first and second embodiments of the present invention.
FIGS. 3 and 4 are diagrams for explaining a method of determining the conditions of the light provided from the light source unit.
FIGS. 5A and 5B are diagrams for describing the external shape of an exemplary measurement object.
FIG. 6 is an example of an image of a portion of an object to be measured.
FIG. 7 is a diagram illustrating frequency components of the image of the portion of the measurement object illustrated in FIG. 6.
FIG. 8 is a diagram for describing a method of extracting frequency components corresponding to the real image from the frequency components illustrated in FIG. 7.
FIG. 9A is a diagram showing the intensity of the digital reference light.
FIG. 9B is a diagram illustrating the phase of the reference light.
FIG. 9C is a diagram showing the intensity of the correction light.
FIG. 9D is a diagram showing the phase of the correction light.
FIG. 10 is a diagram illustrating an exemplary actual hologram.
FIGS. 11 and 12 are diagrams for describing a method of determining, by the processor, a curvature aberration correction term from an intermediate hologram, according to an exemplary embodiment.
FIG. 13 is a diagram illustrating an example of a three-dimensional shape of a measurement target object generated from a hologram.
FIG. 14 is a flowchart illustrating a method of generating three-dimensional shape information of a measurement target object, performed by a holographic restoration apparatus according to an embodiment of the present invention.
FIGS. 15 and 16 are flowcharts of methods of removing noise in a holographic restoration apparatus according to embodiments of the present invention.

DETAILED DESCRIPTION. Various embodiments of the present disclosure are described below with reference to the accompanying drawings. The various embodiments of the present disclosure may be variously modified, and specific embodiments are illustrated in the drawings and described in detail. However, this is not intended to limit the various embodiments of the present disclosure to the specific embodiments; all modifications and/or equivalents and substitutes included in the spirit and scope of the various embodiments of the present disclosure should be understood to be included. In the description of the drawings, similar reference numerals are used for similar elements.

Expressions such as "comprises" or "may include" as used in various embodiments of the present disclosure indicate the existence of the disclosed function, operation, or component, and do not restrict additional functions, operations, or components. In addition, in various embodiments of the present disclosure, the terms "comprise" or "have" are intended to indicate the presence of a feature, number, step, operation, component, part, or combination thereof described in the specification, and should be understood not to exclude in advance the possibility of the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

In various embodiments of the present disclosure, the expression “or” includes any and all combinations of words listed together. For example, "A or B" may include A, may include B, or may include both A and B.

The expressions "first," "second," and the like used in various embodiments of the present disclosure may modify various elements of the various embodiments but do not limit those elements. For example, the above expressions do not limit the order and/or importance of the corresponding elements; they may be used to distinguish one component from another. For example, a first user device and a second user device are both user devices and represent different user devices. Without departing from the scope of the various embodiments of the present disclosure, a first component may be named a second component, and similarly, a second component may be named a first component.

When a component is said to be "connected" or "coupled" to another component, the component may be directly connected or coupled to the other component, or new components may exist between them. On the other hand, when a component is referred to as being "directly connected" or "directly coupled" to another component, it should be understood that no new component exists between the component and the other component.

The terms used in various embodiments of the present disclosure are merely used to describe specific embodiments, and are not intended to limit the various embodiments of the present disclosure. Singular expressions include plural expressions unless the context clearly indicates otherwise.

Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which various embodiments of the present disclosure belong.

Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meanings in the context of the related art and, unless expressly defined in the various embodiments of the present disclosure, are not to be interpreted in an ideal or overly formal sense.

FIG. 2A is a block diagram showing a schematic configuration of a holographic restoration apparatus 300A according to the first embodiment of the present invention.

In the present invention, the "holographic restoration apparatus" may refer to a device that acquires a hologram of an object to be measured (hereinafter referred to as an "object hologram") and analyzes and/or displays the obtained object hologram.

For example, the holographic restoration apparatus 300A may be a device arranged on a semiconductor manufacturing line to obtain an object hologram of a semiconductor being produced, and to determine the integrity of the semiconductor from the obtained object hologram. However, this is merely exemplary and the spirit of the present invention is not limited thereto.

Meanwhile, in the present invention, the "hologram" is a hologram that can be generated from an image acquired by the holographic restoration apparatus 300A, and may mean the hologram before the various processing steps performed by the holographic restoration apparatus 300A. A detailed description thereof will be given later.

Referring to FIG. 2A, the holographic restoration apparatus 300A according to the first exemplary embodiment of the present invention includes: a light source unit 310 that emits single-wavelength light; a collimator 320 for collimating the single-wavelength light emitted from the light source unit 310; a light splitter 330 that splits the single-wavelength light passing through the collimator 320 into object light O and reference light R; an object light objective lens 340 that passes the object light O split by the light splitter 330; a reference light objective lens 360 that passes the reference light R split by the light splitter 330; an optical mirror 370 that reflects the reference light R passing through the reference light objective lens 360; an image sensor 380 that records an image of the interference fringe formed when the object light O reflected from the surface of the object 350 to be measured and the reference light R reflected by the optical mirror 370 pass back through the object light objective lens 340 and the reference light objective lens 360, respectively, and are delivered to the light splitter 330; and a processor 390 that processes the image acquired by the image sensor 380.

In this case, the processor 390 may generate three-dimensional information of the object to be measured 350 from the image acquired by the image sensor 380. The operation of the processor 390 will be described in detail later.

2B is a block diagram showing a schematic configuration of a holographic restoration apparatus 300B according to a second embodiment of the present invention.

Referring to FIG. 2B, the holographic restoration apparatus 300B according to the second exemplary embodiment of the present invention includes: a light source unit 310 for emitting single-wavelength light; a collimator 320 for collimating the single-wavelength light emitted from the light source unit 310; a light splitter 330 for splitting the single-wavelength light passing through the collimator 320 into object light O and reference light R; an object light objective lens 340 that passes the object transmitted light T, that is, the object light O split by the light splitter 330 after it has passed through the object 350 to be measured and therefore includes information on the object 350; a second optical mirror 372 that reflects the object transmitted light T passing through the object light objective lens 340; a reference light objective lens 360 that passes the reference light R split by the light splitter 330; a first optical mirror 370 that reflects the reference light R passing through the reference light objective lens 360; a second light splitter 332 to which the reference light R reflected by the first optical mirror 370 and the object transmitted light T reflected by the second optical mirror 372 are delivered; an image sensor 380 for recording the image formed by the reference light R and the object transmitted light T delivered to the second light splitter 332; and a processor 390 for processing the image acquired by the image sensor 380.

Of course, in this second embodiment as well, the processor 390 may generate three-dimensional information of the object to be measured 350 from the image acquired by the image sensor 380. The operation of the processor 390 will be described in detail later.

2C and 2D are block diagrams schematically showing variants of the first and second embodiments of the present invention.

Referring to FIGS. 2C and 2D, the holographic restoration apparatuses 300A and 300B according to these variants of the first and second embodiments may further include a light transmitter 311 that transmits the single-wavelength light provided from the light source unit 310 to the inspection unit 100.

Here, the inspection unit 100 refers to the optical configuration of the holographic restoration apparatuses 300A and 300B excluding the light source unit 310 and the processor 390. Specifically, the inspection unit 100 according to the first embodiment includes the collimator 320, the light splitter 330, the object light objective lens 340, the reference light objective lens 360, the optical mirror 370, and the image sensor 380, while the inspection unit 100 according to the second embodiment includes the collimator 320, the light splitter 330, the object light objective lens 340, the second optical mirror 372, the reference light objective lens 360, the first optical mirror 370, the second light splitter 332, and the image sensor 380. In this case, the inspection unit 100 may omit the collimator 320 owing to the function of the light transmitter 311 described later. The inspection unit 100 is an optical system that forms the paths of the object light O and the reference light R for generating the object hologram of the object to be measured, and the system is built with the distances between the components set in advance in order to generate accurate three-dimensional shape information.

The light transmitter 311 is configured to increase the positional freedom of the inspection unit 100, and may perform the function of delivering the single-wavelength light provided from the light source unit 310 to the inspection unit 100 spaced apart from the light source unit 310. The light transmitter 311 is a flexible light transmission medium and may be, for example, an optical fiber. When an optical fiber is used as the light transmitter 311, the optical fiber 311 transmits light of only a specific wavelength and thus may remove noise from the input light transmitted from the light source unit 310. In addition, when emitting the light, the optical fiber can expand the beam size before providing it to the inspection unit 100. In other words, the optical fiber 311 may replace the function of the collimator 320, so that the inspection unit 100 may omit the collimator 320 as necessary.

The holographic restoration apparatuses 300A and 300B of FIGS. 2C and 2D described above may be disposed in an in-line deposition system, such as a semiconductor process, to perform inspection in real time without having to carry the inspection object out of the system after deposition. When provided in such an in-line deposition system, only the compact inspection unit 100 is disposed inside the in-line deposition system in order to minimize interference with the deposition process, and the bulky light source unit 310 may be disposed outside. The light transmitter 311 may be installed between the light source unit 310 and the inspection unit 100, which are physically spaced apart from each other, and may transmit the light provided from the light source unit 310 to the inspection unit 100.

As can be seen from FIGS. 2A to 2D, the holographic restoration apparatus 300A according to the first embodiment of the present invention and the holographic restoration apparatus 300B according to the second embodiment of the present invention have substantially the same configuration, except for whether the object light O is reflected from the object 350 to be measured (the embodiments of FIGS. 2A and 2C) or transmitted through the object 350 (the embodiments of FIGS. 2B and 2D), and the resulting use of additional elements (e.g., the additional use of the second optical mirror 372 and the second light splitter 332 in the embodiment of FIG. 2B and the corresponding rearrangement of some components).

In particular, it should be noted that they share the same features in that the image is acquired by the image sensor 380 and the processor 390 generates the digital reference light R from the acquired image.

Hereinafter, the holographic restoration apparatuses 300A and 300B according to the first and second embodiments of the present invention will be collectively referred to as a holographic restoration apparatus 300.

Meanwhile, the processor 390 of the holographic restoration apparatus 300 according to the exemplary embodiment of the present invention may include any kind of device capable of processing data. For example, the processor 390 may refer to a data processing apparatus embedded in hardware having a physically structured circuit for performing a function represented by code or instructions included in a program. The processor 390 may be included in the holographic restoration apparatus 300 and disposed inside the transfer chambers 201 to 205, or may be provided as a separate component and disposed outside the transfer chambers 201 to 205.

Examples of the data processing device embedded in hardware include a microprocessor, a central processing unit (CPU), a processor core, a multiprocessor, an application-specific integrated circuit (ASIC), and a field programmable gate array (FPGA), but the scope of the present invention is not limited thereto.

In addition, the image sensor 380 according to an embodiment of the present invention may be implemented with at least one image sensor such as a charge coupled device (CCD), a complementary metal-oxide semiconductor (CMOS), or the like.

3 and 4 are diagrams for describing a method of determining the condition of the light L provided from the light source unit 310.

Referring to FIGS. 3 and 4, the light source unit 310 may determine the wavelength of the single-wavelength light using step information, known in advance, of the measurement target object 350 in a first direction, the measurement accuracy in the first direction, and the bit depth of the image sensor 380.

The holographic restoration apparatus 300 generates three-dimensional shape information of the measurement target object 350 formed of the structures M1 and M2 having a height with respect to the first direction. Here, the first direction may be a direction perpendicular to one surface of a substrate on which the structures M1 and M2 are disposed. Alternatively, the first direction may be a direction in which the object light O is irradiated toward the measurement object 350 through the object light objective lens 340.

The light source unit 310 may be any kind of source device capable of generating light, and may use, for example, a laser capable of irradiating light of a specific wavelength band. The light source unit 310 may use a laser having good coherence to form an interference fringe of the object light O and the reference light R. The light source unit 310 may generate light having the wavelength λ required by the holographic restoration apparatus 300. In this case, the wavelength λ may be determined by the step information of the measurement target object 350.

The holographic restoration apparatus 300 generates an object hologram using the interference between the object light O reflected from the surface of the object 350 to be measured and the reference light R, and a longer coherence length allows more accurate measurement. In addition, the wavelength of the light must be longer than the height of the measurement object 350 to enable accurate measurement. The holographic restoration apparatus 300 extracts the phase information of the light from an image including the recorded interference fringe.

At this time, the phase information is expressed in radians. When the wavelength of the light becomes shorter than the height of the object 350 to be measured, the phase of the object light O reflected from the surface of the object 350 changes by more than one period, and accurate measurement becomes difficult because it is impossible to tell whether the phase lies within one period or beyond it. In other words, when complex amplitudes of light corresponding to phase information π/2 and 5π/2 are recorded in the image sensor 380, they have the same complex amplitude and are therefore indistinguishable. Therefore, the wavelength of the light should be longer than the height of the object 350 to be measured.
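The 2π ambiguity described above can be checked numerically. The following sketch (illustrative only, not part of the apparatus) shows that phases of π/2 and 5π/2 yield the same recorded complex amplitude:

```python
import numpy as np

# The sensor records complex amplitude, so phases differing by a full
# period of 2*pi are indistinguishable:
a1 = np.exp(1j * np.pi / 2)       # phase pi/2
a2 = np.exp(1j * 5 * np.pi / 2)   # phase 5*pi/2 = pi/2 + 2*pi

print(np.isclose(a1, a2))   # True: identical complex amplitudes
print(np.angle(a2))         # ~pi/2, the wrapped phase, not 5*pi/2
```

This is why `np.angle` (and any phase extracted from an interferogram) can only return values within one 2π period.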

Meanwhile, the wavelength value of the single-wavelength light may be determined within a first range equal to or greater than the maximum height value in the step information of the measurement target object. As described above, the wavelength of the light used for the measurement should be longer than the height of the object 350 to be measured; as shown in FIG. 3, when the measurement object 350 includes a first structure M1 having a first height t1 and a second structure M2 having a second height t2 higher than the first height t1, the wavelength must be longer than the second height t2. For example, when the steps of the object to be measured are 400 nm to 600 nm, the wavelength value of the single-wavelength light may be determined in the range of 600 nm or more. Meanwhile, in one embodiment, the light source unit 310 may be selected in the wavelength band of the visible light region (400 nm to 700 nm). In this case, the first range C1 of the wavelength value of the single-wavelength light may be 600 nm to 700 nm.
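This selection rule can be sketched as the intersection of the "wavelength ≥ maximum step height" constraint with the light source's band. The helper `first_range` is hypothetical, and the 400–700 nm band reflects the visible-light example in the text:

```python
# Illustrative sketch; `first_range` is a hypothetical helper, and the
# 400-700 nm band reflects the visible-light example in the text.
def first_range(step_heights_nm, band=(400.0, 700.0)):
    """Wavelengths at least as long as the tallest step, within the band."""
    lo = max(max(step_heights_nm), band[0])
    if lo > band[1]:
        return None  # no wavelength in the band satisfies the constraint
    return (lo, band[1])

# Steps of 400 nm and 600 nm give the first range C1 = 600-700 nm:
print(first_range([400.0, 600.0]))  # (600.0, 700.0)
```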

The light source unit 310 may determine the wavelength value of the single-wavelength light using the measurement accuracy in the first direction and the bit depth of the image sensor. Here, the measurement accuracy refers to the accuracy with which the step of the measurement target object 350 is measured, and may be the resolution of the holographic restoration apparatus 300 in the first direction. The measurement accuracy is correlated with the wavelength λ of the light generated by the light source unit 310 and the bit depth n of the image sensor 380. Specifically, the measurement accuracy can be proportional to the wavelength λ of the light used for the measurement divided by 2 to the power of n. This relation can be expressed by Equation 1 below.

[Equation 1]

measurement accuracy ∝ λ / 2^n

At this time, the wavelength value of the single-wavelength light may be determined by selecting any one from a candidate group of image sensor bit depths that satisfy both the first range C1, equal to or greater than the maximum height value in the step information of the measurement target object 350, and a second range C2 of arbitrarily selected measurement accuracy. As shown in FIG. 4, with the x-axis representing wavelength and the y-axis representing measurement accuracy, the measurement accuracy according to wavelength may be displayed as a linear-function graph for each bit depth of the image sensor. The bit depth candidate group of the image sensor may be formed of the bit depths whose graphs pass through the area where the first range C1 and the second range C2 overlap, as shown by the shaded area in the drawing.

For example, when the measurement accuracy is set in the range of 1 nm to 2 nm, the bit depths n of the image sensor whose graphs pass through the first range C1 and the second range C2 are 10 bits and 12 bits. When the bit depth of the image sensor is 10 bits (n = 10), the wavelength value of the single-wavelength light can be selected from the range of 600 nm to 700 nm, and when the bit depth of the image sensor is 12 bits (n = 12), the wavelength value of the single-wavelength light can be selected from the range of 650 nm to 700 nm.
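The candidate-group selection can be sketched as follows. The exact proportionality constant in Equation 1 is not given in the text, so the form accuracy = λ / 2^n is assumed here purely for illustration; with a different constant the qualifying bit depths would differ from the 10-bit/12-bit example above:

```python
# Sketch of selecting the bit-depth candidate group. Equation 1 is assumed
# to take the form accuracy = wavelength / 2**bit_depth; the constant of
# proportionality is an illustrative assumption.
def accuracy_nm(wavelength_nm, bit_depth):
    return wavelength_nm / 2 ** bit_depth

def candidate_bit_depths(c1, c2, depths=range(8, 15)):
    """Bit depths whose accuracy line crosses the C1 x C2 overlap region."""
    wl_lo, wl_hi = c1      # first range C1: allowed wavelengths (nm)
    acc_lo, acc_hi = c2    # second range C2: target accuracy (nm)
    out = []
    for n in depths:
        # accuracy is monotone in wavelength, so checking the two
        # interval endpoints suffices
        if accuracy_nm(wl_lo, n) <= acc_hi and accuracy_nm(wl_hi, n) >= acc_lo:
            out.append(n)
    return out

print(candidate_bit_depths((600.0, 700.0), (1.0, 2.0)))  # [9] under this assumed constant
```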

5A and 5B are diagrams for describing the appearance of an exemplary object to be measured 350. When the measurement target object 350 is a substrate M, deposition materials may be formed on the substrate M along a mask pattern. Since the thin film pattern actually deposited on the substrate M has a rather complicated shape, the following description uses the simplified pattern shown in FIGS. 5A and 5B as an example for convenience of description.

As illustrated in FIGS. 5A and 5B, the measurement target object 350 may include rectangular parallelepiped structures 51A to 51I disposed on one surface at predetermined intervals. In other words, the measurement object 350 may include rectangular parallelepiped structures 51A to 51I protruding in the Z direction on a plane parallel to the X-Y plane.

Hereinafter, it is assumed that the holographic restoration apparatus 300 irradiates the object light O in a direction perpendicular to the plane on which the rectangular parallelepiped structures 51A to 51I of the measurement target object 350 are disposed, and thereby obtains the image of the measurement target object 350.

First, the image sensor 380 according to an exemplary embodiment may acquire an image of the measurement target object 350.

In the present invention, the 'image' of the object 350 to be measured includes intensity information at each position of the object hologram U0(x, y, 0) of the object 350 (that is, |U0(x, y, 0)|²), and may be expressed as Equation 2 below.

[Equation 2]

|U0(x, y, 0)|² = |O(x, y)|² + |R(x, y)|² + O(x, y)R*(x, y) + O*(x, y)R(x, y)

Here, the object hologram U0(x, y, 0) represents the phase information at each (x, y) point of the object to be measured; x and y are coordinates in the space where the object to be measured is placed, defining a plane perpendicular to the object light O; O(x, y) and R(x, y) represent the object light O and the reference light R, respectively; and O*(x, y) and R*(x, y) represent the complex conjugates of the object light O and the reference light R, respectively.
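Equation 2 is simply the squared magnitude of the superposed object and reference waves. The following sketch, using random synthetic fields (an illustrative assumption, not the apparatus's actual waves), verifies the four-term expansion numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)
# Synthetic complex object and reference waves, for illustration only
O = rng.standard_normal(shape) * np.exp(1j * rng.uniform(0, 2 * np.pi, shape))
R = np.exp(1j * rng.uniform(0, 2 * np.pi, shape))   # unit-amplitude reference

recorded = np.abs(O + R) ** 2                       # what the image sensor records
expanded = (np.abs(O) ** 2 + np.abs(R) ** 2
            + O * np.conj(R) + np.conj(O) * R)      # the four terms of Equation 2

print(np.allclose(recorded, expanded))  # True: the expansion matches
```

The cross terms O·R* and O*·R are the ones that carry the object phase; the later processing isolates one of them in frequency space.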

For example, the image sensor 380 may acquire an image as shown in FIG. 6 for a portion of the measurement object 350 shown in FIGS. 5A and 5B (e.g., a portion including 51A and 51B).

Since the image acquired by the image sensor 380 includes the intensity information at each position of the object hologram U0(x, y, 0) as described above, it may differ from an ordinary image of the object to be measured 350 (that is, one photographed only with the object light O).

Referring to Equation 2, the object hologram U0(x, y, 0) is generated by the interference of the object light O, which includes the phase information of the object 350 to be measured at each point, and the reference light R, which does not include the phase information of the object to be measured.

In addition, the object hologram U0(x, y, 0) may further include, in addition to the phase information (that is, the height information of the object) at each point (that is, each (x, y) point) of the object 350 to be measured, errors due to the object light objective lens 340 and noise (e.g., speckle noise arising from the use of laser light).

Accordingly, the processor 390 according to an embodiment of the present invention may perform various calculation processes as described below to remove the above-described error and noise from the image acquired by the image sensor 380.

The processor 390 according to an embodiment of the present invention may check the frequency components of the image acquired by the image sensor 380. For example, the processor 390 may perform a 2D Fourier transform on the image to identify frequency components of the image.

In other words, the processor 390 may identify the frequency components included in the image containing the position-specific intensity information of the object hologram U0(x, y, 0) (that is, |U0(x, y, 0)|²). In this case, the image may include a frequency component corresponding to a real image, a frequency component corresponding to an imaginary (virtual) image, and a DC component.

Of course, the image may further include various components in addition to the aforementioned three components (the frequency component corresponding to the real image, the frequency component corresponding to the virtual image, and the DC component). For example, the image may further include frequency components due to noise. However, this is merely exemplary and the spirit of the present invention is not limited thereto.
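The separation of the real-image, virtual-image, and DC components in frequency space can be illustrated with a synthetic off-axis hologram. The carrier frequency (kx, ky) and the flat object wave below are arbitrary assumptions made only for this sketch:

```python
import numpy as np

N = 128
y, x = np.mgrid[0:N, 0:N]
kx, ky = 16, 8                      # assumed carrier frequency, in FFT bins
O = np.ones((N, N), dtype=complex)  # flat object wave, for simplicity
R = np.exp(2j * np.pi * (kx * x + ky * y) / N)  # tilted plane reference

img = np.abs(O + R) ** 2            # recorded interference pattern
F = np.fft.fftshift(np.fft.fft2(img))

mag = np.abs(F)
c = N // 2
mag[c, c] = 0.0                     # suppress the DC component before the peak search
peak = np.unravel_index(np.argmax(mag), mag.shape)
# `peak` lands on one of the twin peaks at (c - ky, c - kx) or (c + ky, c + kx):
# the frequency components corresponding to the real and virtual images
```

The tilt of the reference wave is what shifts the cross terms of Equation 2 away from the DC component, so a single Fourier transform separates them.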

The processor 390 according to an embodiment of the present invention may extract only the components corresponding to the real image from the identified frequency components. In this case, the processor 390 may extract the components corresponding to the real image in various ways.

For example, the processor 390 may extract the components having peak values (hereinafter, peak components) from the frequency components included in the image, identify the peak component corresponding to the real image among the extracted peak components, and extract the components within a predetermined frequency distance of that peak component as the components corresponding to the real image.

In this case, the processor 390 may determine the components corresponding to the real image in various ways based on the peak component corresponding to the real image. For example, the processor 390 may determine the frequency components in a cross region including the peak component as the components corresponding to the real image. In this case, the length of the cross region from the peak component may be determined based on the distance between the frequency component corresponding to the real image and the frequency component corresponding to the origin. However, this is merely exemplary and the spirit of the present invention is not limited thereto.

In an optional embodiment, the processor 390 may extract only the components corresponding to the real image from the frequency components included in the hologram using an automatic real-image spot-position extraction algorithm.

In the present invention, 'extracting' a specific frequency component may mean extracting a frequency of the frequency component and a magnitude (or intensity) of the frequency component.

FIG. 7 is a diagram illustrating frequency components of an image of a portion of the measurement target object 350 illustrated in FIG. 6.

As described above, the processor 390 may check the frequency components of the image acquired by the image sensor 380. Accordingly, the processor 390 may identify various frequency components, including the frequency components 911 corresponding to the real image, the frequency components 912 corresponding to the virtual image, and the DC component 913.

In addition, the processor 390 may extract only the frequency components 911 corresponding to the real image from the identified components. At this time, the processor 390 removes noise from the frequency components corresponding to the real image. Specifically, the processor 390 removes, as noise, the frequency components located along the direction of the interference fringe and the normal direction of the interference fringe, so that, as shown in FIGS. 8A, 8B, 8C, and 8D, for example, the frequency components in the cross region centered on the peak component may be determined as the components corresponding to the real image. At this time, the direction of the cross region is rotated according to the direction of the interference fringe.

The processor 390 according to an embodiment of the present invention may generate the digital reference light from the frequency components corresponding to the real image extracted by the above-described process. In more detail, the processor 390 may calculate the propagation direction and the wave number of the digital reference light based on the frequency components corresponding to the real image. In other words, the processor 390 may calculate the wavenumber vector of the digital reference light.

In addition, the processor 390 generates the digital reference light based on its propagation direction and wave number (or wavenumber vector), and can generate the correction light Rc(x, y) by taking the conjugate of the generated digital reference light R(x, y), as shown in Equation 3 below.

[Equation 3]

Rc (x, y) = conj [R (x, y)]

In this case, R(x, y) represents the digital reference light generated based on the frequency components corresponding to the real image, and Rc(x, y) represents the correction light.

The processor 390 extracts, from the frequency components 911 corresponding to the real image, the direction lines Line1 and Line2 passing through the peak component 911P, normal and parallel to the interference fringes, respectively. The processor 390 determines the areas including Line1 and Line2 as noise areas Noise1, Noise2, Noise3, and Noise4. The processor 390 may extract the frequency components distributed in regions other than the noise areas. That is, the processor 390 may extract the noise-removed frequency components corresponding to the real image using a pattern that excludes the noise areas. As shown in FIG. 8C, the processor 390 may remove the noise using a cross pattern Pattern1 that excludes the frequency components distributed along Line1 and Line2. In this case, the processor 390 may determine the size of the cross pattern Pattern1 based on a predetermined ratio, for example, 1/3 of the distance R between the origin component 913 and the frequency component 911 corresponding to the real image.

The processor 390 may set various patterns for removing the noise areas. As illustrated in FIG. 8D, the noise of the frequency components corresponding to the real image may also be removed using a pattern Pattern2 that becomes wider toward the peak component of the frequency components 911 corresponding to the real image.
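A minimal sketch of a cross-shaped extraction region such as Pattern1 follows. The arm lengths and widths are illustrative assumptions; as described above, the apparatus derives the region size from the peak-to-origin distance and rotates the cross to follow the fringe direction, which this axis-aligned sketch omits:

```python
import numpy as np

def cross_mask(shape, peak, half_len, half_width=2):
    """Boolean mask for a cross-shaped region centred on `peak`.

    `half_len` and `half_width` are illustrative assumptions; the apparatus
    derives the region size from the peak-to-origin distance and rotates it
    according to the interference fringe direction.
    """
    mask = np.zeros(shape, dtype=bool)
    py, px = peak
    # vertical arm of the cross
    mask[max(py - half_len, 0):py + half_len + 1,
         max(px - half_width, 0):px + half_width + 1] = True
    # horizontal arm of the cross
    mask[max(py - half_width, 0):py + half_width + 1,
         max(px - half_len, 0):px + half_len + 1] = True
    return mask

m = cross_mask((128, 128), peak=(72, 80), half_len=20)
print(m[72, 80], m[72, 100], m[52, 80], m[40, 40])  # True True True False
```

Multiplying the Fourier-domain array by such a mask keeps only the components on the cross arms and zeroes everything else, including the noise lines.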

Since the digital reference light R(x, y) and the correction light Rc(x, y) are conjugates of each other, their intensities are the same, as shown in FIGS. 9A and 9C, and their phases are reversed, as shown in FIGS. 9B and 9D. FIG. 9A shows the intensity of the digital reference light R(x, y), FIG. 9B shows the phase of the digital reference light, FIG. 9C shows the intensity of the correction light Rc(x, y), and FIG. 9D shows the phase of the correction light.

The generated correction light Rc(x, y) can be used to correct the real hologram Um(x, y, 0), as will be described later.

Meanwhile, the 'digital reference light' is light having the same properties as the reference light R generated from the single-wavelength light by the light splitter 330, and may be virtual light restored by the processor 390 from the image acquired by the image sensor 380.

The processor 390 according to an embodiment of the present invention may generate a real hologram based on the frequency components corresponding to the real image extracted by the above-described process. For example, the processor 390 may generate the real hologram as illustrated in FIG. 9 by performing an inverse 2D Fourier transform on the frequency components corresponding to the real image.

In this case, the real hologram may be represented by Equation 4 below.

[Equation 4]

Um (x, y, 0) = O (x, y) R * (x, y)

Here, Um(x, y, 0) represents the real hologram, O(x, y) represents the object light O, and R*(x, y) represents the complex conjugate of the reference light R.

Meanwhile, such a real hologram Um(x, y, 0) may include, in addition to the information on the height of the measurement target object 350, information on the reference light R and errors due to the aberration of the object light objective lens 340.

Therefore, the processor 390 according to an embodiment of the present invention may generate a correction hologram Uc(x, y, 0) from the real hologram Um(x, y, 0), taking into account the influence of the reference light R and the error caused by the aberration of the object light objective lens 340.

For example, as shown in Equation 5 below, the processor 390 may generate the correction hologram Uc(x, y, 0) by multiplying the real hologram Um(x, y, 0) by the term Rc(x, y) for the correction light and the term Rca(x, y) for the curvature aberration correction.

[Equation 5]

Uc (x, y, 0) = Um (x, y, 0) Rc (x, y) Rca (x, y)

Here, Uc(x, y, 0) represents the correction hologram from which the information on the reference light R and the aberration information of the object light objective lens 340 are removed, Um(x, y, 0) represents the real hologram, Rc(x, y) represents the term for the correction light, and Rca(x, y) represents the term for the curvature aberration correction.
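Equation 5 can be illustrated with synthetic phase terms. The residual reference tilt and the quadratic aberration below are assumptions chosen only to show that multiplying by the conjugate terms recovers the object phase:

```python
import numpy as np

N = 64
y, x = np.mgrid[0:N, 0:N]

# True object phase: a flat substrate with one raised square step.
phi_obj = np.zeros((N, N))
phi_obj[20:40, 20:40] = 1.0

# Assumed contaminating terms in the real hologram Um (illustrative only):
ref_phase = 2 * np.pi * (3 * x + 2 * y) / N                   # residual reference tilt
aberr_phase = 0.002 * ((x - N / 2) ** 2 + (y - N / 2) ** 2)   # lens curvature aberration

Um = np.exp(1j * (phi_obj + ref_phase + aberr_phase))   # contaminated real hologram

# Equation 5: Uc = Um * Rc * Rca
Rc = np.exp(-1j * ref_phase)     # correction light (conjugate of digital reference)
Rca = np.exp(-1j * aberr_phase)  # curvature aberration correction term
Uc = Um * Rc * Rca

recovered = np.angle(Uc)         # only the object phase remains
```

Because the correction terms carry exactly the opposite phases, the product cancels them and `recovered` equals `phi_obj` up to floating-point error.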

Meanwhile, the processor 390 according to an embodiment of the present invention may generate the term Rca (x, y) for the aforementioned curvature aberration correction in various ways.

For example, the processor 390 may generate a three-dimensional shape of the object 350 to be measured from the hologram obtained by multiplying the real hologram Um(x, y, 0) by only the term Rc(x, y) for the correction light (hereinafter, referred to as an intermediate hologram), and may generate the term Rca(x, y) for the curvature aberration correction from the generated three-dimensional shape.

In detail, the processor 390 may determine, from the three-dimensional shape of the measurement target object 350 generated from the intermediate hologram, at least one parameter for determining the curvature aberration correction term. In this case, the parameters may include, for example, the coordinates of a center point defining a hemispherical curved surface and its radius.

11 and 12 are diagrams for describing a method of determining, by the processor 390, a curvature aberration correction term from an intermediate hologram, according to an exemplary embodiment.

For convenience of explanation, it is assumed that the image sensor 380 acquires an image of the rectangular parallelepiped structure 51D of FIG. 5B and that the processor 390 generates an intermediate hologram for the structure 51D according to the above-described process. It is also assumed that the three-dimensional shape 920 of the structure 51D generated from the intermediate hologram for the structure 51D is as shown in FIG. 10.

Under the foregoing assumptions, the processor 390 according to an embodiment of the present invention may determine at least one parameter for determining the curvature aberration correction term from the three-dimensional shape 920. For example, as shown in FIG. 12, the processor 390 may determine as parameters the coordinates (Cx, Cy) of the center point of the hemispherical curved surface and the radius r of the curved surface from the curve on the I-I cross section of the three-dimensional shape 920. In this case, the processor 390 may determine the position and/or direction of the cut surface such that the cut surface, such as the I-I cross section, includes the center point of the three-dimensional shape 920 (that is, the center point of the hemispherical shape). The processor 390 may also determine the cut surface, such as the I-I cross section, to be parallel to the direction of travel of the object light O.

The processor 390 according to an embodiment of the present invention may generate (or determine) the curvature aberration correction term based on the at least one parameter determined by the above-described process. For example, the processor 390 may generate a curved surface in three-dimensional space with reference to the coordinates (Cx, Cy) of the center point of the surface and the radius r of the surface, and generate (or determine) the curvature aberration correction term by deriving, from the generated surface, the phase correction information to be reflected at each (x, y) point.
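A sketch of generating the curvature aberration correction term from the fitted parameters (Cx, Cy, r) follows. The paraxial (quadratic) spherical-phase model is an assumption, since the patent does not specify the exact surface model:

```python
import numpy as np

def curvature_correction(shape, cx, cy, radius, wavelength):
    """Build Rca(x, y) from the fitted center (cx, cy) and radius.

    The paraxial sphere sag = ((x-cx)^2 + (y-cy)^2) / (2*radius) is an
    assumed model; the conjugate phase cancels the curvature aberration.
    """
    k = 2.0 * np.pi / wavelength
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]].astype(float)
    sag = ((xx - cx) ** 2 + (yy - cy) ** 2) / (2.0 * radius)
    return np.exp(-1j * k * sag)

# Applying the term to a purely aberrated field flattens its phase
# (units here are hypothetical: pixel coordinates and micrometers):
rca = curvature_correction((32, 32), 16.0, 16.0, 500.0, 0.65)
aberrated = np.conj(rca)   # a field carrying only the spherical aberration phase
flat = aberrated * rca     # phase cancels to zero everywhere
```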

In an optional embodiment, the processor 390 may determine the correction term from the intermediate hologram of a measurement target object having a known shape (e.g., an object having the same z value at all (x, y) coordinates).

In the case of a measurement target object whose shape is known in advance, the z value at each (x, y) point is known beforehand, so the processor 390 may determine the correction term by checking the difference between the z values at each (x, y) point of the three-dimensional shape generated from the intermediate hologram and those of the known shape. However, this is merely exemplary and the spirit of the present invention is not limited thereto.

The processor 390 according to an embodiment of the present invention may generate a three-dimensional shape of the measurement target object 350 based on the correction hologram Uc(x, y, 0). In other words, the processor 390 may calculate the height in the z direction of the object at each (x, y) point.

For example, the processor 390 may convert the correction hologram Uc(x, y, 0) into information on the reconstructed image plane. In this case, the reconstructed image plane refers to a virtual image display plane corresponding to the distance between the measurement target object and the image sensor, and may be a virtual plane calculated and simulated by the processor 390.

The processor 390 may calculate the height in the z direction of the object at each (x, y) point, as shown in FIGS. 13A, 13B, and 13C, from the restored information in consideration of the reconstructed image plane. FIG. 13A shows a reconstruction result without removing noise from the frequency components, FIG. 13B shows a reconstruction result in which noise is removed using a cross pattern (Pattern1), and FIG. 13C shows a reconstruction result in which noise is removed using Pattern2. A1, A2, and A3 represent the height values in the z direction as planar graphs. In A1, noise has not been removed, so the variation of the height value in the z direction is large, whereas A2 and A3 show little variation of the height value in the z direction.

When using the cross pattern (Pattern1), the processor 390 may extract the frequency components according to Equation 5 below.

 [Equation 5]

Figure 112018099007019-pat00004

FIGS. 13A, 13B, and 13C illustrate the three-dimensional shapes of two rectangular parallelepiped structures 51A and 51B disposed on the measurement target object 350.

FIG. 14 is a flowchart for describing a method, performed by the holographic restoration apparatus 300, of generating three-dimensional shape information of the measurement target object 350, according to an exemplary embodiment. Hereinafter, descriptions overlapping with those given with reference to FIGS. 2 to 13 will be omitted, and the method will be described with reference to FIGS. 2 to 13 together.

The holographic reconstruction apparatus 300 according to an embodiment of the present invention may acquire an image of the measurement target object 350 (S1201).

In the present invention, the 'image' of the measurement target object 350 means intensity information at each position of the object hologram U0(x, y, 0) with respect to the measurement target object 350 (that is, |U0(x, y, 0)|²), and may be represented by Equation 2 described above.

For example, the holographic restoration apparatus 300 may acquire an image as shown in FIG. 6 of a portion (e.g., a portion including 51A and 51B) of the measurement target object 350 shown in FIGS. 5A and 5B.

Since the image acquired by the holographic restoration apparatus 300 includes intensity information at each position of the object hologram U0(x, y, 0) as described above, it may differ from a general image of the measurement target object 350 (i.e., an image taken with only the object light O).

Referring to Equation 2, the object hologram U0(x, y, 0) may be generated by the interference of the object light O, which includes the phase information of the measurement target object 350 at each point, with the reference light R, which does not include the phase information of the measurement target object.

In addition to the phase information (that is, the height information of the object) at each point (that is, each (x, y) point) of the measurement target object 350, the object hologram U0(x, y, 0) may further include errors due to the object light objective lens 340 and noise (e.g., speckle noise due to the use of a laser).

Accordingly, the holographic restoration apparatus 300 according to an embodiment of the present invention may perform the operations of steps S1202 to S1207 to remove the above-described errors and noise from the acquired image.

The holographic restoration apparatus 300 according to an embodiment of the present invention may check the frequency components of the acquired image (S1202). For example, the holographic restoration apparatus 300 may perform a 2D Fourier transform on the image to check its frequency components.
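As an illustrative sketch of step S1202 (not the patented implementation itself), the frequency-component check may be realized with a shifted 2D Fourier transform; the synthetic fringe image, grid size, and carrier frequencies below are assumptions for demonstration only.

```python
import numpy as np

def frequency_components(image):
    """Return the centered 2D frequency spectrum of a hologram image."""
    return np.fft.fftshift(np.fft.fft2(image))

# Synthetic off-axis interference fringe: a DC term plus a cosine carrier,
# standing in for the intensity image recorded by the image sensor.
n = 128
y, x = np.mgrid[0:n, 0:n]
image = 1.0 + 0.5 * np.cos(2 * np.pi * (10 * x + 6 * y) / n)

spectrum = frequency_components(image)
magnitude = np.abs(spectrum)
# The spectrum shows a DC peak at the center and two symmetric side peaks,
# i.e. the real-image and virtual-image components described in the text.
```

In this toy spectrum, the DC component dominates at the center of the shifted array and the two side peaks sit symmetrically about it, which is the structure the subsequent filtering steps rely on.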

In other words, the holographic restoration apparatus 300 may identify the frequency components included in the image containing the position-specific intensity information of the object hologram U0(x, y, 0) (that is, |U0(x, y, 0)|²). In this case, the image may include a frequency component corresponding to the real image, a frequency component corresponding to the virtual image, and a DC component.

Of course, the image may further include various components in addition to the aforementioned three (the frequency component corresponding to the real image, the frequency component corresponding to the virtual image, and the DC component). For example, the image may further include frequency components due to noise. However, this is merely exemplary and the spirit of the present invention is not limited thereto.

The holographic restoration apparatus 300 according to an exemplary embodiment of the present invention may extract only the components corresponding to the real image from the identified frequency components (S1203). In this case, the holographic restoration apparatus 300 may extract the components corresponding to the real image in various ways.

For example, the holographic restoration apparatus 300 may extract components having peak values (hereinafter, peak components) from the frequency components included in the image, identify the peak component corresponding to the real image among the extracted peak components, and extract components within a predetermined frequency difference from that peak component as the components corresponding to the real image.
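A minimal sketch of locating the real-image peak component, under the assumption (not stated in the publication) that the DC neighborhood is suppressed and only one half-plane is searched, since the real- and virtual-image peaks are point-symmetric about the DC term; the toy spectrum values are illustrative.

```python
import numpy as np

def real_image_peak(magnitude, dc_block=8):
    """Locate the peak component assumed to correspond to the real image."""
    m = magnitude.copy()
    cy, cx = m.shape[0] // 2, m.shape[1] // 2
    # Suppress the DC neighborhood so the central peak cannot win.
    m[cy - dc_block:cy + dc_block + 1, cx - dc_block:cx + dc_block + 1] = 0.0
    m[cy:, :] = 0.0  # discard the lower half-plane (virtual-image side, by convention)
    return np.unravel_index(np.argmax(m), m.shape)

# Toy spectrum: strong DC term plus two symmetric side peaks.
mag = np.zeros((128, 128))
mag[64, 64] = 100.0   # DC component
mag[58, 54] = 40.0    # assumed real-image peak (upper half-plane)
mag[70, 74] = 40.0    # virtual-image peak at the mirrored position
peak = real_image_peak(mag)
```

Which of the two symmetric peaks is taken as the "real" one is a convention; the patent's automatic real-image spot-position extraction may choose differently.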

In this case, the holographic restoration apparatus 300 may determine the components corresponding to the real image in various ways based on the peak component corresponding to the real image. For example, the holographic restoration apparatus 300 may determine the frequency components in a cross region centered on the peak component corresponding to the real image as the components corresponding to the real image. However, this is merely exemplary and the spirit of the present invention is not limited thereto.

In an exemplary embodiment, the holographic reconstruction apparatus 300 may extract only components corresponding to a real image from frequency components included in the hologram by using an automatic real image spot-position extraction algorithm.

In the present invention, 'extracting' a specific frequency component may mean extracting a frequency of the frequency component and a magnitude (or intensity) of the frequency component.

Referring back to FIG. 7, the holographic restoration apparatus 300 may identify the frequency components of the acquired image; as a result, various frequency components may be identified, including the frequency component 911 corresponding to the real image, the frequency component 912 corresponding to the virtual image, and the DC component 913.

Also, the holographic restoration apparatus 300 may extract only the frequency component 911 corresponding to the real image from the identified components. In this case, as shown in FIG. 7, the holographic restoration apparatus 300 may determine the frequency components 911B in the cross region centered on the peak component 911A corresponding to the real image as the components corresponding to the real image.

The holographic restoration apparatus 300 according to an embodiment of the present invention may generate a digital reference light from the frequency components corresponding to the real image extracted by the above-described process (S1204). In more detail, the holographic restoration apparatus 300 may calculate the propagation direction and the wave number of the digital reference light based on the frequency components corresponding to the real image. In other words, the holographic restoration apparatus 300 may calculate the wave vector of the digital reference light.

Also, the holographic restoration apparatus 300 may generate the digital reference light R(x, y) as shown in Equation 3 above, based on the propagation direction and the wave number (or wave vector) of the digital reference light, and may generate the correction light Rc(x, y) by obtaining its conjugate term.

Since the digital reference light R(x, y) and the correction light Rc(x, y) are conjugates of each other, their intensities are the same, as shown in FIGS. 9A and 9C, and their phases are reversed, as shown in FIGS. 9B and 9D. FIG. 9A shows the intensity of the digital reference light R(x, y), FIG. 9B shows the phase of the digital reference light, FIG. 9C shows the intensity of the correction light Rc(x, y), and FIG. 9D shows the phase of the correction light.
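The conjugate relationship above can be sketched as follows; modeling the digital reference light as a unit-modulus plane wave, and the grid size and spatial frequencies, are assumptions for illustration.

```python
import numpy as np

def digital_reference_light(shape, fx, fy):
    """Plane wave exp(i*2*pi*(fx*x/nx + fy*y/ny)) sampled on the sensor grid."""
    ny, nx = shape
    y, x = np.mgrid[0:ny, 0:nx]
    return np.exp(1j * 2 * np.pi * (fx * x / nx + fy * y / ny))

R = digital_reference_light((64, 64), fx=5, fy=3)
Rc = np.conj(R)  # correction light: conjugate of the digital reference light

# Conjugates have identical intensity and opposite phase (cf. FIGS. 9A-9D),
# so their product is identically 1 for a unit-modulus wave.
```

The product R·Rc = |R|² = 1 is exactly what lets the correction light cancel the reference-light carrier in the later multiplication step.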

The generated correction light Rc(x, y) can be used to correct the actual hologram Um(x, y, 0), as will be described later.

Meanwhile, the 'digital reference light' is light having the same properties as the reference light R generated by the light splitter 330 from the single wavelength light, and is virtual light reconstructed by the holographic restoration apparatus 300 from the acquired image.

The holographic restoration apparatus 300 according to an embodiment of the present invention may also generate an actual hologram on the basis of the frequency components corresponding to the real image extracted by the above-described process (S1204). For example, the holographic restoration apparatus 300 may generate an actual hologram as illustrated in FIG. 10 by performing an inverse 2D Fourier transform on the frequency components corresponding to the real image. In this case, the actual hologram may be represented by Equation 3 described above.
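A sketch of the inverse step, assuming the centered (fftshift-ed) spectrum convention of the earlier forward transform: once all components except those corresponding to the real image are zeroed, an inverse shifted 2D Fourier transform recovers the complex hologram. The round-trip field below is synthetic.

```python
import numpy as np

def hologram_from_components(filtered_spectrum):
    """Inverse of the centered 2D Fourier transform used to inspect the image."""
    return np.fft.ifft2(np.fft.ifftshift(filtered_spectrum))

# Round-trip check: transform a random complex field forward and back.
rng = np.random.default_rng(0)
field = rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32))
spectrum = np.fft.fftshift(np.fft.fft2(field))
recovered = hologram_from_components(spectrum)
```

In practice the spectrum passed in would already be masked to the real-image components, so the recovered field is the actual hologram Um rather than the raw image.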

The holographic restoration apparatus 300 according to an embodiment of the present invention may generate an intermediate hologram in order to generate the term Rca(x, y) for curvature aberration correction (S1205). For example, the holographic restoration apparatus 300 may generate the intermediate hologram by multiplying the actual hologram Um(x, y, 0) by the term Rc(x, y) for the correction light. The generated intermediate hologram may be used to generate curvature aberration correction information in step S1206.
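The multiplication of step S1205 can be sketched directly; the demonstration below assumes a unit-modulus plane-wave reference carrier, so multiplying by its conjugate removes the carrier exactly.

```python
import numpy as np

def intermediate_hologram(Um, Rc):
    """Multiply the actual hologram by the correction light (step S1205)."""
    return Um * Rc

# Demonstration: if Um = U0 * R with a unit-modulus reference wave R,
# multiplying by Rc = conj(R) recovers the carrier-free field U0.
ny, nx = 16, 16
y, x = np.mgrid[0:ny, 0:nx]
R = np.exp(1j * 2 * np.pi * (3 * x / nx + 2 * y / ny))
rng = np.random.default_rng(1)
U0 = rng.standard_normal((ny, nx)) + 1j * rng.standard_normal((ny, nx))
Um = U0 * R
Ui = intermediate_hologram(Um, np.conj(R))
```

What remains in Ui is the object phase plus any lens-induced curvature aberration, which is why the curvature fit of step S1206 is performed on this intermediate hologram.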

The holographic restoration apparatus 300 according to an embodiment of the present invention may generate a three-dimensional shape of the measurement target object 350 from the intermediate hologram generated in step S1205, and may generate the term Rca(x, y) for curvature aberration correction from the generated three-dimensional shape (S1206). In more detail, the holographic restoration apparatus 300 may determine at least one parameter for determining a curvature aberration correction term from the three-dimensional shape of the measurement target object 350 generated from the intermediate hologram. In this case, for example, the parameters may include the coordinates of the center point and the radius defining a hemispherical curved surface.

Referring back to FIGS. 11 and 12, a method by which the holographic restoration apparatus 300 according to an embodiment of the present invention determines the curvature aberration correction term from an intermediate hologram will be described. For convenience of description, it is assumed that the holographic restoration apparatus 300 acquires an image of the rectangular parallelepiped structure 51D of FIG. 5B and generates an intermediate hologram for the structure 51D according to the above-described process. It is also assumed that the three-dimensional shape 920 of the structure 51D, generated from the intermediate hologram for the structure 51D, is as shown in FIG. 11.

Under the above-described assumption, the holographic restoration apparatus 300 according to an embodiment of the present invention may determine at least one parameter for determining a curvature aberration correction term from the three-dimensional shape 920. For example, the holographic restoration apparatus 300 may determine, as parameters, the coordinates (Cx, Cy) of the center point of the hemispherical curved surface and the radius r of the curved surface from the curve on the I-I cross section of the three-dimensional shape 920, as shown in FIG. 12. At this time, the holographic restoration apparatus 300 according to an embodiment of the present invention may determine the position and/or direction of the cut plane such that a cut plane such as the I-I cross section includes the center point of the three-dimensional shape 920 (that is, the center point of the hemispherical shape). In addition, the holographic restoration apparatus 300 may determine the cut plane such as the I-I cross section to be parallel to the traveling direction of the object light O.

The holographic restoration apparatus 300 according to an exemplary embodiment may generate (or determine) a curvature aberration correction term based on the at least one parameter determined by the above-described process. For example, the holographic restoration apparatus 300 may generate a curved surface in three-dimensional space with reference to the coordinates (Cx, Cy) of the center point of the curved surface and the radius r of the curved surface, and may generate (or determine) the curvature aberration correction term by generating, from the generated surface, information to be reflected in the phase correction at each (x, y) point.
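One plausible reading of this step, sketched under stated assumptions: build the hemispherical surface z(x, y) from the fitted parameters (Cx, Cy) and r, then turn it into a per-pixel phase factor Rca(x, y) = exp(-i·k·z). The exp(-i·k·z) form, the wavelength, and the grid units are assumptions; the publication only states that the fitted surface yields phase-correction information per point.

```python
import numpy as np

def curvature_correction_term(shape, cx, cy, r, wavelength):
    """Rca(x, y) from a hemispherical surface of radius r centered at (cx, cy)."""
    ny, nx = shape
    y, x = np.mgrid[0:ny, 0:nx]
    rho2 = (x - cx) ** 2 + (y - cy) ** 2
    # Hemispherical surface height; 0 outside the fitted radius r.
    z = np.sqrt(np.clip(r ** 2 - rho2, 0.0, None))
    k = 2.0 * np.pi / wavelength
    return np.exp(-1j * k * z)  # pure phase factor to multiply into the hologram

Rca = curvature_correction_term((32, 32), cx=16, cy=16, r=10, wavelength=0.5)
```

Because Rca is a pure phase (unit modulus everywhere), multiplying it into the hologram alters only the phase, i.e., only the recovered height map, not the intensity.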

In an alternative embodiment, the holographic restoration apparatus 300 may determine the correction term from an intermediate hologram of a measurement target object having a known shape (e.g., an object having the same z value at all (x, y) coordinates).

In the case of a measurement target object whose shape is known in advance, the z values at each (x, y) point are known beforehand, so the correction term may be determined by checking the difference between the z values at each (x, y) point of the three-dimensional shape generated from the intermediate hologram and those of the known shape. However, this is merely exemplary and the spirit of the present invention is not limited thereto.

The holographic restoration apparatus 300 according to an embodiment of the present invention may generate the correction hologram Uc(x, y, 0) from the actual hologram Um(x, y, 0) in consideration of the error caused by the reference light R and the aberration of the object light objective lens 340 (S1207). For example, the holographic restoration apparatus 300 may generate the correction hologram Uc(x, y, 0) by multiplying the actual hologram Um(x, y, 0) by the term Rc(x, y) for the correction light and the term Rca(x, y) for curvature aberration correction, as shown in Equation 5 described above. In this case, the term Rc(x, y) for the correction light may be generated in step S1204, and the term Rca(x, y) for curvature aberration correction may be generated in step S1206.
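The Equation 5 form described in the text (Uc = Um · Rc · Rca) can be sketched and sanity-checked as follows; the carrier R and the quadratic aberration phase A below are illustrative assumptions, both of unit modulus.

```python
import numpy as np

def correction_hologram(Um, Rc, Rca):
    """Uc = Um * Rc * Rca, following the Equation 5 form described in the text."""
    return Um * Rc * Rca

# Demonstration: with Um = U0 * R * A (reference carrier R and curvature
# aberration phase A), multiplying by Rc = conj(R) and Rca = conj(A)
# recovers the object field U0.
ny, nx = 16, 16
y, x = np.mgrid[0:ny, 0:nx]
R = np.exp(1j * 2 * np.pi * (2 * x / nx + 1 * y / ny))
A = np.exp(1j * 0.01 * ((x - nx / 2) ** 2 + (y - ny / 2) ** 2))
rng = np.random.default_rng(2)
U0 = rng.standard_normal((ny, nx)) + 1j * rng.standard_normal((ny, nx))
Uc = correction_hologram(U0 * R * A, np.conj(R), np.conj(A))
```

Since both correction terms are conjugate phases, the multiplication removes the reference-light carrier and the lens curvature in one step, leaving the object-only field.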

The holographic restoration apparatus 300 according to an exemplary embodiment may generate three-dimensional shape information of the measurement target object 350 based on the correction hologram Uc(x, y, 0) (S1208). In other words, the holographic restoration apparatus 300 may calculate the height in the z direction of the object at each (x, y) point.

For example, the holographic restoration apparatus 300 may convert the correction hologram Uc(x, y, 0) into information on the reconstructed image plane. In this case, the reconstructed image plane means a virtual image display plane corresponding to the distance between the measurement target object and the image sensor, and may be a virtual plane calculated and simulated by the holographic restoration apparatus 300.

The holographic restoration apparatus 300 may calculate the height in the z direction of the object at each (x, y) point, as shown in FIG. 13, from the restored information in consideration of the reconstructed image plane. FIG. 13 illustrates the three-dimensional shapes of two rectangular parallelepiped structures 51A and 51B disposed on the measurement target object 350.
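A minimal sketch of converting the correction hologram to the reconstructed image plane and then to height: the angular spectrum method is one standard numerical propagation technique (the publication does not name its propagation method), and the reflection-geometry factor λ/(4π), wavelength, pixel pitch, and distance are all assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a complex field by `distance` using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.clip(arg, 0.0, None))  # evanescent waves clipped
    H = np.exp(1j * kz * distance)                       # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

def height_map(field, wavelength):
    """Convert phase to height assuming reflection geometry (phase = 2*k*h)."""
    return np.angle(field) * wavelength / (4.0 * np.pi)

# Sanity checks: zero distance is the identity, and a uniform pi/2 phase
# maps to a height of wavelength / 8 under the assumed geometry.
rng = np.random.default_rng(3)
field = rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32))
same_plane = angular_spectrum_propagate(field, wavelength=0.5, pitch=3.45, distance=0.0)
flat = height_map(np.full((4, 4), np.exp(1j * np.pi / 2)), wavelength=0.5)
```

For steps taller than half a wavelength, the wrapped phase would additionally need unwrapping before the height conversion; that step is omitted here.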

FIGS. 15 and 16 are flowcharts of methods for removing noise, performed by the holographic restoration apparatus 300, according to embodiments of the present disclosure.

The holography restoration apparatus 300 according to an exemplary embodiment of the present invention may extract only components corresponding to the actual image from the identified frequency components (S1203).

In S12031, the holographic restoration apparatus 300 determines a first frequency component corresponding to a real image included in an image, a second frequency component corresponding to a virtual image, and a third frequency component corresponding to an origin.

In S12032, the holographic restoration apparatus 300 calculates the direction of the interference fringe and its normal direction from the first frequency component, and determines the frequency components located along the direction of the interference fringe and its normal direction as noise.

In S12033, the holography restoration apparatus 300 extracts a frequency component corresponding to the actual image by removing noise from the first frequency component.

In another embodiment, the holographic restoration apparatus 300 may set a pattern from which noise is removed and extract a frequency component corresponding to a real image using the pattern.

In operation S12034, the holography restoration apparatus 300 determines a first frequency component corresponding to a real image included in an image, a second frequency component corresponding to a virtual image, and a third frequency component corresponding to an origin.

In operation S12035, the holographic restoration apparatus 300 generates a cross region pattern including the peak component of the first frequency component. In S12036, the holographic reconstruction apparatus 300 extracts frequency components included in the cross region pattern from among the first frequency components.

The equation applying the cross region pattern is as follows. The holographic restoration apparatus 300 may extract the frequency components included in the cross region pattern according to Equation 7 below.

[Equation 7]

Figure 112018099007019-pat00005

Here, ms is the size of the cross region pattern, xc is the x coordinate of the peak component, and yc is the y coordinate of the peak component. R is a number proportional to the distance between the frequency component corresponding to the real image and the origin component. For example, R may be distance/3, distance/2, or the like.

The size ms of the cross region pattern is determined based on the distance between the origin component and the frequency component corresponding to the real image, but the size of the pattern is adjustable for efficient removal of noise components. The size of the cross region pattern can be optimized through an iterative noise removal process.
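Since Equation 7 is reproduced only as an image in the publication, the mask below is an assumed reading of the text: a cross of half-width ms (a vertical band and a horizontal band) through the real-image peak (xc, yc), with everything outside the cross discarded.

```python
import numpy as np

def cross_pattern_mask(shape, xc, yc, ms):
    """Boolean mask keeping only the cross region of half-width ms at (xc, yc)."""
    ny, nx = shape
    y, x = np.mgrid[0:ny, 0:nx]
    return (np.abs(x - xc) <= ms) | (np.abs(y - yc) <= ms)

mask = cross_pattern_mask((64, 64), xc=40, yc=24, ms=2)
# Applying the pattern: filtered_spectrum = spectrum * mask would retain
# only the components inside the cross region around the peak.
```

Varying ms here corresponds to the iterative optimization of the pattern size mentioned above: a wider cross keeps more signal but admits more noise.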

In another embodiment, in order to remove noise more efficiently, the holographic restoration apparatus 300 may apply hemispherical filtering that assigns different weights according to the position within the filtering region. For example, the holographic restoration apparatus 300 may multiply frequency components by a weight smaller than 1 as they move away from the center.

[Equation 8]

Figure 112018099007019-pat00006

Here, R refers to a number proportional to the distance between the frequency component corresponding to the real image and the origin component. For example, R may be distance/3, distance/2, or the like.
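Equation 8 is likewise given only as an image, so the weighting below is a hedged guess consistent with the text: a hemispherical profile that is 1 at the peak, falls off with distance, and reaches 0 at radius R.

```python
import numpy as np

def hemispherical_weights(shape, xc, yc, R):
    """Weight map: 1 at (xc, yc), hemispherical falloff, 0 beyond radius R."""
    ny, nx = shape
    y, x = np.mgrid[0:ny, 0:nx]
    d2 = (x - xc) ** 2 + (y - yc) ** 2
    # sqrt(1 - (d/R)^2) traces a hemisphere of unit height over the disk of radius R.
    return np.sqrt(np.clip(1.0 - d2 / (R * R), 0.0, 1.0))

w = hemispherical_weights((32, 32), xc=16, yc=16, R=8)
# Applying the filter: filtered_spectrum = spectrum * w attenuates
# components progressively as they move away from the peak.
```

Compared with the hard cross mask, this soft taper avoids the ringing artifacts that sharp spectral cutoffs introduce into the reconstructed phase.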

Embodiments according to the present invention described above may be implemented in the form of a computer program that can be executed through various components on a computer, and such a computer program may be recorded on a computer-readable medium. In this case, the medium may store a computer-executable program. Examples of the medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices such as ROM, RAM, and flash memory configured to store program instructions.

Meanwhile, the computer program may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software field. Examples of computer programs include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter.

As described above, the holographic restoration apparatus 300 according to an embodiment of the present invention, that is, the inspection unit 300, can accurately generate three-dimensional shape information of the object to be measured by acquiring a single hologram. Through the above-described methods, the inspection unit 300 can generate accurate three-dimensional shape information with simplified components, and can be disposed inside in-line deposition equipment to perform inspection in real time during the in-line process.

The particular implementations described in the present invention are embodiments and do not limit the scope of the present invention in any way. For brevity of description, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of the systems may be omitted. In addition, the connections or connecting members of the lines between the components shown in the drawings exemplarily illustrate functional connections and/or physical or circuit connections; in an actual device, they may be represented as various replaceable or additional functional, physical, or circuit connections. In addition, unless specifically mentioned with terms such as "essential" or "important", a component may not be a necessary component for the application of the present invention.

Therefore, the spirit of the present invention should not be limited to the above-described embodiments, and not only the claims below but also all ranges equivalent to or equivalently modified from the claims will belong to the scope of the spirit of the present invention.

300, 300A, 300B: Holographic Restoration Device
310: light source unit
320: collimator
330,332: optical splitter
340: object light objective lens
350: object to be measured
360: reference light objective lens
370,372: optical mirror
380: image sensor
390: processor

Claims (5)

  1. A light source unit emitting single wavelength light;
    A collimator for collimating single wavelength light emitted from the light source unit;
    A light splitter that splits the single wavelength light that has passed through the collimator into object light and reference light;
    An object light objective lens for passing the object light divided by the light splitter;
    A reference light objective lens for passing the reference light divided by the light splitter;
    An optical mirror for reflecting the reference light passing through the reference light objective lens;
    An image sensor for recording an interference fringe formed by the object light that has passed through the object light objective lens and the reference light reflected by the optical mirror, the object light and the reference light respectively passing through the object light objective lens and the reference light objective lens and being transmitted to the light splitter; And
    And a processor configured to receive and store an image including intensity information of an object hologram generated by converting the interference fringe from the image sensor, and to generate three-dimensional shape information of the object to be measured.
    The light source unit determines the wavelength value of the single wavelength light by using step information, known in advance, of the object to be measured in a first direction, measurement accuracy in the first direction, and the bit depth of the image sensor,
    And the wavelength value of the single wavelength light is determined by selecting, according to the bit depth of the image sensor, any one candidate that satisfies both a first range equal to or greater than the maximum height value of the step information of the object to be measured and a second range of the arbitrarily selected measurement accuracy.
  2. A light source unit emitting single wavelength light;
    A collimator for collimating single wavelength light emitted from the light source unit;
    A light splitter that splits the single wavelength light that has passed through the collimator into object light and reference light;
    An object light objective lens for passing object transmitted light, which includes information of the object to be measured, after the object light divided by the light splitter passes through the object to be measured;
    A second optical mirror for reflecting the transmitted light of the object passing through the object light objective lens;
    A reference light objective lens for passing the reference light divided by the light splitter;
    A first optical mirror for reflecting the reference light passing through the reference light objective lens;
    A second light splitter to which the reference light reflected by the first optical mirror and the object transmitted light reflected by the second optical mirror are respectively transmitted;
    An image sensor for recording an interference fringe formed by the reference light and the object transmitted light transmitted to the second light splitter; And
    And a processor configured to receive and store an image including intensity information of an object hologram generated by converting the interference fringe from the image sensor and to generate three-dimensional shape information of the object to be measured.
    The light source unit determines the wavelength value of the single wavelength light by using step information, known in advance, of the object to be measured in a first direction, measurement accuracy in the first direction, and the bit depth of the image sensor,
    And the wavelength value of the single wavelength light is determined by selecting, according to the bit depth of the image sensor, any one candidate that satisfies both a first range equal to or greater than the maximum height value of the step information of the object to be measured and a second range of the arbitrarily selected measurement accuracy.
  3. delete
  4. The apparatus according to any one of claims 1 and 2,
    The processor extracts real components corresponding to a real image from among at least one frequency component included in the image, generates correction light having a conjugate relationship with the reference light based on the real components, generates an actual hologram including real image information of the object to be measured, generates an intermediate hologram from which the reference light is removed from the actual hologram based on the correction light, generates curvature aberration correction information from the intermediate hologram, generates, based on the curvature aberration correction information, a correction hologram in which the error due to curvature aberration is improved in the intermediate hologram, and generates the three-dimensional shape information of the object to be measured from the correction hologram.
  5. The apparatus of claim 4, wherein
    The processor generates three-dimensional shape information of the object to be measured from the intermediate hologram, determines at least one parameter for determining the curvature aberration correction information based on the three-dimensional shape information of the object to be measured generated from the intermediate hologram, and generates the curvature aberration correction information based on the parameter.


KR1020180119751A 2018-10-08 2018-10-08 Apparatus for generating three-dimensional shape information of an object to be measured KR102055307B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020180119751A KR102055307B1 (en) 2018-10-08 2018-10-08 Apparatus for generating three-dimensional shape information of an object to be measured

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020180119751A KR102055307B1 (en) 2018-10-08 2018-10-08 Apparatus for generating three-dimensional shape information of an object to be measured

Publications (1)

Publication Number Publication Date
KR102055307B1 true KR102055307B1 (en) 2020-01-22

Family

ID=69368397

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020180119751A KR102055307B1 (en) 2018-10-08 2018-10-08 Apparatus for generating three-dimensional shape information of an object to be measured

Country Status (1)

Country Link
KR (1) KR102055307B1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7649160B2 (en) 2005-02-23 2010-01-19 Lyncee Tec S.A. Wave front sensing method and apparatus
KR20100095302A (en) 2009-02-20 2010-08-30 (주)펨트론 3d measuring apparatus using off-axis dual wavelength digital holography
KR20120014355A (en) 2010-08-09 2012-02-17 (주)펨트론 3d measurement apparatus using dual wave digital holography
KR101139178B1 (en) 2011-09-30 2012-04-26 디아이티 주식회사 Device for measuring the 3d cubic matter using a digital holography
KR101441245B1 (en) 2013-05-29 2014-09-17 제주대학교 산학협력단 Digital Holographic Microscope Apparatus
KR20160029606A (en) 2014-09-05 2016-03-15 광운대학교 산학협력단 Digital holographic microscopy and method for generating digital holographic image
KR101716452B1 (en) * 2015-08-21 2017-03-15 삼성디스플레이 주식회사 System and method for measuring high height by digital holography microscope
JP6132650B2 (en) * 2013-04-30 2017-05-24 オリンパス株式会社 Confocal microscope
KR20180036659A (en) * 2018-03-09 2018-04-09 주식회사 내일해 An On-Axis and Off-Axis Digital Hologram Generating Device and Method
KR20180057291A (en) * 2016-11-22 2018-05-30 주식회사 내일해 Digital Holographic Reconstruction Apparatus and Method Using Single Generated Phase Shifting Method
JP6378931B2 (en) * 2014-05-21 2018-08-22 浜松ホトニクス株式会社 Microscope device and image acquisition method


Similar Documents

Publication Publication Date Title
Schnars et al. Digital holography and wavefront sensing
Zuo et al. Phase aberration compensation in digital holographic microscopy based on principal component analysis
KR101441245B1 (en) Digital Holographic Microscope Apparatus
Colomb et al. Numerical parametric lens for shifting, magnification, and complete aberration compensation in digital holographic microscopy
Grilli et al. Whole optical wavefields reconstruction by digital holography
JP5648193B2 (en) Interference measuring apparatus and interference measuring method
JP4772961B2 (en) Method for simultaneously forming an amplitude contrast image and a quantitative phase contrast image by numerically reconstructing a digital hologram
KR101159380B1 (en) Methods and apparatus for wavefront manipulations and improved 3-d measurements
US7034949B2 (en) Systems and methods for wavefront measurement
JP3856838B2 (en) Method and apparatus for measuring the shape of an object
Picart New techniques in digital holography
US5777742A (en) System and method for holographic imaging with discernible image of an object
Mann et al. Quantitative phase imaging by three-wavelength digital holography
DE10392881B4 (en) Frequency-scanning interferometer with diffuse-reflecting reference surface
US9858671B2 (en) Measuring apparatus for three-dimensional profilometry and method thereof
US7116425B2 (en) Faster processing of multiple spatially-heterodyned direct to digital holograms
EP2667150B1 (en) Three-dimensional shape measurement method and three-dimensional shape measurement device
JP5021054B2 (en) Refractive index distribution measuring method and refractive index distribution measuring apparatus
Colomb et al. Extended depth-of-focus by digital holographic microscopy
JP4323955B2 (en) System and method for measuring wavefront
Huntley et al. Phase-shifted dynamic speckle pattern interferometry at 1 kHz
EP2492921A1 (en) Holographic microscopy of holographically trapped three-dimensional structures
US7889356B2 (en) Two grating lateral shearing wavefront sensor
JP4782958B2 (en) Surface shape measuring apparatus and method, program, and storage medium
Situ et al. Generalized in-line digital holographic technique based on intensity measurements at two different planes

Legal Events

Date Code Title Description
A107 Divisional application of patent
GRNT Written decision to grant