CN112838018B - Optical measuring method - Google Patents


Publication number
CN112838018B
Authority
CN
China
Prior art keywords
surface image
image
positioning mark
pixels
light modulator
Prior art date
Legal status
Active
Application number
CN201911163770.7A
Other languages
Chinese (zh)
Other versions
CN112838018A (en)
Inventor
萧玮仁
蔡钧亦
张巍耀
Current Assignee
Chroma ATE Suzhou Co Ltd
Original Assignee
Chroma ATE Suzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Chroma ATE Suzhou Co Ltd
Priority to CN201911163770.7A
Publication of CN112838018A
Application granted
Publication of CN112838018B
Status: Active

Classifications

    • H01L 22/12 — Measuring as part of the manufacturing process for structural parameters, e.g. thickness, line width, refractive index, temperature, warp, bond strength, defects, optical inspection, electrical measurement of structural dimensions, metallurgic measurement of diffusions
    • H01L 22/30 — Structural arrangements specially adapted for testing or measuring during manufacture or treatment, or specially adapted for reliability measurements

Abstract

The application provides an optical measurement method for measuring the surface of an object, comprising the following steps. First, an input light is provided. A spatial light modulator is disposed in the optical path of the input light; the spatial light modulator has a plurality of pixels for selectively modulating the input light into a test light. The spatial light modulator is controlled so that a first pixel group among the plurality of pixels does not modulate the input light, the first pixel group corresponding to the position of a positioning mark on the surface. A first surface image and a second surface image of the surface are acquired at a first time and a second time, respectively. Finally, the first surface image and the second surface image are aligned according to the positioning mark in each of the two images.

Description

Optical measuring method
Technical Field
The present application relates to an optical measurement method, and more particularly to an optical measurement method for analyzing a surface image of an object.
Background
After a product is manufactured, a test procedure is typically carried out to check its quality. Traditionally, inspection relies on manpower: an operator observes the appearance of the product to check for defects or abnormal function. However, some products have fine structures that cannot be inspected by the naked eye. Conventionally, when a defect cannot be identified visually, the appearance of the product is photographed with a camera, and the captured image is enlarged to inspect specific areas of the product. For example, after wafer epitaxy is completed, the wafer surface is often photographed by a camera, and the surface image is used to inspect the epitaxial quality of each component on the wafer.
Because the components on a wafer are very small, the surface image often requires a relatively high resolution for detailed inspection of epitaxial quality. A common way to improve resolution is to capture multiple images and synthesize them into a single image; however, the captured surface images often shift relative to one another due to camera vibration or wafer displacement, degrading the synthesized resolution. A new optical measurement method is therefore needed that can effectively calibrate shifted surface images and thereby improve the resolution of the surface image.
Disclosure of Invention
In view of the above, the present application provides an optical measurement method in which the projected structured light avoids the positioning mark on the surface of the object. As a result, when the surface image of the object is photographed, the positioning mark is not obscured by the structured light, making the subsequent alignment and correction of surface images easier to perform.
The application provides an optical measurement method for measuring the surface of an object, comprising the following steps. First, an input light is provided. A spatial light modulator is disposed in the optical path of the input light; the spatial light modulator has a plurality of pixels for selectively modulating the input light into a test light. The spatial light modulator is controlled so that a first pixel group among the plurality of pixels does not modulate the input light, the first pixel group corresponding to the position of the positioning mark on the surface. A first surface image and a second surface image of the surface are acquired at a first time and a second time, respectively. Finally, the first surface image and the second surface image are aligned according to the positioning mark in each of the two images.
In some embodiments, the plurality of pixels of the spatial light modulator may correspond to a first pattern at the first time and to a second pattern at the second time. In addition, the step of aligning the first surface image and the second surface image according to the positioning marks in the two images may comprise the following steps. First, the positioning mark in the first surface image is located to generate a first coordinate, and the positioning mark in the second surface image is located to generate a second coordinate. The first coordinate is compared with the second coordinate to calculate a first offset value. The second surface image is then compensated according to the first offset value so that it aligns with the first surface image.
In some embodiments, the first surface image may correspond to a first image capturing range of the surface, and the second surface image may correspond to a second image capturing range of the surface, the first image capturing range being different from the second image capturing range. In addition, in the step of aligning the first surface image and the second surface image, the region where the first and second image capturing ranges overlap may be set as an intersection image capturing range, and the data of the first and second surface images outside the intersection image capturing range may be removed.
In some embodiments, the optical measurement method may further comprise the following steps. First, an initial surface image of the surface may be acquired in advance. The position of the positioning mark in the initial surface image may then be calculated to generate a calculation result, and the first pixel group may be selected from the plurality of pixels according to the calculation result, so that the first pixel group corresponds to the position of the positioning mark in the initial surface image. Furthermore, the first pixel group may correspond to a non-projected range on the surface, the area of the non-projected range being larger than the area of the positioning mark, and the center position of the non-projected range being substantially equal to the center position of the positioning mark.
In summary, in the optical measurement method provided by the present application, the structured light projected onto the surface of the object avoids the positioning mark. The method can then use the positioning mark to align and overlap surface images captured at multiple times; because the mark is not obscured by the structured light, the subsequent alignment and correction procedures are easier to perform.
Other features and embodiments of the present application are described in detail below with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an optical measurement system according to an embodiment of the application;
FIG. 2A is a schematic illustration of an object surface according to an embodiment of the application;
FIG. 2B is a schematic diagram of a spatial light modulator according to one embodiment of the present application;
FIG. 3A is a schematic view of a first surface image according to an embodiment of the application;
FIG. 3B is a schematic diagram of a second surface image according to an embodiment of the application;
FIG. 4 is a schematic diagram of a positioning mark offset according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating steps of an optical measurement method according to an embodiment of the present application.
Symbol description
1: optical measurement system
10: light source
11: lens
12: beam splitting unit
13: lens
14: spatial light modulator
140: pixels
142: first pixel group
15: lens
16: image capturing device
17: processing unit
20: object
20a: surface
200: component to be tested
202: positioning mark
3: first surface image
30: positioning mark in the first surface image
4: second surface image
40: positioning mark in the second surface image
A, B: center points of the positioning marks in the images
cmd: control command
S50–S58: step flow
Detailed Description
The foregoing and other technical aspects, features, and advantages of the present application will become more apparent from the following detailed description of a preferred embodiment, which proceeds with reference to the accompanying drawings. Directional terms mentioned in the following embodiments, such as upper, lower, left, right, front, or rear, refer only to the orientation of the drawings; they are used for purposes of illustration and are not intended to limit the application.
Referring to fig. 1 and fig. 2A together, fig. 1 is a schematic diagram illustrating an optical measurement system according to an embodiment of the application, and fig. 2A is a schematic diagram illustrating the surface of an object according to an embodiment of the application. As shown, the optical measurement method disclosed in the present application can be applied to the optical measurement system 1, and the optical measurement system 1 can be used to inspect the object 20. Here, the object 20 may be a wafer or a carrier for carrying the component 200 to be tested. The component 200 to be tested is disposed on the surface 20a of the object 20 and may be a chip, a die, a panel, or a circuit; the present embodiment is not limited in this respect. In addition, the surface 20a of the object 20 may further have a positioning mark 202; in practice, the positioning mark 202 may be used to align the object 20 during inspection. The optical measurement system 1 shown in fig. 1 may have a light source 10, a lens 11, a beam splitting unit 12, a lens 13, a spatial light modulator 14, a lens 15, an image capturing device 16, and a processing unit 17; the optical architecture of the optical measurement system 1 is described below.
Although fig. 1 shows the light source 10 as a point light source, the present embodiment is not limited thereto; the light source 10 may also be a surface light source. Further, the light source 10 may be a white light source or a non-coherent light source, such that the input light is white light or non-coherent light. The input light provided by the light source 10 enters the lens 11, whose function is to convert the input light from the point light source into parallel light. It will be appreciated by those skilled in the art that the light source 10 is substantially at the focal position of the lens 11, such that the input light passing through the lens 11 can be regarded as planar light. Of course, the lens 11 may not be needed if the light source 10 is itself a parallel surface light source.
Further, a beam splitting unit 12 and a spatial light modulator 14 may be provided in the optical path of the input light. In the example shown in fig. 1, the spatial light modulator 14 may be disposed between the lens 11 and the beam splitting unit 12, and the spatial light modulator 14 may convert at least part of the input light into the test light, where the test light may be, for example, a first structured light. In practice, the beam splitting unit 12 may be an optical beam splitter, which may reflect the test light from the spatial light modulator 14 toward the lens 13, so that the test light is irradiated onto the object 20 through the lens 13. In one example, the input light and the test light modulated by the spatial light modulator 14 may propagate in the same direction, and the test light reflected by the beam splitting unit 12 may propagate in a direction perpendicular to the input light.
The spatial light modulator 14 may have a plurality of pixels, which may be arranged in a plurality of groups. Each pixel referred to herein may be composed of liquid crystal and may be switched between transparent and opaque states to determine the proportion of light passing through it. For example, the spatial light modulator 14 may determine which pixels are turned on, partially turned on, or turned off according to the control command cmd provided by the processing unit 17. In one example, the spatial light modulator 14 in fig. 1 may have a transmissive pixel array with a liquid crystal layer (not shown), and the control command cmd controls the rotation direction of the liquid crystal in the liquid crystal layer to determine how much of the test light can pass through a specific pixel. For convenience in describing the spatial light modulator 14 of the present embodiment, please refer to fig. 1, fig. 2A and fig. 2B together; fig. 2B is a schematic diagram illustrating a spatial light modulator according to an embodiment of the present application. As shown, the spatial light modulator 14 may have a plurality of pixels 140 arranged in an array, and the pixels 140 may be controlled by the control command cmd to adjust the proportion of the test light passing through them.
In one example, the plurality of pixels 140 may have a spatial correspondence with the surface 20a of the object 20, e.g., any one of the pixels 140 may correspond to a particular location on the surface 20a. Conversely, any one location on surface 20a may also correspond to a particular pixel 140 in spatial light modulator 14. When the correspondence between the plurality of pixels 140 and the surface 20a is established, a portion of the pixels 140 may be defined as a first pixel group 142, and the first pixel group 142 may substantially correspond to the location of the positioning mark 202 on the surface 20a. In practice, in the spatial light modulator 14 provided in the present embodiment, the first pixel group 142 can be regarded as an area where the test light is not modulated, that is, the pixels 140 in the first pixel group 142 can let all the test light pass through. For example, the first pixel group 142 corresponds to a non-projection range on the surface 20a, the center position of the non-projection range is substantially equal to the center position of the positioning mark 202, and the area of the non-projection range is slightly larger than the area occupied by the positioning mark 202.
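The mapping described above — a first pixel group of unmodulated pixels centred on the positioning mark, covering a non-projected range slightly larger than the mark — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name `build_modulation_mask`, the square shape of the non-projected range, and the NumPy boolean-mask representation are assumptions.

```python
import numpy as np

def build_modulation_mask(slm_shape, mark_center, margin):
    """Boolean mask over the SLM pixel array: True where a pixel modulates
    the input light into structured light, False for the first pixel group,
    which passes the light unmodulated around the positioning mark.

    slm_shape   -- (rows, cols) of the SLM pixel array
    mark_center -- (row, col) of the SLM pixel mapped to the mark center
    margin      -- half-width of the non-projected square, chosen slightly
                   larger than the mark itself
    """
    mask = np.ones(slm_shape, dtype=bool)
    r, c = mark_center
    r0, r1 = max(r - margin, 0), min(r + margin + 1, slm_shape[0])
    c0, c1 = max(c - margin, 0), min(c + margin + 1, slm_shape[1])
    mask[r0:r1, c0:c1] = False  # first pixel group: no modulation here
    return mask

# Non-projected range centred on the mark at SLM pixel (10, 12).
mask = build_modulation_mask((64, 64), mark_center=(10, 12), margin=3)
```

Every pixel inside the 7×7 region around the mark center is excluded from modulation, so the structured-light pattern never lands on the positioning mark.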
As a practical example, if it is known that a similar batch of objects 20 will be inspected repeatedly, the processing unit 17 may store in advance the correspondence between the pixels 140 and the surface 20a, and may also store in advance the positions of the positioning marks 202. That is, since the processing unit 17 stores the correspondence between the pixels 140 and the surface 20a in advance, it can look up which pixels 140 correspond to the area surrounding the positioning mark in the initial surface image. The processing unit 17 then sets the plurality of pixels 140 corresponding to that surrounding area as the first pixel group 142. In one example, to complete the setting of the first pixel group 142 quickly, the initial surface image may have a low resolution or be insufficiently focused; as long as the processing unit 17 can identify the positioning mark in it, the image serves the purpose of the initial surface image in this embodiment.
On the other hand, the present embodiment does not limit the spatial light modulator 14 to a transmissive pixel array; the spatial light modulator 14 may also have a reflective pixel array. For example, if the pixel 140 has a reflective architecture, the pixel 140 may be composed of a liquid crystal layer and a mirror. Here, the pixel 140 may likewise determine the proportion of the test light reflected by the mirror by controlling the rotation direction of the liquid crystal in the pixel. Alternatively, a reflective pixel 140 may have no liquid crystal layer and only a mirror; for example, each pixel 140 may be a small mirror, and the control command cmd may determine which pixels 140 reflect the test light to the surface 20a. Similarly, even when the pixels 140 are reflective, the first pixel group 142 can be regarded as an area where the test light is not modulated, i.e. the pixels 140 in the first pixel group 142 reflect all of the test light to the surface 20a. In addition, the spatial light modulator 14 may be used with a polarizer; since the function and purpose of the polarizer will be understood by those skilled in the art, the present application is not limited in this respect.
After being reflected by the surface 20a of the object 20, the test light may be directed again through the lens 13 to the beam splitting unit 12. Due to the optical characteristics of the beam splitting unit 12, the light from the lens 13 can pass directly through the beam splitting unit 12 into the lens 15, and is then focused by the lens 15 and guided to the image capturing device 16. In one example, the image capturing device 16 may be disposed on one side of the beam splitting unit 12 and configured to receive the reference light reflected from the lens 13 and the light reflected from the surface 20a of the object 20. In one example, to resolve the surface 20a at a given height, the image capturing device 16 may capture surface images of the object 20 at a plurality of consecutive time points to improve the resolution of the surface image.
For example, the image capturing device 16 may acquire a first surface image of the surface 20a at a first time and a second surface image at a second time; these may be two consecutive capturing time points, although this is not a limitation. For convenience of description, please refer to fig. 1, fig. 2A, fig. 2B, fig. 3A and fig. 3B together; fig. 3A is a schematic diagram illustrating a first surface image according to an embodiment of the present application, and fig. 3B is a schematic diagram illustrating a second surface image according to an embodiment of the present application. As shown, at the first time, the spatial light modulator 14 may modulate the test light into the first structured light, so that the image capturing device 16 can photograph the surface 20a illuminated by the first structured light to obtain the first surface image 3. In practice, since the first structured light may carry a specific pattern, if the positioning mark 202 on the surface 20a were also covered by that pattern, it might not be easy to read in the first surface image 3. Therefore, the spatial light modulator 14 of the present embodiment controls the pixels 140 in the first pixel group 142 so that they do not modulate the test light, and the positioning mark 202 on the surface 20a is not projected with the specific pattern.
In other words, since the pixels 140 in the first pixel group 142 do not modulate the test light, the positioning mark 30 in the first surface image 3 is not projected with the first structured light, so that the positioning mark 30 can be easily and clearly read. In addition, since the pixels 140 other than the first pixel group 142 modulate the test light normally, other portions (e.g., the device under test 200) on the surface 20a of the object 20 are projected with the first structured light, so that the object 20 can be detected normally.
Similarly, at the second time, the spatial light modulator 14 may modulate the test light into the second structured light, so that the image capturing device 16 can photograph the surface 20a illuminated by the second structured light to obtain the second surface image 4. The second structured light may differ from the first structured light in phase or in pattern; the present embodiment is not limited in this respect. As described above, the spatial light modulator 14 of the present embodiment also controls the pixels 140 in the first pixel group 142 so that they do not modulate the test light, and the positioning mark 202 on the surface 20a is not projected with the second structured light. Since the pixels 140 in the first pixel group 142 do not modulate the test light, the positioning mark 40 in the second surface image 4 can be easily and clearly read.
In a practical example, when the image capturing device 16 captures the first surface image 3 and the second surface image 4 in sequence, the object 20 may shift relative to the image capturing device 16 due to undesirable factors. In this case, the processing unit 17 of the present embodiment may receive the first surface image 3 and the second surface image 4 and align them using the positioning mark 30 in the first surface image 3 and the positioning mark 40 in the second surface image 4. Referring to fig. 4, fig. 4 is a schematic diagram illustrating a positioning mark offset according to an embodiment of the application. As shown, assuming the positioning mark 30 in the first surface image 3 defines a center point A, the processing unit 17 may, for convenience of computation, assign the center point A coordinates (a1, a2). Similarly, the processing unit 17 can find the positioning mark 40 in the second surface image 4 and assign its center point B coordinates (b1, b2) in the same coordinate system.
When the coordinates of the center point A and the center point B differ, it can be concluded that the image capturing device 16 or the object 20 vibrated between the first time and the second time, so that the object 20 shifted relative to the image capturing device 16. The processing unit 17 may first determine the degree of offset between the center points A and B (i.e. the first offset value), which may be b1-a1 in the first direction and b2-a2 in the second direction. Taking the first surface image 3 as the reference, the processing unit 17 may then translate the second surface image 4 according to the first offset value: the entire second surface image 4 is moved by -(b1-a1) in the first direction and by -(b2-a2) in the second direction, so that the second surface image 4 overlaps the first surface image 3. In this way, the processing unit 17 completes the alignment of the first surface image 3 and the second surface image 4.
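The offset calculation and compensation described above can be sketched as follows. This is a minimal illustration under assumptions not stated in the patent: the helper names `first_offset` and `compensate` are hypothetical, and integer-pixel `np.roll` stands in for the translation — a real system would likely crop or interpolate sub-pixel shifts instead of wrapping.

```python
import numpy as np

def first_offset(center_a, center_b):
    """First offset value of mark center B relative to A: (b1-a1, b2-a2)."""
    return (center_b[0] - center_a[0], center_b[1] - center_a[1])

def compensate(image, offset):
    """Translate the second surface image by the negative offset so that it
    overlaps the first surface image."""
    dy, dx = offset
    return np.roll(np.roll(image, -dy, axis=0), -dx, axis=1)

img2 = np.zeros((8, 8), dtype=int)
img2[5, 6] = 1                       # mark center B at (5, 6)
off = first_offset((3, 4), (5, 6))   # mark center A was at (3, 4)
aligned = compensate(img2, off)      # mark moves back to (3, 4)
```

After compensation, the mark in the second image sits at the same coordinates as in the first image, so the two images can be overlapped directly.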
In one example, the processing unit 17 need not calculate the offset of the positioning marks 30 and 40 from the center points A and B; it may instead select other conveniently defined points or lines to calculate the first offset value. In addition, if the object 20 also rotates relative to the image capturing apparatus 16, the processing unit 17 may calculate the respective orientations of the positioning marks 30 and 40 from a plurality of points, from a point and a line, or from a pair of lines, to determine their degrees of rotation. In other words, the first offset value may express not only translation but also rotation; the present embodiment is not limited in this respect.
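One way to recover the rotational component mentioned above is to compare the orientation of a line defined by two reference points on the mark in each capture. This is an illustrative sketch, not the patent's method; the function name and the two-point parameterization are assumptions.

```python
import math

def mark_rotation(p_a, q_a, p_b, q_b):
    """Rotation (radians) of the reference line P-Q between two captures.

    p_a, q_a -- two reference points on the mark in the first image
    p_b, q_b -- the same two points located in the second image
    """
    ang1 = math.atan2(q_a[1] - p_a[1], q_a[0] - p_a[0])
    ang2 = math.atan2(q_b[1] - p_b[1], q_b[0] - p_b[0])
    return ang2 - ang1

# Segment (0,0)-(1,0) rotated to (0,0)-(0,1): a quarter turn.
theta = mark_rotation((0, 0), (1, 0), (0, 0), (0, 1))
```

The returned angle can then be applied as a rotation about the mark center, together with the translational offset, to register the second image against the first.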
In addition, since the capturing range of the image capturing apparatus 16 is fixed, if the object 20 shifts relative to the image capturing apparatus 16 between the two captures, those skilled in the art will understand that the two captured ranges differ slightly. That is, the first surface image 3 may correspond to a first image capturing range of the surface 20a, and the second surface image 4 may correspond to a second image capturing range, the first image capturing range being unequal to the second. Since the two ranges are unequal, the processing unit 17 may also crop the first surface image 3 and the second surface image 4, preserving only the image content in the overlapping portion of the two ranges (the intersection capturing range). In practice, even if the object 20 shifts relative to the image capturing device 16, the shift is not large and is mostly within acceptable levels. The cropped portions of the first surface image 3 and the second surface image 4 therefore belong to the edges of the image frame and do not affect the inspection of the component 200 to be tested.
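The intersection capturing range can be computed as a simple rectangle intersection. The sketch below is illustrative; the `(top, left, bottom, right)` encoding with exclusive lower-right bounds is an assumed convention, not one specified by the patent.

```python
def intersection_range(range1, range2):
    """Overlap of two image capturing ranges.

    Each range is (top, left, bottom, right) with bottom/right exclusive.
    Returns the intersection capturing range, or None if there is no overlap.
    """
    top = max(range1[0], range2[0])
    left = max(range1[1], range2[1])
    bottom = min(range1[2], range2[2])
    right = min(range1[3], range2[3])
    if top >= bottom or left >= right:
        return None
    return (top, left, bottom, right)

# Second capture shifted by (2, 3) pixels relative to the first.
roi = intersection_range((0, 0, 100, 100), (2, 3, 102, 103))
```

Cropping both surface images to `roi` discards only a thin border, consistent with the observation that the shift is small relative to the frame.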
In one example, after the processing unit 17 completes the alignment of the first surface image 3 and the second surface image 4, it may also combine them, so that the combined surface image has a higher resolution and represents more complete details. Of course, the present embodiment does not limit the processing unit 17 to aligning only two surface images; for example, if the image capturing device 16 continuously captures a plurality of surface images, the processing unit 17 may align or combine them sequentially.
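A combination step over a stack of aligned images can be sketched as a pixel-wise average. This is a deliberately simplified stand-in: the patent does not specify the synthesis, and a real structured-light system would apply its own reconstruction (e.g. phase-shift processing) rather than plain averaging.

```python
import numpy as np

def combine_surface_images(images):
    """Pixel-wise average of a list of already-aligned surface images.

    Averaging suppresses per-shot noise; it is an assumed placeholder for
    whatever synthesis the measurement system actually performs.
    """
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    return stack.mean(axis=0)

combined = combine_surface_images([np.full((4, 4), 10.0),
                                   np.full((4, 4), 14.0)])
```

Because the images were aligned via the positioning mark first, corresponding pixels describe the same point on the surface, which is what makes a pixel-wise combination meaningful.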
It should be noted that the processing unit 17 does not have to store the correspondence between the pixels 140 and the surface 20a in advance, and does not have to store the positions of the positioning marks 202 in advance. For example, if different objects are detected from time to time, the processing unit 17 may cause the image capturing device 16 to capture an initial surface image of the surface 20a of the object 20 before the formal test. Then, the processing unit 17 can calculate the position of the positioning mark 202 in the initial surface image, so as to generate a calculation result. The processing unit 17 may select the first pixel group 142 from the plurality of pixels 140 according to the calculation result, so that the first pixel group 142 corresponds to the position of the positioning mark in the initial surface image.
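Locating the positioning mark in a low-resolution initial surface image, as described above, can be sketched as a centroid of bright pixels. This is an illustrative assumption — a real system might use template matching or another detector — and the function name and threshold approach are hypothetical.

```python
import numpy as np

def locate_mark(initial_image, threshold):
    """Estimate the positioning-mark center in the initial surface image
    as the centroid of above-threshold pixels; returns (row, col) or None
    if no pixel exceeds the threshold."""
    rows, cols = np.nonzero(np.asarray(initial_image) > threshold)
    if rows.size == 0:
        return None
    return (int(round(rows.mean())), int(round(cols.mean())))

img = np.zeros((20, 20))
img[8:11, 4:7] = 255          # bright 3x3 mark centred at (9, 5)
center = locate_mark(img, threshold=128)
```

The resulting coordinates, mapped through the stored pixel-to-surface correspondence, determine which pixels 140 to assign to the first pixel group 142.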
For further explanation of the steps of the optical measurement method of the present application, please refer to fig. 1 to 5; fig. 5 is a flowchart illustrating the steps of the optical measurement method according to an embodiment of the present application. As shown in the figure, in step S50, the light source 10 provides an input light. In step S52, the spatial light modulator 14 is provided on the optical path of the input light; the spatial light modulator 14 has a plurality of pixels 140 for selectively modulating the input light into the test light. In step S54, the processing unit 17 may control the spatial light modulator 14 such that the first pixel group 142 of the plurality of pixels 140 does not modulate the input light, the first pixel group 142 corresponding to the position of the positioning mark 202 on the surface 20a of the object 20. Next, in step S56, the image capturing apparatus 16 acquires the first surface image 3 and the second surface image 4 of the surface 20a at the first time and the second time, respectively. Finally, in step S58, the processing unit 17 may align the first surface image 3 and the second surface image 4 according to the positioning marks 30 and 40 in the two images. Other details of the steps of the optical measurement method are described in the foregoing embodiments and are not repeated here.
In summary, in the optical measurement method provided by the present application, the structured light projected onto the surface of the object avoids the positioning mark. The method can then use the positioning mark to align and overlap surface images captured at multiple times; because the mark is not obscured by the structured light, the subsequent alignment and correction procedures are easier to perform.
The above examples and/or embodiments merely illustrate preferred implementations of the present technology and are not intended to limit it in any form. Any person skilled in the art may make changes or modifications toward equivalent embodiments using the technical content disclosed above without departing from its scope; any embodiment that does not depart from the technical content of the present application remains within the scope of the present technology.

Claims (8)

1. An optical measurement method for measuring a surface of an object, the optical measurement method comprising:
providing an input light;
providing a spatial light modulator on an optical path of the input light, the spatial light modulator having a plurality of pixels for selectively modulating the input light into a test light;
controlling the spatial light modulator to make a first pixel group in the pixels not modulate the input light, wherein the first pixel group corresponds to the position of a positioning mark on the surface;
acquiring a first surface image and a second surface image of the surface at a first time and a second time respectively; and
aligning the first surface image and the second surface image according to the positioning marks in the first surface image and the second surface image.
2. The method of claim 1, wherein aligning the first surface image and the second surface image according to the positioning marks in the first surface image and the second surface image comprises:
positioning the positioning mark in the first surface image to generate a first coordinate;
positioning the positioning mark in the second surface image to generate a second coordinate;
comparing the first coordinate with the second coordinate, and calculating a first offset value; and
compensating the second surface image according to the first offset value so that the second surface image is aligned with the first surface image.
3. The method of claim 1, wherein the first surface image covers a first image capturing range of the surface, the second surface image covers a second image capturing range of the surface, and the first image capturing range is not equal to the second image capturing range.
4. The method of claim 3, wherein aligning the first surface image and the second surface image further comprises:
setting the overlapping area of the first image capturing range and the second image capturing range on the surface as an intersection image capturing range; and
removing the data of the first surface image and the second surface image outside the intersection image capturing range.
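The intersection-and-crop operation of claims 3 and 4 can be sketched as follows. The (left, top, right, bottom) rectangle representation and the function name are illustrative assumptions, not part of the claimed method.

```python
def intersect_ranges(r1, r2):
    """Intersect two image-capture ranges, each given as (left, top, right, bottom).

    Returns the intersection rectangle, or None when the two ranges do
    not overlap.  Image data outside this rectangle would be removed
    from both surface images before alignment.
    """
    left, top = max(r1[0], r2[0]), max(r1[1], r2[1])
    right, bottom = min(r1[2], r2[2]), min(r1[3], r2[3])
    if right <= left or bottom <= top:
        return None  # the two capture ranges do not overlap
    return (left, top, right, bottom)
```

For instance, capture ranges (0, 0, 10, 10) and (3, 2, 12, 8) yield the intersection (3, 2, 10, 8), and only that region is kept in both images.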
5. The optical measurement method according to claim 1, further comprising:
pre-acquiring an initial surface image of the surface;
calculating the position of the positioning mark in the initial surface image so as to generate a calculation result; and
selecting the first pixel group from the pixels according to the calculation result, so that the first pixel group corresponds to the position of the positioning mark in the initial surface image.
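The selection of the first pixel group in claim 5 (grown into the larger, mark-centred non-projection area of claims 6 and 7) can be sketched as a boolean mask over the spatial light modulator's pixels. The 1:1 mapping between image and SLM coordinates and the `margin` parameter are illustrative assumptions; a real system would apply a calibrated mapping between camera and modulator coordinates.

```python
import numpy as np

def first_pixel_group(slm_shape, mark_center, mark_size, margin=2):
    """Return a boolean mask over the SLM pixels.

    True marks the first pixel group: pixels left unmodulated so that the
    structured light avoids the positioning mark.  The non-projection
    area is grown by `margin` pixels on each side, so it is larger than
    the mark itself and centred on the mark's position.
    """
    mask = np.zeros(slm_shape, dtype=bool)
    r, c = mark_center
    hr = mark_size[0] // 2 + margin
    hc = mark_size[1] // 2 + margin
    mask[max(0, r - hr):r + hr + 1, max(0, c - hc):c + hc + 1] = True
    return mask
```

All pixels outside the mask would still modulate the input light into the test light, so the structured-light pattern is projected everywhere except a small window around the positioning mark.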
6. The method of claim 5, wherein the first group of pixels corresponds to a non-projection area on the surface, the non-projection area having an area larger than an area of the positioning mark.
7. The method of claim 6, wherein a center position of the non-projection area is substantially equal to a center position of the positioning mark.
8. The method of claim 1, wherein the pixels of the spatial light modulator correspond to a first pattern at the first time and the pixels of the spatial light modulator correspond to a second pattern at the second time.
CN201911163770.7A 2019-11-25 2019-11-25 Optical measuring method Active CN112838018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911163770.7A CN112838018B (en) 2019-11-25 2019-11-25 Optical measuring method

Publications (2)

Publication Number Publication Date
CN112838018A CN112838018A (en) 2021-05-25
CN112838018B true CN112838018B (en) 2023-09-15

Family

ID=75922138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911163770.7A Active CN112838018B (en) 2019-11-25 2019-11-25 Optical measuring method

Country Status (1)

Country Link
CN (1) CN112838018B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09312250A (en) * 1996-05-22 1997-12-02 Nikon Corp Position alignment method
JPH1012520A (en) * 1996-06-21 1998-01-16 Nikon Corp Projection aligner
US6288830B1 (en) * 1998-05-13 2001-09-11 Ricoh Microelectronics Company, Ltd. Optical image forming method and device, image forming apparatus and aligner for lithography
KR20040010091A (en) * 2002-07-25 2004-01-31 주식회사 솔루션닉스 Apparatus and Method for Registering Multiple Three Dimensional Scan Data by using Optical Marker
TW201131138A (en) * 2010-03-10 2011-09-16 Ind Tech Res Inst Surface measure device, surface measure method thereof and correction method thereof
TW201329445A (en) * 2012-01-02 2013-07-16 Shen-Jwu Su Method and system of material torsion testing
CN104634788A (en) * 2013-11-07 2015-05-20 财团法人工业技术研究院 Image positioning method and device
TW201521672A (en) * 2013-12-13 2015-06-16 Crystalvue Medical Corp Optical device for measuring corneal and method for measuring corneal
CN105890546A (en) * 2016-04-22 2016-08-24 无锡信捷电气股份有限公司 Structured light three-dimensional measurement method based on orthogonal Gray code and line shift combination
TW201708984A (en) * 2015-06-05 2017-03-01 Asml荷蘭公司 Alignment system
CN107407894A (en) * 2014-12-24 2017-11-28 株式会社尼康 Measurement apparatus and measuring method, exposure device and exposure method and device making method
CN108240800A (en) * 2016-12-23 2018-07-03 致茂电子(苏州)有限公司 The method for measurement of surface topography
CN109655232A (en) * 2017-10-12 2019-04-19 致茂电子(苏州)有限公司 Optical measurement device
TWI668439B (en) * 2018-11-26 2019-08-11 致茂電子股份有限公司 Method of measuring surface topography

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010125813A1 (en) * 2009-04-30 2010-11-04 株式会社ニコン Exposure method, method for manufacturing device, and method for measuring superposition error




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant