KR101190125B1 - Method of three dimensional mesurement - Google Patents


Info

Publication number
KR101190125B1
Authority
KR
South Korea
Prior art keywords
area
measurement
image
measurement objects
substrate
Prior art date
Application number
KR1020100035255A
Other languages
Korean (ko)
Other versions
KR20110115752A (en)
Inventor
권달안
Original Assignee
주식회사 고영테크놀러지
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 고영테크놀러지
Priority to KR1020100035255A
Priority to US13/086,879
Priority to CN201110096894.5A
Priority to JP2011091979A
Publication of KR20110115752A
Application granted
Publication of KR101190125B1
Priority to JP2014139443A
Priority to JP2016021646A

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)

Abstract

Disclosed is a three-dimensional shape measurement method that can improve inspection accuracy. The method includes: irradiating light toward a substrate on which a plurality of measurement objects are formed, receiving the light reflected from the substrate, and acquiring an image; determining, within an inspection area of the acquired image, a first object area in which a first measurement object is located, a second object area in which a second measurement object is located, and a bottom area that excludes the first and second object areas; irradiating pattern light toward the substrate, receiving the pattern light reflected from the substrate, and acquiring a pattern image; and obtaining a phase at each point of the acquired pattern image and obtaining the phases of the plurality of measurement objects using the phases of the first and second object areas and the phase of the bottom area.

Description

3D shape measurement method {METHOD OF THREE DIMENSIONAL MESUREMENT}

The present invention relates to a three-dimensional shape measurement method, and more particularly, to a three-dimensional shape measurement method using a phase-shift technique based on a moiré pattern.

Electronic equipment has developed remarkably, and its weight and size have been progressively reduced. As a result, the likelihood of errors arising during the manufacture of such equipment has increased, and the equipment used to detect these errors has been improved accordingly.

In recent years, three-dimensional shape measurement technology has not been limited to the engineering field but has expanded into a wide range of applications. Among three-dimensional shape measurement technologies capable of satisfying the needs of these fields, contact measurement using a coordinate measuring machine (CMM) has long been available, and active research is now in progress on non-contact three-dimensional shape measurement technologies based on optical theory.

In the three-dimensional shape measuring method using the moiré phenomenon, one of the typical non-contact measuring methods, a grid pattern whose phase is shifted is examined to detect the height at each point on the x-y plane, thereby measuring the three-dimensional shape.

In more detail, the grid pattern is irradiated onto the measurement area (FOV), which contains the inspection area (ROI) in which the measurement object is formed. In the conventional method, the object area in which the measurement object is actually formed is not located within the inspection area; instead, the measurement object is assumed to be formed at its position in the CAD data.

However, when the substrate is warped or otherwise deformed, the actual position does not coincide with the position in the CAD data, and the accuracy of the inspection suffers.

In particular, as shown in FIG. 1, when the object is a printed circuit board for a semiconductor having a plurality of bumps, the measurement objects are densely formed; in this case it is not easy to identify the bottom region outside the object regions, and repeatability is poor.

The problem to be solved by the present invention is to provide a three-dimensional shape measurement method that can improve the accuracy and repeatability of the inspection.

To solve this problem, a three-dimensional shape measuring method according to an exemplary embodiment of the present invention includes: irradiating light toward a substrate on which a plurality of measurement objects are formed, receiving the light reflected from the substrate, and acquiring an image; determining, within an inspection area of the acquired image, a first object area in which a first measurement object is located, a second object area in which a second measurement object is located, and a bottom area that excludes the first and second object areas; irradiating pattern light toward the substrate, receiving the pattern light reflected from the substrate, and acquiring a pattern image; and obtaining a phase at each point of the acquired pattern image and obtaining the phases of the plurality of measurement objects using the phases of the first and second object areas and the phase of the bottom area.

According to another exemplary embodiment of the present invention, a three-dimensional shape measuring method includes: irradiating light toward a substrate on which a plurality of measurement objects are formed, receiving the light reflected from the substrate, and acquiring an image; determining, within an inspection area of the acquired image, an object area in which the measurement objects are located and a bottom area that excludes the object area; irradiating pattern light toward the substrate in at least two directions and acquiring at least two pattern images reflected from the substrate; and obtaining a phase at each point of each of the acquired pattern images and obtaining the phase of each of the plurality of measurement objects using the phase of the bottom area.

Meanwhile, determining the object area in which the plurality of measurement objects are located and the bottom area excluding the object area within the inspection area of the acquired image may include: obtaining the intensity at each point of the acquired image; forming a gradation histogram having the intensity as a first axis and the number of points corresponding to each intensity as a second axis; obtaining the boundary between the object area and the bottom area from the gradation histogram; and determining the object area and the bottom area accordingly.

Obtaining the boundary between the object area and the bottom area from the gradation histogram may include determining the intensity range having the minimum frequency in the gradation histogram as the boundary between the object area and the bottom area, or determining as the boundary a region in which the area determined as the object area has been expanded.

Alternatively, obtaining the boundary between the object area and the bottom area from the gradation histogram may include determining the boundary found by an Otsu algorithm as the boundary between the object area and the bottom area, or determining as the boundary a region in which the area determined as the object area has been expanded.

Meanwhile, the method may further include removing noise from the phases in consideration of positional information based on reference data, the sizes of the object areas, and their correlation with the surroundings.

In addition, irradiating light toward the substrate on which the plurality of measurement objects are formed and receiving the reflected light to obtain an image may include irradiating a plurality of color illuminations toward the substrate and receiving the light reflected from the substrate to obtain a plurality of color images.

In this case, the color images may be obtained by irradiating red, green, and blue light, respectively, to acquire a red image, a green image, and a blue image, or by irradiating at least two of the lights together to acquire an image.

Determining the object area in which the plurality of measurement objects are located and the bottom area excluding the object area within the inspection area may include: acquiring, at each point of the acquired image, color information that is distinguished according to material; generating a color information map that stores the color information distinguished according to material; and separating the object area from the bottom area using the color information map, thereby determining the object area and the bottom area.

In this case, the color information map may be a saturation map, and generating the saturation map may include obtaining hue, saturation, and intensity information for each color through color-coordinate conversion of the color images, and generating the saturation map using the saturation information for each color.

The inspection area may be an entire area of an image obtained by receiving light reflected from a substrate on which the plurality of measurement objects are formed.

In addition, the same phase of the bottom region may be applied to the plurality of measurement objects.

According to the present invention, when the position of a measurement object is unclear because of warpage of the substrate, displacement of the solder formation position, or excessive or insufficient solder, the boundary between the measurement object and the substrate surface is made clear using an actually measured two-dimensional image, and the reliability of the bottom phase is improved by averaging the measured phases of the bottom area, so that the accuracy of the three-dimensional shape measurement can be improved.

In addition, the phase of the inspection area is obtained in at least two directions, and the phase of the bottom area is matched across the phases obtained in the two or more directions, thereby improving the accuracy of the inspection. Furthermore, because the bottom area determined by the two-dimensional method is reused for the data measured in the various directions, the inspection area and the bottom area do not need to be distinguished separately for each direction, so the inspection time can be shortened.

In addition, when the measurement objects are densely packed in the inspection area, it is not easy to set an inspection area for each measurement object, and when the bottom area is narrow its reliability is low; in such cases, the phase of the bottom area is acquired over the entire measurement area, so that repeatability can be improved.

FIG. 1 is a plan view illustrating a semiconductor in which measurement objects are densely present.
FIG. 2A is a schematic diagram illustrating a measurement area observed from a photographing unit of a three-dimensional shape measuring device using a phase-shift method based on a moiré pattern.
FIG. 2B is a schematic diagram illustrating a case where a plurality of measurement objects are formed in the inspection area when the inspection area is the same as the measurement area.
FIG. 3 is a schematic view showing a cross section taken along line I-I' shown in FIG. 2A.
FIG. 4 is a schematic phase diagram of the measurement object of FIG. 3.
FIG. 5 is a phase diagram in which the phase shown in FIG. 4 has been corrected.
FIG. 6 is a schematic diagram of a gradation histogram showing intensity and frequency.
FIGS. 7A and 7B are schematic diagrams of phase histograms showing phase and frequency, before and after noise removal, respectively.

The present invention may be modified in various ways and take various forms, and specific embodiments are illustrated in the drawings and described in detail in the text. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed, but includes all modifications, equivalents, and alternatives falling within its spirit and scope.

The terms first, second, etc. may be used to describe various elements, but the elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, the terms "comprise" or "have" are intended to indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and should be understood not to preclude in advance the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art.

Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meanings in the context of the related art, and shall not be construed in an idealized or excessively formal sense unless expressly so defined in this application.

Hereinafter, for the understanding of the present invention, a three-dimensional shape measuring method by a phase shift method using a moire pattern will be briefly described.

In the phase-shift method using a moiré pattern, a grid pattern image is irradiated onto the measurement object, and the grid image reflected from the measurement object is observed to measure the three-dimensional shape.

In this case, the surface of the substrate is taken as the x-y plane, and the intensity value of the light corresponding to each x, y coordinate may be expressed by Equation 1 below.

$$I_k = D\left[1 + \gamma \cos\left(\Phi + \delta_k\right)\right] \qquad \text{(Equation 1)}$$

where $\delta_k$ is the phase shift of the $k$-th grid pattern.

In this equation, I is the intensity of the measured light, D is the DC light intensity (a function of the illumination intensity and the reflectivity of the object), and γ is the visibility, a function of the reflectance of the object and the grating period. For example, in the case of a four-bucket algorithm, the subscript k takes the values 1, 2, 3, and 4, corresponding to phase shifts of 0, 90, 180, and 270 degrees, respectively.

From this equation, Equation 2 can be obtained.

$$\Phi = \tan^{-1}\left(\frac{I_4 - I_2}{I_1 - I_3}\right) \qquad \text{(Equation 2)}$$

Meanwhile, the phase Φ and the height h have a proportional relationship as shown in Equation 3 below.

$$h = \frac{\Lambda}{2\pi}\,\Phi \qquad \text{(Equation 3)}$$

Here, Λ is the equivalent wavelength.

Using the equations described above, the grid pattern is irradiated onto the object to obtain the phase value Φ corresponding to each x, y coordinate, and the height h corresponding to each x, y coordinate is obtained from the phase value, so that the three-dimensional shape of the measurement object can be obtained.
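As a hedged illustration of Equations 1 through 3, the four-bucket phase recovery and the phase-to-height conversion can be sketched in Python as follows (the function names and the NumPy-based formulation are this sketch's own, not part of the patent):

```python
import numpy as np

def four_bucket_phase(i1, i2, i3, i4):
    """Wrapped phase from four fringe images shifted by 0, 90, 180 and 270
    degrees, i.e. I_k = D * (1 + gamma * cos(phi + (k - 1) * pi / 2)).
    Implements Equation 2: phi = arctan((I4 - I2) / (I1 - I3))."""
    # arctan2 resolves the quadrant and tolerates a zero denominator.
    return np.arctan2(i4 - i2, i1 - i3)

def phase_to_height(phi, equivalent_wavelength):
    """Equation 3: h = (Lambda / (2 * pi)) * phi."""
    return equivalent_wavelength * phi / (2.0 * np.pi)
```

Applied per pixel over four captured pattern images, this yields the phase map from which heights are computed.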

FIG. 2A is a schematic diagram illustrating a measurement area observed from a photographing unit of a three-dimensional shape measuring device using a phase-shift method based on a moiré pattern. FIG. 2B is a schematic diagram illustrating a case where a plurality of measurement objects are formed in the inspection area when the inspection area is the same as the measurement area. FIG. 3 is a schematic view showing a cross section taken along line I-I' shown in FIG. 2A.

Referring to FIGS. 2A and 3, a field of view (FOV) 100 observed from the photographing unit generally includes a plurality of regions of interest (ROI) 110, and each such inspection region 110 may include an object region 111 in which a measurement object is formed and a bottom region 112 outside the object region 111. Alternatively, as shown in FIG. 2B, the inspection area 110 may be the same as the measurement area 100 of FIG. 2A, and a plurality of measurement objects may be formed in the inspection area 110, so that a plurality of object areas 111 are included.

Meanwhile, as illustrated in FIG. 3, the grating pattern is irradiated obliquely onto the measurement object O, as in the first direction D1 or the second direction D2. For this reason, there are cases where the three-dimensional shape of a measurement object cannot be measured correctly. In more detail, when the grid pattern is irradiated in the first direction D1, a shadow area may be generated on the right side of the measurement object O; conversely, when the grid pattern is irradiated in the second direction D2, a shadow area may be generated on the left side of the measurement object O. In order to accurately measure the shape of the measurement object O in the shadow areas, the phase is measured from both sides of the measurement object O. Pattern light may additionally be irradiated in other directions to improve the accuracy of the inspection.

FIG. 4 is a schematic phase diagram of the measurement object of FIG. 3.

Referring to FIG. 4, an error S1 may occur on the right side of the phase observed in the first direction D1 of FIG. 3, and an error S2 may occur on the left side of the phase observed in the second direction D2. Thus, by correcting the phases observed from the two directions of the measurement object and making the object area clear, accurate height detection of the measurement object is possible.
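The idea of combining the two directional measurements, taking each side of the object from the direction that does not shadow it, can be sketched as follows. This is a sketch under the assumption that per-pixel shadow masks are available; the patent does not prescribe this exact merge rule:

```python
import numpy as np

def merge_directional_phases(phi_d1, phi_d2, shadow_d1, shadow_d2):
    """Combine phase maps measured from two projection directions.

    shadow_d1 / shadow_d2 are boolean masks marking shadowed (unreliable)
    pixels in each map. Where both directions are valid, the two phases
    are averaged; where one direction is shadowed, the other is used.
    """
    return np.where(shadow_d1, phi_d2,
                    np.where(shadow_d2, phi_d1, (phi_d1 + phi_d2) / 2.0))
```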

Meanwhile, since the phase-shift method using the moiré pattern measures relative height rather than absolute height, as shown in FIG. 4, the bottom phase Φ_A of the bottom area measured in the first direction D1 and the bottom phase Φ_B of the bottom area measured in the second direction D2 may differ from each other, and this must be corrected.

To this end, first, within the inspection area 110, which is a part of the measurement area 100, the object area 111 in which the measurement object is located is distinguished from the bottom area 112 outside the object area. A method of distinguishing the object area 111 from the bottom area 112 will be described in detail later.

Thereafter, the phases at the points (x, y) of the bottom area are averaged to obtain a bottom phase, and the bottom phase is used to shift the phase of the object area. The phase diagram shown in FIG. 5 is obtained by removing the shadow areas from the shifted phase, and the height at each point (x, y) is obtained using Equation 3 above to measure the three-dimensional shape of the measurement object.
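The bottom-phase averaging and shifting step just described can be sketched as follows (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def shift_by_floor_phase(phase_map, floor_mask):
    """Average the phase over the bottom (floor) region and subtract it,
    so that heights are measured relative to the substrate surface.

    phase_map  : 2-D array of phases for one projection direction
    floor_mask : boolean 2-D array, True where the pixel belongs to the floor
    """
    floor_phase = phase_map[floor_mask].mean()  # averaged bottom phase
    return phase_map - floor_phase
```

Applying this separately to the maps from each projection direction brings their bottom phases to a common zero, which is the correction motivated by Φ_A differing from Φ_B.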

In an embodiment of the present invention, in order to distinguish the object area from the bottom area, a two-dimensional image of the substrate on which the measurement objects are formed is acquired. Such a two-dimensional image may be, for example, a white monochrome image or a plurality of color images.

To this end, a pad-referencing process may optionally be performed first, in which characteristic features on the substrate (e.g., an unusual shape of a wiring pattern) are used to set the inspection area accurately through quantitative compensation for the distortion between the reference data and the measured image. That is, using the design or manufacturing data, the object area may be provisionally determined in the single monochrome image or the plurality of color images described later, and then a process of distinguishing the object area and the bottom area more accurately may be performed. When the pad-referencing process is performed in this way, the approximate positions of the object area and the bottom area are known in advance, which reduces the time required to determine them.

First, a method of distinguishing an object region and a bottom region using a white monochrome image will be described.

First, white light is irradiated onto the inspection area or the measurement area, and the intensity at each point is obtained. Preferably, the intensity is obtained over the entire measurement area. When the bottom phase of the entire measurement area is obtained in this way, the same bottom phase may be applied when another inspection area inside the measurement area is inspected, thereby improving repeatability. Furthermore, when the object area within an inspection area is considerably larger than the bottom area, the accuracy of the bottom phase may be degraded; by measuring the bottom phase over the entire measurement area, its accuracy may be improved.

Thereafter, a gradation histogram having the intensity as a first axis and the number of points corresponding to each intensity as a second axis is formed, and the boundary between the object area and the bottom area is obtained from the gradation histogram.

First, in order to obtain the intensity at each point of the measurement area, light is irradiated from above the measurement object toward the measurement object, the light reflected from the measurement object is received, and the intensity at each point of the measurement area is measured.

Meanwhile, when irradiating light onto a measurement object, it is important that the light be radiated evenly over the entire measurement area. This is because when the reflected light is uneven and the intensity is locally high or low, the accuracy may be reduced. As one method, the measurement object, the light-receiving portion, and the light source may be arranged on a vertical coaxial line, or the light may be irradiated toward the measurement area from light sources arranged concentrically above it.

The intensity at each point (x, y) of the measurement area measured in this way is used to form the gradation histogram, with the intensity as the first axis and the number of points corresponding to each intensity as the second axis.
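Forming the gradation histogram from the measured intensities might look like the following minimal sketch using NumPy (the bin count of 256 levels is an assumption, not stated in the patent):

```python
import numpy as np

def gradation_histogram(gray_image, levels=256):
    """Gradation (gray-level) histogram: intensity on the first axis,
    number of points with that intensity on the second axis."""
    hist, _ = np.histogram(gray_image.ravel(), bins=levels, range=(0, levels))
    return hist
```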

FIG. 6 is a schematic diagram of a gradation histogram showing intensity and frequency, and FIGS. 7A and 7B are schematic diagrams of phase histograms showing phase and frequency.

Referring to FIG. 6, one of region A, which has low intensity, and region B, which has high intensity, corresponds to the object area, and the other corresponds to the bottom area. When the measurement object has a higher reflectance than the bottom area, for example, when solder (the measurement object) is formed on a PCB substrate (the bottom area), the intensity of the measurement object is high and the intensity of the bottom area is low; in this case, region A corresponds to the bottom area and region B to the object area. Conversely, when the measurement object has a lower reflectance than the bottom area, the intensity of the measurement object is low and that of the bottom area is high; in this case, region B corresponds to the bottom area and region A to the object area.

In this case, region C, where the frequency is low, between low-intensity region A and high-intensity region B, may correspond to the boundary between the object area and the bottom area.

The area corresponding to the CAD data, or to the opening of the stencil used for forming the measurement object, may also correspond to the boundary between the object area and the bottom area.

Referring to FIG. 7A, the method may further include removing noise from the phases in consideration of positional information based on the reference data, the sizes of the object areas, and their correlation with the surroundings. Using the positional information from the reference data, the silk-screen pattern areas and OCR areas formed on the surface of the printed circuit board by silk-screen printing, as well as the holes for connecting wiring patterns and the like to the PCB, are removed. In the holes, light is not reflected, while in the silk-screen pattern areas and OCR areas the light is saturated, so these act as noise. It is also possible to remove noise based on the correlation with the surroundings, that is, at places where there is a sudden change compared with the surroundings.

Referring to FIG. 7B, the range corresponding to the intensity with the minimum frequency in the gradation histogram is determined as the boundary between the object area and the bottom area, the bottom area is determined, and the phases in the bottom area A' are averaged to obtain the bottom phase. Alternatively, in consideration of the shadow areas, a region in which the object area has been expanded beyond that boundary may be determined as the boundary between the object area and the bottom area; after the bottom area is determined accordingly, the bottom phase may likewise be obtained by averaging the phases in the bottom area A'.
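The two boundary choices just described can be sketched as follows: picking the minimum-frequency gray level between the histogram peaks, and optionally expanding the object mask so that shadowed pixels near an object are not counted as floor. The search window, margin, and the simple 4-neighbor dilation are this sketch's assumptions:

```python
import numpy as np

def boundary_from_histogram(hist, lo, hi):
    """Gray level with the minimum frequency, searched inside [lo, hi),
    taken as the object/floor boundary."""
    return lo + int(np.argmin(hist[lo:hi]))

def expand_object_mask(object_mask, margin=1):
    """Dilate the object mask by `margin` pixels (4-neighborhood), so that
    pixels adjacent to an object, possibly in shadow, leave the floor."""
    out = object_mask.copy()
    for _ in range(margin):
        p = np.pad(out, 1)  # pad with False
        out = (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
               | p[1:-1, :-2] | p[1:-1, 2:])
    return out
```

The floor mask is then the inspection area minus the (expanded) object mask, and the bottom phase is the mean phase over that mask.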

Alternatively, in order to obtain the boundary between the object area and the bottom area from the gradation histogram, the boundary may be determined using an Otsu algorithm, or a region in which the object area has been expanded beyond that boundary may be determined as the boundary between the object area and the bottom area; after the bottom area is determined, the bottom phase may be obtained by averaging the phases in the bottom area A'.

The method for determining the boundary value between the object area and the bottom area by the Otsu algorithm will be described in detail. First, a boundary value T between the object area and the bottom area is estimated. For example, CAD data may be used to estimate the boundary value T, or the point having the minimum frequency in the gradation histogram described above may be used.

Next, the pixels are divided by the estimated boundary value into two groups, the object region G1 and the bottom region G2; the intensity average m1 in the object region and the intensity average m2 in the bottom region are obtained.

From the mean intensity of the object area and the mean intensity of the bottom area, a new boundary value T = (m1 + m2) / 2 is obtained, and the above process is repeated. The process is repeated until the change in the boundary value is smaller than a specified value ε.

This method is an example of an Otsu algorithm, and various Otsu algorithms may be applied.
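The iterative boundary-value procedure of the preceding paragraphs can be sketched as follows. The stopping tolerance `eps` stands in for the specified value ε, and the initial estimate `t0` would come from CAD data or the histogram minimum; both names are this sketch's own:

```python
import numpy as np

def iterative_threshold(intensities, t0, eps=0.5):
    """Split pixels at T into two groups, take the group means m1 and m2,
    update T = (m1 + m2) / 2, and repeat until T changes by less than eps,
    as described in the text."""
    t = float(t0)
    while True:
        g1 = intensities[intensities > t]   # provisional object region
        g2 = intensities[intensities <= t]  # provisional bottom region
        if g1.size == 0 or g2.size == 0:
            return t  # degenerate split: keep the current estimate
        t_new = (g1.mean() + g2.mean()) / 2.0
        if abs(t_new - t) < eps:
            return t_new
        t = t_new
```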

In another embodiment of the present invention, in order to distinguish the object area from the bottom area, a plurality of color lights are irradiated onto the measurement area to obtain a plurality of color images. For example, red, green, and blue illumination are irradiated onto the measurement area in turn, and a red image, a green image, and a blue image corresponding to each color illumination are obtained.

Because the red, green, and blue illuminations have different wavelengths, chromatic aberration causes the red, green, and blue images to have different distributions within the measurement area. To distinguish the bottom area B from the solder S and the wiring pattern P shown in FIG. 1, a color information map in which color information classified according to material is stored can be used; with this map, the bottom area B can easily be distinguished.

For example, using a saturation map, the bottom area B can easily be distinguished from the solder S and the wiring pattern P. In general, since the measurement object, for example the solder S, is close to achromatic, a region having a value close to zero on the saturation map may be determined as an object area. Since the wiring pattern P of the substrate in FIG. 1 is also achromatic, the bottom area B can be distinguished by excluding both the achromatic solder S and the achromatic wiring pattern P.

In order to form such a saturation map, HSI information, consisting of hue, saturation, and intensity for each color, is obtained through color-coordinate transformation of the acquired color images. Since the conversion of RGB information into HSI information can be performed by a well-known method, a detailed description thereof is omitted.

On the other hand, before the color coordinate conversion is performed on the obtained color images, each of the obtained color images may be subjected to an averaging filter to mitigate saturation.

Thereafter, a saturation map is generated using color saturation information of the HSI information.

The saturation map may be generated using the saturation information for each pixel of the red image, the green image, and the blue image. In detail, the saturation map may be generated based on the per-pixel saturation calculated through Equation 4 below.

$$S = 1 - \frac{3\min(R, G, B)}{R + G + B} \qquad \text{(Equation 4)}$$

In Equation 4, R is the value of each pixel in the red image, G is the value of each pixel in the green image, and B is the value of each pixel in the blue image.

The saturation map generated through Equation 4 has values from 0 to 1; the closer a value is to 1, the closer the pixel is to a primary color. Through this method, the bottom area can be determined, and the bottom phase can be obtained by averaging the phases of the bottom area.
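Equation 4 and the resulting saturation map can be sketched as follows, assuming the three color images are given as 2-D arrays; the small `eps` guard against division by zero at black pixels is this sketch's own addition:

```python
import numpy as np

def saturation_map(r, g, b, eps=1e-9):
    """Per-pixel HSI saturation, Equation 4: S = 1 - 3*min(R,G,B)/(R+G+B).
    Achromatic pixels (solder, bare wiring pattern) come out near 0;
    strongly colored pixels come out near 1."""
    total = r.astype(float) + g + b
    return 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / np.maximum(total, eps)
```

Thresholding this map near zero then separates the achromatic solder and wiring pattern from the colored substrate, as described above.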

While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. It will be apparent to those skilled in the art that various modifications and variations can be made without departing from the spirit and scope of the invention. Therefore, the above description and the accompanying drawings should be construed as illustrating, not limiting, the technical spirit of the present invention.

100: measuring area
110: inspection area
111: object area
112: floor area

Claims (12)

1. A three-dimensional shape measurement method, comprising:
irradiating light toward a substrate on which a plurality of measurement objects are formed and receiving the light reflected from the substrate to acquire an image;
determining, within an inspection area of the acquired image, a first object area in which a first measurement object is located, a second object area in which a second measurement object is located, and a bottom area of the inspection area excluding the first and second object areas;
irradiating pattern light toward the substrate and receiving the pattern light reflected from the substrate to acquire a pattern image; and
acquiring a phase at each point of the acquired pattern image, and acquiring phases of the plurality of measurement objects by using the phases of the first and second object areas and the phase of the bottom area.
2. A three-dimensional shape measurement method, comprising:
irradiating light toward a substrate on which a plurality of measurement objects are formed and receiving the light reflected from the substrate to acquire an image;
determining, within an inspection area of the acquired image, an object area in which the measurement objects are located and a bottom area of the inspection area excluding the object area;
irradiating pattern light toward the substrate and receiving the pattern light reflected from the substrate to acquire a pattern image; and
acquiring a phase at each point of the acquired pattern image, and acquiring phases of the plurality of measurement objects by using the phase of the bottom area.
3. The method of claim 1 or claim 2, wherein determining the object area in which the plurality of measurement objects are located and the bottom area of the inspection area excluding the object area comprises:
acquiring intensity information at each point of the acquired image and forming a gray-level histogram, with intensity as a first axis and the number of points having each intensity as a second axis;
obtaining a boundary between the object area and the bottom area from the gray-level histogram; and
determining the object area in which the plurality of measurement objects are located and the bottom area of the inspection area excluding the object area.
4. The method of claim 3, wherein obtaining the boundary between the object area and the bottom area from the gray-level histogram comprises:
determining, as the boundary between the object area and the bottom area, either the area corresponding to the intensity of a minimum-count range in the gray-level histogram, or a boundary obtained by expanding the area determined as the object area.
5. The method of claim 3, wherein obtaining the boundary between the object area and the bottom area from the gray-level histogram comprises:
determining, as the boundary between the object area and the bottom area, either a boundary determined in the gray-level histogram using the Otsu algorithm, or a boundary obtained by expanding the area determined as the object area.
6. The method of claim 1, further comprising removing phase noise in consideration of positional information based on reference data and a correlation between the size of the object areas and their periphery.
7. The method of claim 1, wherein irradiating light toward the substrate and receiving the reflected light to acquire an image comprises irradiating a plurality of color lights toward the substrate on which the plurality of measurement objects are formed and receiving the light reflected from the substrate to acquire a plurality of color images.
8. The method of claim 7, wherein acquiring the color images comprises irradiating red, green, and blue lights separately to acquire a red image, a green image, and a blue image, or irradiating at least two of the lights together.
9. The method of claim 2, wherein determining the object area in which the plurality of measurement objects are located and the bottom area of the inspection area excluding the object area comprises:
acquiring, at each point of the acquired image, color information classified according to material, and generating a color information map storing the acquired color information;
dividing the object area from the bottom area by using the color information map; and
determining the object area in which the plurality of measurement objects are located and the bottom area of the inspection area excluding the object area.
10. The method of claim 9, wherein the color information map is a saturation map, and generating the saturation map comprises:
acquiring hue, saturation, and intensity information for each color through a color coordinate transformation of the color images; and
generating the saturation map using the saturation information for each color.
11. The method of claim 1 or claim 2, wherein the inspection area is the entire area of the image acquired by receiving the light reflected from the substrate on which the plurality of measurement objects are formed.
12. The method of claim 1 or claim 2, wherein the phases of the plurality of measurement objects are obtained by subtracting the phase of the same bottom area from the object phase of each of the plurality of measurement objects in the inspection area.
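As an illustration of the gray-level histogram segmentation with the Otsu algorithm recited in the claims above, a sketch follows. The patent gives no implementation; the function name, bin count, and return convention are assumptions.

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Otsu's method: pick the gray level that maximizes the
    between-class variance of the intensity histogram, separating
    the object area from the bottom (floor) area."""
    img = np.asarray(image, dtype=float).ravel()
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)            # cumulative class-0 probability
    mu = np.cumsum(p * centers)  # cumulative mean
    mu_t = mu[-1]                # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros_like(w0)
    # Between-class variance for each candidate split.
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(sigma_b)]
```

For a bimodal image, the returned level separates the two intensity clusters, which is the boundary between the object area and the bottom area in the histogram-based claims.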