JP6027362B2 - Image processing apparatus for widening visual field of outline data of semiconductor and computer program


Info

Publication number: JP6027362B2
Authority: JP (Japan)
Prior art keywords: coordinate data, contour, plurality, pattern shape, processing apparatus
Legal status: Active
Application number: JP2012166060A
Other languages: Japanese (ja)
Other versions: JP2014026452A (en)
Inventors: 北澤 正弘, 安部 雄一
Original Assignee: 株式会社日立ハイテクノロジーズ (Hitachi High-Technologies Corporation)
Application filed by 株式会社日立ハイテクノロジーズ
Priority to JP2012166060A
Publication of JP2014026452A
Application granted
Publication of JP6027362B2
Application status: Active


Description

  The present invention relates to an image processing apparatus that widens the field of view of contour line data and to a computer program that causes a computer to execute such wide-field processing.

  When humans are involved in the manufacturing process of a semiconductor circuit, defects such as poorly formed pattern shapes occur in the semiconductor circuit because of dust, humidity, vibration, and the like. There is therefore a strong demand for fully automatic inspection and measurement of semiconductor circuits.

  FIG. 1 shows a configuration of an inspection apparatus using a general scanning electron microscope (SEM) 101. First, a semiconductor circuit wafer to be inspected is set on the stage 105. Then, the computer 104 mounted on the control device 103 issues an instruction to the SEM 101 to image the semiconductor circuit, and an image (SEM image) captured by the SEM 101 is displayed on the display unit 109.

  As a method for inspecting a pattern shape or dimension from an SEM image, there is a method of extracting a contour line from an SEM image and comparing it with design data (for example, Patent Document 1). Further, as a method for inspecting and measuring a pattern shape with a wide field of view, there is a method of joining SEM images into a panoramic image (for example, Patent Document 2).

Patent Document 1: International Publication No. 2011/118745 (A1)
Patent Document 2: International Publication No. 2011/090111 (A1)
Patent Document 3: International Publication No. 10/061516
Patent Document 4: US Patent Application Publication No. 2009/242760 (A1)

  To accurately inspect and measure a pattern shape from an SEM image, it is desirable to increase the imaging magnification of the SEM so that the details of the pattern shape are captured in the SEM image. On the other hand, a higher imaging magnification narrows the field of view that can be captured. The pattern shape to be inspected and measured therefore may not fit into one SEM image, and it becomes necessary to inspect and measure with a plurality of SEM images, or to obtain contour lines corresponding to the entire pattern shape by connecting contour lines obtained from a plurality of SEM images.

  When connecting contour lines obtained from a plurality of SEM images, a general method is to use overlapping regions included in the plurality of SEM images for alignment. However, there is a possibility that the contour lines in the overlapping regions included in the plurality of SEM images are not the same due to the influence of the distortion of the SEM images or the charging due to the imaging.

  If an image is used for shape correction when connecting contour lines, expansion / contraction processing is performed in units of pixels in the image, and it is difficult to perform subtle shape correction in units of nm. In addition, in the expansion / contraction process using an image, the tip of a sharp corner is rounded, and accurate shape correction cannot be performed.

  To solve the above problems, the present invention joins contour lines obtained from a plurality of SEM images by expanding and contracting the contour line coordinate data while maintaining the contour shape, and provides an operator-free image processing apparatus and computer program for doing so.

  To solve the above problems, an image processing apparatus according to the present invention includes a storage unit that stores a plurality of contour coordinate data obtained from SEM images of a pattern shape to be inspected or measured, a moving unit that adds to each of the plurality of contour coordinate data the position information of the field of view in which the corresponding SEM image was captured, a correction unit that corrects the shift between the plurality of contour coordinate data to which the position information has been added, and an integration unit that integrates the plurality of contour coordinate data corrected by the correction unit into one wide-field contour coordinate data.

  The present invention also provides a program for causing an information processing apparatus having a storage unit and a calculation unit to execute processing that creates one wide-field contour coordinate data from a plurality of contour coordinate data obtained from SEM images of a pattern shape to be inspected or measured. The storage unit stores the plurality of contour coordinate data. The program executes a process of adding, to each of the plurality of contour coordinate data, the position information of the field of view in which the corresponding SEM image was captured, a process of correcting the shift between the plurality of contour coordinate data to which the position information has been added, and a process of integrating the plurality of contour coordinate data corrected by the correction process into one wide-field contour coordinate data.

  According to the present invention, contour lines obtained from a plurality of SEM images can be joined satisfactorily by expanding and contracting the contour coordinate data while maintaining the contour shape.

  Further features related to the present invention will become apparent from the description of the present specification and the accompanying drawings. Further, problems, configurations and effects other than those described above will be clarified by the description of the following examples.

FIG. 1 is a diagram showing the configuration of an inspection apparatus using a scanning electron microscope (SEM).
FIG. 2 is a diagram of one example in which design data of a semiconductor circuit pattern is imaged.
FIG. 3 is a diagram of one example in which the design data of FIG. 2 is imaged by an SEM.
FIG. 4 is a diagram of one example of the pattern shapes of the semiconductor circuit captured in the SEM image fields of FIG. 3.
FIG. 5 is a diagram of one example of the result of extracting contour coordinate data from the SEM images of FIG. 4.
FIG. 6 is a diagram of one example of one wide-field contour coordinate data in which a plurality of contour data are connected.
FIG. 7 is a diagram explaining the configuration and data flow of an image processing apparatus according to one embodiment of the present invention.
FIG. 8A is a diagram for explaining the joining portion of two contour coordinate data.
FIG. 8B is a diagram for explaining distortion correction processing.
FIG. 8C is an enlarged view of the portion denoted by reference numeral 801 in FIG. 8A and shows an example of the deviation correction processing of the present invention.
FIG. 9A is a diagram explaining an example of the expansion/contraction processing of contour data according to one embodiment of the present invention.
FIG. 9B is a diagram explaining an example of the expansion/contraction processing of contour data according to one embodiment of the present invention.
FIG. 9C is a diagram explaining an example of the expansion/contraction processing of contour data according to one embodiment of the present invention.
FIG. 10 is a diagram explaining an example of the contraction processing of contour data according to one embodiment of the present invention.
FIG. 11 is a diagram explaining an example of the expansion processing of contour data according to one embodiment of the present invention.
FIG. 12 is a flowchart explaining the deviation correction processing according to one embodiment of the present invention.
FIG. 13 is a flowchart explaining the contour coordinate data expansion processing according to one embodiment of the present invention.
FIG. 14 is a diagram explaining the screen displayed on the display unit according to one embodiment of the present invention.
FIG. 15A is a diagram explaining application to double patterning according to one embodiment of the present invention.
FIG. 15B is a diagram explaining application to double patterning according to one embodiment of the present invention.
FIG. 16A is a diagram explaining application to a hole array according to one embodiment of the present invention.
FIG. 16B is a diagram explaining application to a hole array according to one embodiment of the present invention.
FIG. 16C is a diagram explaining application to a hole array according to one embodiment of the present invention.
FIG. 17A is a diagram explaining an example of the expansion processing according to one embodiment of the present invention.
FIG. 17B is a diagram explaining an example of the expansion processing according to one embodiment of the present invention.

  Embodiments of the present invention will be described below with reference to the accompanying drawings. The accompanying drawings show specific embodiments in accordance with the principle of the present invention; they are provided to aid understanding of the present invention and are never to be used to interpret the present invention in a limited manner.

<First embodiment>
This embodiment is an image processing apparatus used with the scanning electron microscope (SEM) 101 shown in FIG. 1. For example, the image processing apparatus of this embodiment is mounted on the computer 104 in FIG. 1. The image processing apparatus calculates wide-field contour coordinate data by connecting the contour data of a plurality of SEM images captured by the SEM 101.

  FIG. 2 shows an image of semiconductor circuit pattern design data. The design data 201 includes a pattern shape 202. FIG. 3 shows an example in which the design data 201 of FIG. 2 is imaged with an SEM. For example, when the pattern shape 202 is imaged with the SEM image field 302 set at a high magnification, it is difficult to image the entire pattern shape 202 with one SEM image. Therefore, the field of view of the SEM image is moved, and four SEM images composed of the SEM image fields 302, 303, 304, and 305 are imaged.

  FIG. 4 shows the SEM images taken in the SEM image fields of FIG. 3. The SEM images captured in the SEM image fields 302, 303, 304, and 305 correspond to the SEM images 402, 403, 404, and 405, respectively. FIG. 5 shows the images obtained by extracting the contour line coordinate data from the SEM images of FIG. 4. The SEM images 402, 403, 404, and 405 correspond to the contour line images 502, 503, 504, and 505, respectively.

<Hardware configuration of image processing apparatus>
Next, the hardware configuration of the image processing apparatus according to this embodiment will be described. The image processing apparatus is configured by an information processing apparatus such as a personal computer. In this embodiment, the image processing apparatus is mounted on the computer 104; however, it may instead be configured by another information processing apparatus connected to the computer 104. The image processing apparatus includes the display unit 109 described above, an input unit 110 such as a keyboard and a mouse, a memory, a central processing unit (also called a calculation unit), and a storage device. The storage device is a storage medium such as an HDD, a CD-ROM, or a DVD-ROM.

  The central processing unit includes a CPU (Central Processing Unit), a microprocessor, and the like. Each processing unit of the image processing apparatus described in detail below can be realized by a program code of software that realizes the function of each processing unit. In other words, each processing unit of the image processing apparatus may be stored in a memory as a program code, and may be realized by the central processing unit executing each program code. Note that each processing unit of the image processing apparatus may be realized by hardware, for example, by designing with an integrated circuit. Each storage unit of the image processing apparatus described in detail below is realized by the above-described memory or storage device.

<Configuration of image processing apparatus>
FIG. 7 is a diagram illustrating the configuration and data flow of the image processing apparatus 700 according to the present embodiment. The image processing apparatus 700 includes an SEM image storage unit 701, a contour line extraction unit 703, a contour coordinate data conversion unit 705, a contour coordinate data storage unit 707, a contour coordinate distortion correction unit 741, a contour coordinate data visual field moving unit 709, a deviation correction unit 711, a contour coordinate data integration unit 713, a wide-field contour coordinate data storage unit 715, a design data storage unit 720, a contour inspection/measurement unit 721, and a correction amount calculation unit 723.

  For example, the SEM image storage unit 701 stores a plurality of SEM images 402, 403, 404, and 405 as shown in FIG. The input to the image processing apparatus 700 is the SEM image 402 stored in the SEM image storage unit 701.

  The contour line extraction unit 703 acquires the SEM image 402 stored in the SEM image storage unit 701, smooths the SEM image 402 as shown in FIG. 3 of Patent Document 3, and removes noise. Thereafter, the contour line extraction unit 703 extracts a contour line using a predetermined method. One example of such a method is to detect edges, convert the image into a binarized image, obtain the center line of the white band, and thereby obtain a thin-line image. In this way, the contour line extraction unit 703 converts the SEM image 402 into a contour image using the above method specialized for SEM images, and inputs the contour line image 502 to the contour coordinate data conversion unit 705.
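  As a rough illustration of such an extraction flow (not the patented algorithm itself), the sketch below binarizes the bright edge band of a grayscale SEM image and thins it to a one-pixel center line; the function name, parameter values, and the use of OpenCV and scikit-image are assumptions made for illustration.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

def extract_contour_image(sem_image: np.ndarray) -> np.ndarray:
    """Hypothetical sketch: binarize the bright edge band of a grayscale SEM image
    and thin it to a one-pixel center line (thin-line contour image)."""
    smoothed = cv2.GaussianBlur(sem_image, (5, 5), 0)                    # noise removal
    _, white_band = cv2.threshold(smoothed, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)   # binarized edge band
    thin_line = skeletonize(white_band > 0)                              # center line of the white band
    return thin_line.astype(np.uint8) * 255
```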

  The contour line coordinate data conversion unit 705 converts the contour line image 502 into polygonal coordinate data. Here, for example, polygonal coordinate data is data in which the coordinates of the vertices of the contour line of the pattern shape are listed. The contour coordinate data conversion unit 705 stores the converted coordinate data in the contour coordinate data storage unit 707 as contour coordinate data 706. The contour coordinate data storage unit 707 stores a plurality of contour coordinate data obtained from SEM images around the pattern shape to be inspected or measured.
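  For concreteness, polygonal coordinate data can be pictured as a list of vertex lists, one per pattern shape; the coordinate values below are hypothetical and only illustrate the layout.

```python
# Hypothetical contour coordinate data for one SEM image: each pattern shape is a
# list of (x, y) vertex coordinates of its contour polygon (units e.g. nm).
contour_coordinate_data = [
    [(10.0, 12.5), (10.0, 40.0), (35.5, 40.0), (35.5, 12.5)],   # pattern shape A
    [(50.0, 12.5), (50.0, 90.0), (62.0, 90.0), (62.0, 12.5)],   # pattern shape B
]
```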

  The contour coordinate distortion correction unit 741 acquires contour coordinate data from the contour coordinate data storage unit 707 and corrects the distortion of the contour coordinate data. Since the contour line coordinate data is numerical data, it can be corrected more finely than correcting the image in units of pixels. The contour coordinate distortion correction unit 741 executes distortion correction processing using the distortion coefficient 830. The contour coordinate distortion correction unit 741 inputs the contour coordinate data 708 after distortion correction to the contour coordinate data visual field moving unit 709. Here, a plurality of contour coordinate data 708 corresponding to the SEM image around the pattern shape to be inspected or measured is input to the contour coordinate data visual field moving unit 709.

  The contour coordinate data visual field moving unit 709 adds, to the contour coordinate data 708, which are coordinate values for each SEM image, the position information of the field of view in which the SEM image was captured (SEM image field-of-view position information). At this time, the contour coordinate data visual field moving unit 709 uses the design data 301 in the design data storage unit 720 as the position information of the SEM image field of view. The design data storage unit 720 stores design data that indicates the imaging positions of the SEM images and is used for inspection or measurement of the contour lines. The contour coordinate data visual field moving unit 709 then inputs the contour coordinate data 708 to which the position information of the SEM image field of view has been added to the deviation correction unit 711 as the panoramic contour coordinate data 710.
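  The field-of-view movement itself amounts to translating every vertex by the offset of the field taken from the design data; the following is a minimal sketch, with the offset values and names assumed for illustration.

```python
def add_field_of_view_offset(polygons, offset_x, offset_y):
    """Shift every vertex of every pattern shape by the position (offset) of the
    SEM image field of view so that all contours share one coordinate system."""
    return [[(x + offset_x, y + offset_y) for (x, y) in polygon]
            for polygon in polygons]

# Example with hypothetical values: place one field 500 units to the right.
shifted = add_field_of_view_offset([[(0.0, 0.0), (0.0, 10.0), (10.0, 10.0)]], 500.0, 0.0)
```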

  The deviation correction unit 711 corrects the shift in the contour coordinate data caused by the shift between the fields of view of the panoramic contour coordinate data 710, using a method that expands and contracts the data while maintaining the contour shape. To execute this correction processing, the deviation correction unit 711 includes a contour coordinate data expansion processing unit that performs expansion processing of the contour coordinate data and a contour coordinate data contraction processing unit that performs contraction processing of the contour coordinate data.

  Although the arrow is omitted in FIG. 7, the deviation correction unit 711 inputs panoramic contour coordinate data 710 to the contour inspection / measurement unit 721. The contour inspection / measurement unit 721 calculates a shift amount, which will be described later, by comparing the design data in the design data storage unit 720 with the panoramic contour coordinate data 710. The calculated deviation amount 724 is output to the deviation correction unit 711 and the correction amount calculation unit 723.

  The deviation correction unit 711 inputs the movement amount 926, obtained from the deviation amount 724 output by the contour inspection/measurement unit 721, to the correction amount calculation unit 723. The correction amount calculation unit 723 corrects the movement amount 926 using a correction method described later and outputs the corrected movement amount 923 to the deviation correction unit 711. The correction amount calculation unit 723 can also acquire the threshold value of the pattern angle to be corrected that is input on the display unit 109 (for example, the value of the threshold setting box 1405 in FIG. 14) and calculate the corrected movement amount 923 using this threshold value. The deviation correction unit 711 corrects the shift in the contour coordinate data using the corrected movement amount 923 and the movement amount 926 obtained from the deviation amount 724 output by the contour inspection/measurement unit 721. The deviation correction unit 711 inputs the contour coordinate data whose shift has been corrected to the contour coordinate data integration unit 713 as the shift-corrected contour coordinate data 712.

  The contour coordinate data integration unit 713 performs panoramic processing on the shift-corrected contour coordinate data 712 and calculates wide-field contour coordinate data 601 equivalent to contour coordinate data obtained from a single SEM image. The contour coordinate data integration unit 713 stores the wide-field contour coordinate data 601 in the wide-field contour coordinate data storage unit 715 and further outputs the wide-field contour coordinate data 601 to the display unit 109.

  The wide-field outline coordinate data storage unit 715 stores wide-field outline coordinate data 601 corresponding to the outline coordinate data obtained from one SEM image. FIG. 6 shows an example in which the wide-field outline coordinate data 601 is imaged. As shown in FIG. 6, the image of the wide visual field contour coordinate data 601 is an image in which a plurality of contour images 502, 503, 504, and 505 are integrated (that is, connected) to form a wide visual field.

  The contour inspection/measurement unit 721 also acquires the wide-field contour coordinate data 601 from the wide-field contour coordinate data storage unit 715, compares it with the design data 301 from the design data storage unit 720, and executes a pass/fail determination process using a technique such as that described in Patent Document 4. For example, the pass/fail determination process measures the shape of the pattern from the wide-field contour coordinate data 601 and compares the measurement result with the design data 301. The contour inspection/measurement unit 721 outputs the result of the pass/fail determination process to the display unit 109 as pass/fail determination data 722. The display unit 109 therefore displays the pass/fail determination data 722 together with the wide-field contour coordinate data 601. At this time, the display unit 109 can also display the SEM image 402 from the SEM image storage unit 701, so that the user or operator can confirm the SEM image 402, the wide-field contour coordinate data 601, and the pass/fail determination data 722 together on the display unit 109.

  The contour inspection/measurement unit 721 can also compare the wide-field contour coordinate data 601 with the design data 301 to calculate the pattern shape deviation amount 724 and input the calculated deviation amount 724 to the correction amount calculation unit 723 and the deviation correction unit 711 again. Accordingly, when the wide-field contour coordinate data 601 corrected with the previously calculated deviation amount 724 is still shifted from the design data 301, the newly calculated deviation amount 724 can be fed back to the correction amount calculation unit 723 and the deviation correction unit 711, and the deviation correction can be executed with high accuracy.

<Distortion correction processing and deviation correction processing>
As described above, FIG. 6 shows an example in which the wide-field contour coordinate data 601 obtained by connecting a plurality of contour coordinate data into a panorama is imaged. When creating such wide-field contour coordinate data 601, the relative distance (offset value) at which each SEM image was captured could be obtained from the design data 301, the offset value could be added to the coordinate values so that the contour coordinate data corresponding to the contour images 502, 503, 504, and 505 are laid out in one continuous region, and the plurality of contour coordinate data could then be integrated by an OR operation. In reality, however, a gap arises between two contour coordinate data (for example, the contour coordinate data of the contour images 502 and 503) when the SEM images are captured or when the contour lines are extracted from the SEM images, so simply performing the OR operation can produce a step at the joint between the contour coordinate data.

  In the present embodiment, first, distortion correction processing is executed on the contour line coordinate data, and then deviation correction processing is executed. First, the distortion correction processing of the outline coordinate distortion correction unit 741 will be described. FIG. 8A is a diagram for explaining a joining portion of the contour line coordinate data. Hereinafter, the contour coordinate data of the contour images 502 and 503 will be described as contour coordinate data 502 and 503, respectively. As shown in FIG. 8A, the contour coordinate data 502 includes a pattern shape 802, and the contour coordinate data 503 includes a pattern shape 803. Further, there is an overlapping area 820 between the contour line coordinate data 502 and the contour line coordinate data 503. In this embodiment, the distortion correction process is executed before calculating the amount of deviation between the pattern shape 802 and the pattern shape 803.

  As illustrated in FIG. 8B, the contour coordinate distortion correction unit 741 executes distortion correction processing using the distortion coefficient 830 so that, when the SEM image is rendered as an image, the curved image 840 becomes the straight image 841. Here, it is assumed that the distortion coefficient 830 is acquired automatically in advance, for each apparatus, at the start of the inspection work. In this way, the contour coordinate distortion correction unit 741 executes distortion correction processing on, for example, the frame region of the SEM image that would otherwise be discarded, increases the amount of contour coordinate data that can be used, and reduces the deviation between the pattern shape 802 and the pattern shape 803 as much as possible.

  Even if the distortion is corrected, the shift between the pattern shapes is not completely eliminated, so the shift must also be corrected. Next, an example of the deviation correction processing of the deviation correction unit 711 will be described. FIG. 8C is an enlarged view of the portion 801 in FIG. 8A, that is, of the portion where the contour coordinate data 502 and the contour coordinate data 503 are joined. The portion denoted by reference numeral 801 is drawn from the contour coordinate data and shows the field-of-view boundary portion related to the pattern shape 802 of the contour coordinate data 502 and the field-of-view boundary portion related to the pattern shape 803 of the contour coordinate data 503.

  In order to calculate the amount of deviation between the pattern shape 802 and the pattern shape 803, an intersection coordinate 805 where the visual field boundary line 804 of the contour line coordinate data 502 intersects the contour line of the pattern shape 803 is obtained. Similarly, the intersection coordinate 807 where the visual field boundary line 806 of the contour line coordinate data 503 intersects the contour line of the pattern shape 802 is obtained.

  Here, the vertex 808 that is paired with the intersection coordinate 807 on the visual field boundary line 806 of the contour coordinate data 503 is obtained, and the distance between the intersection coordinate 807 and the vertex 808 is defined as the shift amount 810. Similarly, the vertex 811 that is paired with the intersection coordinate 805 on the visual field boundary line 804 of the contour coordinate data 502 is obtained, and the distance between the intersection coordinate 805 and the vertex 811 is defined as the shift amount 812. If there are vertices in the range between the visual field boundary line 806 of the contour coordinate data 503 and the visual field boundary line 804 of the contour coordinate data 502, that is, in the overlapping region 820 between the contour coordinate data 503 and the contour coordinate data 502, the shift amount is calculated for each such vertex.

The amount of displacement of each vertex is calculated by the following equation 1 to obtain the amount by which the vertex is moved.

  Here, as shown in FIG. 8C, the weight 814 is a floating-point value from 0 to 1 that changes linearly, being 0.0 at the visual field boundary line 806 and 1.0 at the visual field boundary line 804. The change in the weight 814 may be a linear increase proportional to the distance between the visual field boundary line 806 and the visual field boundary line 804, or a curved increase described by a quadratic equation. In this example, the "vertex coordinates on the pattern shape 802 side" in Equation 1 indicate the intersection coordinate 807 and the vertex 811. Thus, in the case of FIG. 8C, the vertex 808 is replaced with the intersection coordinate 807 and the vertex 811 is replaced with the intersection coordinate 805. The new side connecting the pattern shape 802 and the pattern shape 803 is the side 821. Accordingly, even when there is a step in the line segments of the pattern shapes in the overlapping region 820, the line segments can be corrected so that the step disappears.
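  As one hedged reading of the weighting described above (the exact form of Equation 1 is not reproduced here), a vertex in the overlapping region can be blended between the matching vertices of the two pattern shapes:

```python
def blend_vertex(vertex_on_803, vertex_on_802, weight):
    """Assumed reading of Equation 1: move a vertex of pattern shape 803 toward the
    matching vertex of pattern shape 802, where weight changes linearly from 0.0 at
    the visual field boundary line 806 to 1.0 at the visual field boundary line 804."""
    x = (1.0 - weight) * vertex_on_803[0] + weight * vertex_on_802[0]
    y = (1.0 - weight) * vertex_on_803[1] + weight * vertex_on_802[1]
    return (x, y)

# Halfway between the two field boundaries the corrected vertex lands midway.
print(blend_vertex((100.0, 20.0), (100.0, 24.0), 0.5))   # -> (100.0, 22.0)
```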

<About expansion / contraction processing>
Next, an expansion / contraction process, which is an example of the deviation correction process of the deviation correction unit 711, will be described. First, the expansion process will be described with reference to FIGS. 17A and 17B. FIG. 17A shows a method of expanding the contour coordinate data in order to reproduce the pattern exposure of the semiconductor in a pseudo manner.

  The directions of the arrows on the sides 1703 and 1705 follow the general polygon notation: a clockwise arrow indicates a hole, and a counterclockwise arrow indicates a shape. In the expansion processing, the vertex 1707 constituting the outer side 1702 of the shape portion 1701 is moved by the displacement amount 725 in the direction that bisects the exterior angle 1708 of the vertex 1707. The outer peripheral side 1705 constituted by the moved vertex 1704 becomes the outer periphery after expansion.

  As shown in FIG. 17A, when the shape portion 1701 is donut-shaped with a hole portion 1706 on the inside, the vertex 1709 constituting the inner side 1703 of the shape portion 1701 is moved by the displacement amount 725 in the direction that bisects the interior angle 1710 of the vertex 1709. The side 1712 constituted by the moved vertex 1711 becomes the inner periphery after expansion. As shown in FIG. 17B, the shape formed by the outer peripheral side 1705 and the inner peripheral side 1712 is the expanded shape portion 1720. The contraction processing can be performed in the same manner.

  Thus, unlike enlargement/reduction processing, in which all portions of the pattern shape are stretched or shrunk at the same rate, the expansion/contraction processing in the present invention can thicken or thin the shape while maintaining the pattern shape. In the example of FIG. 17A, the expansion processing is performed by calculating the displacement amount 725 of the vertex 1707; however, the expansion processing may also be performed by calculating the perpendicular displacement amount 726 of the side 1702 (the displacement amount in the parallel translation direction).

  FIGS. 9A to 9C show examples of the expansion/contraction processing in the present invention. As shown in FIG. 9A, first, the angle 906 formed by the sides 901 and 902 between the vertices of the contour coordinate data is obtained. Next, half of the angle 906 is obtained, and the X-direction movement amount 907 and the Y-direction movement amount 909 for moving the vertex 908 by the displacement amount 725 are obtained by trigonometric functions. The deviation correction unit 711 adds or subtracts the X-direction movement amount 907 and the Y-direction movement amount 909 to or from the coordinate data of the vertex 908 to obtain the destination coordinate data 910.

  As another example, the shift amount 726 of the pattern shape in the direction perpendicular to the sides 901 and 902 formed between the vertices of the contour coordinate data may be calculated, and the sides 901 and 902 may be moved by the shift amount 726 to form the new sides 904 and 905, so that the expansion or contraction processing is executed without changing the pattern shape. The shift amounts 725 and 726 can be obtained, for example, by comparing the panoramic contour coordinate data 710 with the design data 301 in the contour inspection/measurement unit 721.
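  A minimal sketch of this per-vertex move follows, assuming a counterclockwise outer contour; the sign conventions and the handling of the collinear case are simplifying assumptions.

```python
import math

def move_vertex_along_bisector(prev_v, v, next_v, displacement):
    """Sketch of the vertex move of FIG. 9A: shift vertex v by `displacement` along
    the bisector of the angle formed with its neighboring vertices.  A positive
    displacement moves a convex vertex of a counterclockwise contour outward."""
    def unit(a, b):
        dx, dy = b[0] - a[0], b[1] - a[1]
        n = math.hypot(dx, dy)
        return (dx / n, dy / n)

    u_prev = unit(v, prev_v)
    u_next = unit(v, next_v)
    bx, by = u_prev[0] + u_next[0], u_prev[1] + u_next[1]
    norm = math.hypot(bx, by)
    if norm < 1e-12:                     # collinear neighbors: move perpendicular to the side
        bx, by, norm = -u_next[1], u_next[0], 1.0
    dx = -displacement * bx / norm       # X-direction movement amount (cf. 907)
    dy = -displacement * by / norm       # Y-direction movement amount (cf. 909)
    return (v[0] + dx, v[1] + dy)
```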

  As shown in FIG. 9B, when the angle formed by the sides between vertices of the contour coordinate data is acute, such as the angle 921, the movement amount 926 of that vertex becomes abnormally large compared with the movement amounts of the other vertices, and the pattern shape collapses as shown by the sides 922 and 925. Therefore, when the angle 921 formed by the sides between vertices of the contour coordinate data is acute, the deviation correction unit 711 inputs the movement amount 926 obtained from the angle 921 to the correction amount calculation unit 723. The correction amount calculation unit 723 calculates the corrected movement amount 923 by correcting the movement amount 926 based on the movement amount 926 and the deviation amount 724, using Equation 2 as the correction formula (see FIG. 7). The deviation amount 724 input from the contour inspection/measurement unit 721 to the correction amount calculation unit 723 is, for example, the shift amount 725 or 726 in FIG. 9A, that is, the shift amount calculated by comparing the panoramic contour coordinate data 710 with the design data 301. The correction may also be performed using the shift amount between the pattern shapes in the overlapping region of the contour coordinate data, such as the shift amounts 810 and 812 in FIG. 8C.

  The correction amount calculation unit 723 outputs b (movement amount 923) obtained by Expression 2 to the deviation correction unit 711. As illustrated in FIG. 9C, the shift correction unit 711 can suppress the sides 922 and 925 from having an abnormal shape by moving the vertex 924 using the movement amount 923.

  Next, the contraction processing will be described. In the case of contraction, the movement of a vertex of the contour coordinate data must be limited at the coordinates where the area of the pattern shape becomes 0. FIG. 10 is a diagram for explaining this limit of the contraction processing.

  In the contour coordinate data 1001, the contraction direction of the vertex 1002 is the movement direction 1007 along the bisector of the interior angle 1006 of the vertex 1002. Similarly, for the vertex 1003, the movement direction 1008 is obtained. The point 1010 where the movement direction 1007 and the movement direction 1008 intersect is the contraction limit of the vertices 1002 and 1003. When the contraction limits are obtained in the same way from the other vertices 1004 and 1005, the contraction limit line 1011 is obtained. During the contraction processing, it is checked that a moved vertex of the contour coordinate data does not go beyond the contraction limit line 1011; if it would, its coordinates are replaced with coordinates on the contraction limit line 1011. This processing prevents the pattern shape after contraction from becoming an inappropriate shape.
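  A sketch of how such a limit point can be located is shown below, assuming each vertex's contraction direction (its interior-angle bisector) is already available as a direction vector.

```python
import numpy as np

def contraction_limit_point(v1, dir1, v2, dir2):
    """Intersection of the contraction movement directions of two neighboring
    vertices (cf. the point 1010 in FIG. 10); returns None if the directions are
    parallel.  Directions are given as (dx, dy) vectors."""
    a = np.array([[dir1[0], -dir2[0]],
                  [dir1[1], -dir2[1]]], dtype=float)
    b = np.array([v2[0] - v1[0], v2[1] - v1[1]], dtype=float)
    if abs(np.linalg.det(a)) < 1e-12:
        return None
    t1, _t2 = np.linalg.solve(a, b)
    return (v1[0] + t1 * dir1[0], v1[1] + t1 * dir1[1])

# Two corners of a square contracting along their interior bisectors meet at the center.
print(contraction_limit_point((0.0, 0.0), (1.0, 1.0), (10.0, 0.0), (-1.0, 1.0)))   # -> (5.0, 5.0)
```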

  Next, another example of the expansion process will be described. For example, when a plurality of pattern shapes are included in the contour line coordinate data, it is conceivable that adjacent pattern shapes overlap each other due to the expansion process. FIG. 11 is a diagram illustrating a process when adjacent pattern shapes are overlapped by the expansion process.

  Assume that, in the input contour coordinate data, the pattern shapes 1101 and 1102 exist adjacent to each other, and that the pattern shape 1103 expanded from the pattern shape 1101 and the pattern shape 1104 expanded from the pattern shape 1102 overlap in the region 1107. The overlapping region produced by the expansion processing may be detected by line segment intersection, or another method may be used.

  In this case, the plurality of overlapping pattern shapes 1103 and 1104 are corrected into one pattern shape. For example, the vertices 1109 and 1106 generated by the expansion processing are added to the pattern shape 1103 obtained by expanding the pattern shape 1101 and to the pattern shape 1104 obtained by expanding the pattern shape 1102, and the vertex 1110 is deleted. Thereafter, the vertices are aligned so that the contour coordinate data maintain a polygon shape, yielding the vertex coordinate data 1108. Through such processing, even when the adjacent pattern shapes 1101 and 1102 come to overlap as a result of the expansion processing, the pattern shape after the expansion processing can be made more appropriate.
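  One possible realization of this merge step is to take the union (OR) of the overlapping expanded polygons with the Shapely library; this is an illustrative substitute, not the implementation used in the embodiment.

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union

def merge_overlapping_shapes(expanded_polygons):
    """Union of expanded pattern shapes: shapes that overlap after expansion are
    merged into one polygon, which drops the vertices inside the overlapping region
    and keeps the intersection points on the new outline."""
    merged = unary_union([Polygon(p) for p in expanded_polygons])
    if merged.geom_type == "Polygon":
        return [list(merged.exterior.coords)]
    return [list(g.exterior.coords) for g in merged.geoms]

# Two expanded rectangles that overlap come back as a single outline.
print(len(merge_overlapping_shapes([[(0, 0), (4, 0), (4, 2), (0, 2)],
                                    [(3, 0), (7, 0), (7, 2), (3, 2)]])))   # -> 1
```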

<Flow of deviation correction processing>
Next, the flow of misalignment correction processing will be described. FIG. 12 is a flowchart showing the flow of the deviation correction process.

  First, in step 1201, the deviation correction unit 711 temporarily holds a plurality (N pieces) of panoramic contour coordinate data 710. Next, in step 1202, in order to identify the regions that need correction, the deviation correction unit 711 calculates the regions where the fields of view of adjacent contour coordinate data overlap (for example, the overlapping region 820 in FIG. 8A).

  Next, in step 1203, the deviation correction unit 711 determines whether there is an overlapping region 820 between adjacent contour coordinate data. If there is an overlapping area 820, the process proceeds to step 1204. If there is no overlapping area 820, the process proceeds to step 1212.

  Next, in step 1204, the deviation correction unit 711 replaces the adjacent contour line coordinate data with one contour line coordinate data. In the example of FIG. 8A, this process corresponds to a process of replacing the outline coordinate data 502 and the outline coordinate data 503 with one outline coordinate data.

  Next, in step 1205, the deviation correction unit 711 performs line segment intersection determination only in the overlapping region between the adjacent contour coordinate data. For example, in the example of FIG. 8C, it is calculated whether the contour line of the pattern shape 802 and the contour line of the pattern shape 803 intersect in the overlapping region 820. Here, a point where the contour lines intersect is newly added as a vertex.

  Next, in step 1206, the deviation correction unit 711 determines whether a step is produced between the line segments of the contour coordinate data in the overlapping region 820. For example, in the example of FIG. 8C, the intersection coordinate 807 where the visual field boundary line 806 of the contour coordinate data 503 intersects the contour line of the pattern shape 802 is obtained, together with the coordinates of the contour line of the pattern shape 803 on the visual field boundary line 806 of the contour coordinate data 503. If these contour line coordinates match, it is determined that there is no step; if they do not match, it is determined that there is a step.

  Next, in step 1207, the contour inspection/measurement unit 721 calculates the deviation amount. For example, the deviation amount to be calculated is the shift amount 725 or 726 in FIG. 9A, that is, the shift amount calculated by comparing the panoramic contour coordinate data 710 with the design data 301. The shift amount between the contour lines in the overlapping region 820 of the contour coordinate data, such as the shift amounts 810 and 812 in FIG. 8C, is also calculated.

  Next, in step 1208, the deviation correction unit 711 executes a deviation correction process in the overlapping region of the contour line coordinate data. For example, as shown in FIG. 8C, a process for correcting the deviation between the contour lines in the overlapping region 820 of the contour line coordinate data is executed.

  Next, in step 1209, the polarity of the deviation amount obtained in step 1207 is determined. If the polarity is positive, the process proceeds to step 1210. On the other hand, if the polarity of the shift amount is negative, the process proceeds to step 1211. For example, as shown in FIG. 17A, a direction passing through half of the outer angle of the vertex constituting the outer peripheral side is determined as a positive direction, and a direction passing through 1/2 of the inner angle of the vertex is determined as a negative direction. In step 1210, contour line coordinate data expansion processing is executed. In step 1211, contour line coordinate data contraction processing is executed. Details of the flow of the expansion process will be described later.

  Finally, in step 1212, in order to send the contour coordinate data to the contour coordinate data integration unit 713, the vertex coordinate data is aligned so that the contour coordinate data is consistent as a polygon format. Thereafter, the aligned contour coordinate data is output to the contour coordinate data integration unit 713.

  Next, the flow of the contour coordinate data expansion process will be described. FIG. 13 is a flowchart showing the processing flow of the contour coordinate data expansion processing unit, and corresponds to the processing of step 1210 in FIG.

  First, in step 1301, the number N of processed figures is initialized. Next, in step 1302, the number M of processed vertices in one graphic is initialized.

  Next, in step 1303, the angle θ1 formed by the line segment between the processing target vertex M and the vertex (M−1) and the line segment between the processing target vertex M and the vertex (M+1) is obtained, and the angle θ1 is halved to obtain the angle θ2. For example, in the example of FIG. 9A, when the processing target vertex is the vertex 908, the angle θ1 of the sides formed by the vertex 908 and the preceding and following vertices 911 and 912 is obtained, and the angle θ2 (angle 906) is obtained by halving the angle θ1.

  Next, in step 1304, it is determined whether the angle θ2 is smaller than the threshold value and an acute angle. If the angle θ2 is smaller than the threshold value, the process proceeds to step 1306. On the other hand, if the angle θ2 is greater than or equal to the threshold value, the process proceeds to step 1305.

  Next, in step 1305, the movement amount of the processing target vertex M is set to the shift amount obtained in step 1207 of FIG. 12. On the other hand, in step 1306, the movement amount of the processing target vertex M is set to the movement amount determined by Equation 2.

  Next, in step 1307, the X-direction movement amount and the Y-direction movement amount are calculated for the movement amount of the processing target vertex M based on the movement amount set in step 1305 or 1306. This process corresponds to the process of obtaining the X-direction movement amount 907 and the Y-direction movement amount 909 for moving the vertex 908 by the shift amount 725 in the example of FIG. 9A.

  Next, in step 1308, the X-direction movement amount and the Y-direction movement amount are added to the processing target vertex M to obtain coordinates after movement. In step 1309, the moved coordinates obtained in step 1308 are stored.

  In step 1310, the number M of processed vertices is incremented (+1). Next, in step 1311, it is determined whether all the vertices constituting the graphic have been processed. If processing for all the vertices has been completed, the process proceeds to step 1312. If all the vertices have not been processed yet, the processing target vertex is switched to the adjacent vertex, and steps 1303 to 1310 are executed.

  Next, in step 1312, an OR operation is performed on the original figure and the figure after the expansion processing, and processing is performed for the case where a self-intersection (that is, an overlap of adjacent pattern shapes) has occurred due to the expansion processing. This is, for example, the processing described with reference to FIG. 11.

  Next, in step 1313, the processed figure number N is incremented (+1). Next, in step 1314, it is determined whether all graphics have been processed. If all the graphics have been processed, the process is terminated. If processing of all the figures has not been completed yet, the figure to be processed is switched to the next figure, and Steps 1302 to 1313 are executed.
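  A compact sketch of this per-figure, per-vertex loop (steps 1301 to 1311) is given below, reusing the move_vertex_along_bisector helper sketched earlier (passed in as move_vertex); capping the displacement at acute angles merely stands in for Equation 2 and is an assumption.

```python
import math

def expand_figures(figures, displacement, move_vertex,
                   angle_threshold_deg=30.0, capped_displacement=None):
    """Loop over every figure and every vertex, moving each vertex along the
    half-angle direction.  `move_vertex(prev_v, v, next_v, d)` performs the
    per-vertex move; the threshold and capped displacement are hypothetical."""
    if capped_displacement is None:
        capped_displacement = displacement * 0.5        # placeholder for Equation 2
    expanded = []
    for polygon in figures:                             # steps 1301, 1313, 1314
        n = len(polygon)
        new_polygon = []
        for m in range(n):                              # steps 1302, 1310, 1311
            prev_v, v, next_v = polygon[m - 1], polygon[m], polygon[(m + 1) % n]
            a1 = math.atan2(prev_v[1] - v[1], prev_v[0] - v[0])
            a2 = math.atan2(next_v[1] - v[1], next_v[0] - v[0])
            diff = abs(a1 - a2)
            if diff > math.pi:
                diff = 2.0 * math.pi - diff
            theta2 = diff / 2.0                         # step 1303
            d = (capped_displacement
                 if math.degrees(theta2) < angle_threshold_deg
                 else displacement)                     # steps 1304-1306
            new_polygon.append(move_vertex(prev_v, v, next_v, d))   # steps 1307-1309
        expanded.append(new_polygon)
    return expanded        # self-intersections are resolved separately (step 1312)
```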

  Although only the expansion processing has been described here, the contraction processing can be executed in a similar manner; for example, it can be executed with the movement direction of each vertex opposite to that of the expansion processing. In the case of the contraction processing, however, the processing described with reference to FIG. 10 is added. First, the coordinate value at which the area of the figure becomes minimal is obtained as the result of moving each vertex along the bisector of the interior angle of the figure. If the movement of a vertex would exceed that coordinate value, the movement amount is replaced so that the vertex lands on it. In the contraction processing, unlike the expansion processing, overlapping with adjacent figures does not occur, so the self-intersection cancellation processing of step 1312 need not be executed.

<About the display screen>
Next, a screen displayed on the display unit 109 will be described. FIG. 14 shows an example of a screen for displaying the result of widening the contour line data.

  The screen displayed on the display unit 109 includes a result display unit 1401, a contour data list display unit 1402, and a function selection unit 1403. The function selection unit 1403 includes a wide field of view execution switch 1404, a threshold setting box 1405, a design data display switch 1406, a grid display switch 1407, and a layer setting box 1414.

  The wide field of view execution switch 1404 instructs the image processing apparatus according to the present embodiment to execute the widening process that calculates the wide-field contour coordinate data 601 (see FIG. 7). The threshold setting box 1405 is used to input or select the angle threshold value used in step 1304 of FIG. 13. The design data display switch 1406 causes the result display unit 1401 to display the design data stored in the design data storage unit 720. The grid display switch 1407 displays grid lines 1411 indicating how the contour line data are connected. The layer setting box 1414 designates the shape to be expanded or contracted by an identification number on the design data called a layer number.

  The contour data list display unit 1402 displays a list of contour data to be processed. A thumbnail image 1408 obtained by converting the contour line data into an image and a file name 1409 are displayed in the contour line data list.

  The result display unit 1401 displays the data 1412 obtained by imaging one wide-field contour coordinate data 601 in which a plurality of contour data are connected, the design data 1410, and the grid lines 1411. As illustrated in FIG. 7, the contour inspection/measurement unit 721 outputs the result of the pass/fail determination process to the display unit 109 as the pass/fail determination data 722, so a portion 1413 determined to differ in contour from the design data 1410 may be highlighted. In this embodiment, the data 1412 obtained by imaging the wide-field contour coordinate data 601, the design data 1410, and the grid lines 1411 are displayed superimposed on one another, but they may also be displayed individually.

<Second embodiment>
Next, an image processing apparatus according to a second embodiment of the present invention is described. The configuration of the image processing apparatus according to the second embodiment is the same as that shown in FIG. The image processing apparatus of the present invention can be applied not only to the pattern shape of the semiconductor circuit as described in the first embodiment but also to the pattern shape manufactured by double patterning.

  FIGS. 15A and 15B are diagrams for explaining the application of the image processing apparatus of the present invention to double patterning. As shown in FIG. 15A, assume that when a semiconductor circuit is manufactured from the design data 1501 by the multiple exposure method, the design data layer 1503 is used in the first exposure and the design data layer 1502 is used in the second exposure.

  The contour line data 1504 are obtained by extracting the contour line data of the pattern shapes formed from the SEM image of the semiconductor circuit manufactured according to the design data 1501. In the case of the multiple exposure method, a difference in shape can arise from a difference in imaging magnification between the first exposure and the second exposure. Even if the pattern shape formed by the first exposure is treated so as not to be deformed by the chemical processing in the steps following the second exposure, the pattern shape may change from what it was immediately after pattern formation. Therefore, in the pattern shape inspection and measurement process, if the pattern shape is thicker or thinner in the first exposure portion than in the second exposure portion, the pattern matching between the SEM image and the design data is not performed correctly, and a place different from the position to be inspected and measured may end up being inspected and measured.

  In order to perform the pattern matching correctly, when there is a difference between the contour line data 1505 of the first exposure portion and the contour line data 1506 of the second exposure portion, the expansion or contraction processing is executed, while maintaining the shape, on the contour line data whose difference from the design data 1501 is larger. In the case of FIG. 15A, the difference of the contour line data 1505 from the design data 1501 is larger, so the contour line data 1505 are replaced with expanded contour line data.

  FIG. 15B shows the result of expanding the contour line data whose difference from the design data is large. The contour line data 1505 of FIG. 15A, whose difference from the design data 1501 is large, are replaced with the expanded contour line data 1507. The expansion processing is performed while maintaining the shape of the contour line data, and the image superimposed on the design data 1501 is the image 1504.

  By performing pattern matching using the contour line data 1506 and the expanded contour line data 1507, pattern matching can be executed at the position to be inspected and measured. In practice, it may not be possible to determine, without examining the actual product, which of the first exposure portion and the second exposure portion deviates from the design data 1501; therefore, the layer number of the pattern on which the expansion or contraction processing is to be executed is designated in the layer setting box 1414 of the display unit 109. That is, in the configuration of FIG. 7, the value of the layer setting box 1414 of the display unit 109 is input to the deviation correction unit 711, and the deviation correction unit 711 performs the expansion or contraction processing on the contour line data corresponding to the layer designated in the layer setting box 1414. If differences in the contour data (pattern width differences, etc.) relative to the design data can be detected automatically, the contour data whose difference from the design data is larger may be selected automatically.
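  As a hedged sketch of the automatic selection mentioned at the end of the preceding paragraph, the layer whose measured pattern width deviates most from its design width could be the one chosen for expansion or contraction; the names, data, and the width-based criterion are assumptions.

```python
def choose_layer_to_correct(design_widths, measured_widths):
    """Pick the exposure layer whose measured pattern width differs most from the
    design width; its contour data would then be expanded or contracted.
    (Hypothetical criterion, for illustration only.)"""
    return max(design_widths,
               key=lambda layer: abs(measured_widths[layer] - design_widths[layer]))

# Example with hypothetical widths in nm: layer 1502 deviates more and is selected.
print(choose_layer_to_correct({1502: 32.0, 1503: 32.0},
                              {1502: 35.5, 1503: 32.4}))   # -> 1502
```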

<Third embodiment>
Next, an image processing apparatus according to a third embodiment of the present invention is described. The configuration of the image processing apparatus according to the third embodiment is the same as that shown in FIG. The image processing apparatus of the present invention can also be applied to a hole array pattern shape including a plurality of hole shapes.

  FIGS. 16A to 16C are diagrams for explaining the application of the image processing apparatus of the present invention to a hole array. As shown in FIG. 16A, in the case of a hole array with a high density and small hole shapes, such as the design data 1601, the hole shapes cannot be imaged clearly when the SEM image is captured at a normal magnification, as in the SEM image 1603. As a result, in the SEM image 1603, the characteristic region 1602 needed for alignment with the design data 1601 appears as the unclear shape 1614, and the alignment may fail.

  In this case, the magnification of the SEM is set higher than usual, and the images are captured separately in the order of the imaging regions 1604, 1605, 1606, 1607, 1608, 1609, 1610, 1611, and 1612. As a result, as shown in FIG. 16B, a hole shape 1616 in which the characteristic region 1602 is clearly imaged can be obtained. Since the SEM image 1615 is divided into the imaging regions 1604 to 1612, the contour line extraction unit 703 obtains a contour line for each of the imaging regions 1604 to 1612. The deviation correction unit 711 performs the expansion or contraction processing on the contour coordinate data of each of the imaging regions 1604 to 1612. Thereafter, the contour coordinate data integration unit 713 creates one contour coordinate data from the contour coordinate data of the imaging regions 1604 to 1612. The contour coordinate data to be subjected to the expansion or contraction processing may be specified on the display unit 109, or may be determined automatically from the shift amount of the hole-shaped contour lines in the overlapping regions of adjacent contour coordinate data.

  FIG. 16C is a diagram in which one contour coordinate data is created from the plurality of contour coordinate data. One contour coordinate data 1617 is created from the contour coordinate data of the imaging regions 1604 to 1612. By aligning the contour coordinate data 1617 with the design data 1601, correct alignment is possible even for high-density hole shapes.

  Further, using the contour coordinate data 1617 obtained by imaging the small hole shapes at high magnification, it can be determined whether there is a difference between the hole interval 1618 and the hole interval 1619, and periodicity abnormalities caused by such a difference can easily be inspected or measured.
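  The hole-interval check can be pictured as computing the centroid of each hole contour and the distances between neighboring centroids; the following is a minimal sketch with hypothetical data, not the measurement procedure of the embodiment.

```python
import math

def hole_centers(hole_polygons):
    """Centroid of each hole contour, taken as the simple average of its vertices."""
    return [(sum(x for x, _ in poly) / len(poly), sum(y for _, y in poly) / len(poly))
            for poly in hole_polygons]

def hole_intervals(centers):
    """Distances between neighboring hole centers (cf. the intervals 1618 and 1619);
    an uneven sequence points to a periodicity abnormality."""
    return [math.dist(c1, c2) for c1, c2 in zip(centers, centers[1:])]

# Example with three hypothetical square holes in a row.
holes = [[(0, 0), (2, 0), (2, 2), (0, 2)],
         [(5, 0), (7, 0), (7, 2), (5, 2)],
         [(11, 0), (13, 0), (13, 2), (11, 2)]]
print(hole_intervals(hole_centers(holes)))   # -> [5.0, 6.0]
```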

<Summary>
According to the first embodiment, the image processing apparatus 700 includes the contour coordinate data storage unit 707 that stores a plurality of contour coordinate data obtained from SEM images of a pattern shape to be inspected or measured, the contour coordinate data visual field moving unit 709 that adds to each of the contour coordinate data 708 the position information of the field of view in which the corresponding SEM image was captured, the deviation correction unit 711 that corrects the shift between the plurality of panoramic contour coordinate data 710 to which the position information has been added, and the contour coordinate data integration unit 713 that integrates the plurality of contour coordinate data 712 corrected by the deviation correction unit 711 into one wide-field contour coordinate data.
According to this configuration, one wide-field contour coordinate data is created from a plurality of contour coordinate data generated from a plurality of SEM images obtained by imaging a semiconductor circuit with an SEM, so that even a pattern shape that cannot be captured in a single SEM image can be inspected and measured. For example, a contour line of a relatively large pattern shape, such as a power supply line of a semiconductor circuit, can be inspected and measured as a single image, shortening the inspection and measurement time. Periodic shape variations can also be inspected and measured by continuously inspecting and measuring a plurality of repetitive pattern shapes of the semiconductor circuit. Furthermore, because the created wide-field contour coordinate data corresponds to data obtained by imaging a wide field of view at high magnification, the pattern shape to be inspected and measured can be identified even if the alignment is slightly off, and an operator-free semiconductor inspection apparatus can be provided.

  Further, the deviation correction unit 711 calculates a movement amount based on the deviation amount between the contour coordinate data and the design data, and executes the expansion or contraction processing by moving the line segments or vertices constituting the pattern shape in the plurality of contour coordinate data by that movement amount. According to this configuration, contour coordinate data obtained from a plurality of SEM images can be joined satisfactorily by expanding and contracting while maintaining the contour shape.

According to the second embodiment, the pattern shape is a pattern shape formed by a plurality of exposure processes, the plurality of contour line coordinate data include the contour line coordinate data of each of the plurality of exposure processes, and the deviation correction unit 711 corrects the deviation between the plurality of contour line coordinate data by executing the expansion process or the contraction process on whichever contour line coordinate data deviates from the design data 301 among the contour line coordinate data of the plurality of exposure processes.
According to this configuration, even for a pattern shape manufactured by double patterning, the contour line coordinate data and the design data 1501 can be aligned correctly over a wide field of view. This makes it easy to inspect or measure a pattern shape abnormality produced by double patterning.
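  As an illustration of this selective correction, the sketch below detects which exposure's contours deviate from the design data and corrects only that set. It assumes corresponding vertices are available and uses an arbitrary threshold and a simple shift as a stand-in for the expansion/contraction process; none of these choices come from the embodiment itself.

import numpy as np

def mean_deviation(contour, design):
    """Average vertex-to-vertex distance between a contour and its design shape
    (both assumed to be sampled with corresponding vertices)."""
    a, b = np.asarray(contour, dtype=float), np.asarray(design, dtype=float)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

def correct_deviating_exposure(exposure_contours, design_contours, threshold=0.3):
    """exposure_contours / design_contours: {exposure_id: list of (N, 2) arrays}.
    Only the exposure whose contours deviate from the design is corrected."""
    corrected = {}
    for exposure_id, contours in exposure_contours.items():
        designs = design_contours[exposure_id]
        deviations = [mean_deviation(c, d) for c, d in zip(contours, designs)]
        if np.mean(deviations) > threshold:
            # Stand-in correction: shift the whole exposure toward its design data.
            shift = np.mean([np.mean(np.asarray(d, float) - np.asarray(c, float), axis=0)
                             for c, d in zip(contours, designs)], axis=0)
            contours = [np.asarray(c, dtype=float) + shift for c in contours]
        corrected[exposure_id] = contours
    return corrected

design = {1: [np.array([[0, 0], [2, 0], [2, 2], [0, 2]], dtype=float)],
          2: [np.array([[4, 0], [6, 0], [6, 2], [4, 2]], dtype=float)]}
measured = {1: design[1],                         # first exposure matches its design
            2: [design[2][0] + [1.0, 0.0]]}       # second exposure is shifted
print(correct_deviating_exposure(measured, design)[2][0][0])  # -> [4. 0.] after correction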

According to the third embodiment, the pattern shape is a hole array including a plurality of hole shapes, and the deviation correction unit 711 corrects the deviation between the hole shapes in the plurality of contour line coordinate data by the expansion process or the contraction process.
According to this configuration, even for a high-density hole array, the contour line coordinate data 1617 and the design data 1601 can be aligned correctly over a wide field of view. Further, using the contour line coordinate data 1617 obtained by imaging the small hole shapes at a high magnification, it is possible to determine whether there is a difference between the hole interval 1618 and the hole interval 1619, which also makes it easy to inspect or measure a periodicity abnormality caused by such a difference.

  The present invention is not limited to the embodiments described above and includes various modifications. For example, the above embodiments have been described in detail to make the present invention easy to understand, and the invention is not necessarily limited to configurations having all of the elements described. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to the configuration of one embodiment. It is also possible to add, delete, or replace other configurations for part of the configuration of each embodiment.

  For example, the image processing apparatus of the first embodiment described above executes all of the processes of FIGS. 8C, 9A to 9C, 10, and 11 as the deviation correction process, and this configuration is a more preferable form of the present invention. However, as far as correcting deviations between a plurality of contour line coordinate data is concerned, it is sufficient to include at least one of these correction processes; the present invention can therefore also be configured with some of these correction processes omitted. Of course, the correction processes of the second and third embodiments may be added to the first embodiment.

  Further, as described above, the image processing apparatus may be implemented by software program code that realizes the functions of the embodiments. In this case, a storage medium on which the program code is recorded is provided to an information processing apparatus, and the information processing apparatus (or its CPU) reads the program code stored in the storage medium. The program code itself read from the storage medium realizes the functions of the above embodiments, and the program code itself and the storage medium storing it constitute the present invention. Storage media for supplying such program code include, for example, a flexible disk, CD-ROM, DVD-ROM, hard disk, optical disk, magneto-optical disk, CD-R, magnetic tape, nonvolatile memory card, and ROM.

  Further, an OS (operating system) running on the information processing apparatus may perform part or all of the actual processing based on the instructions of the program code, and the functions of the above embodiments may be realized by that processing. Furthermore, the program code of software that realizes the functions of the embodiments may be distributed via a network, stored in a storage device of an information processing apparatus or on a storage medium such as a CD-RW or CD-R, and, at the time of use, read out and executed by the CPU of the information processing apparatus from that storage device or storage medium.

  Although the present invention has been described with reference to specific embodiments, these are in all respects illustrative rather than restrictive. Those skilled in the art will appreciate that there are numerous combinations of hardware, software, and firmware suitable for practicing the present invention. For example, the program code that realizes the functions described in the embodiments can be implemented in a wide range of programming or scripting languages, such as assembler, C/C++, Perl, shell, PHP, and Java (registered trademark).

  In addition, the image processing apparatus may be realized by hardware, for example by designing part or all of it as an integrated circuit.

  The control lines and information lines shown in the drawings are those considered necessary for the explanation; not all of the control lines and information lines in an actual product are necessarily shown. In practice, almost all of the components may be considered to be connected to one another.

  The present invention described above can be applied to a pattern inspection method, a length measuring apparatus, and a semiconductor inspection system for inspecting the circuit pattern of a semiconductor device wafer or an exposure mask.

101: Scanning electron microscope (SEM)
103: Control device
104: Computer
109: Display unit
110: Input unit
700: Image processing device
701: SEM image storage unit
703: Contour line extraction unit
705: Contour line coordinate data conversion unit
707: Contour line coordinate data storage unit
709: Contour line coordinate data visual field moving unit
711: Deviation correction unit
713: Contour line coordinate data integration unit
715: Wide visual field contour line coordinate data storage unit
720: Design data storage unit
721: Contour line inspection / measurement unit
723: Correction amount calculation unit

Claims (13)

  1. An image processing apparatus comprising:
    a storage unit for storing a plurality of contour line coordinate data obtained from an SEM image of a pattern shape to be inspected or measured;
    a moving unit that adds, to each of the plurality of contour line coordinate data, position information of the field of view in which the SEM image was captured;
    a correction unit that corrects a deviation between the plurality of contour line coordinate data to which the position information has been added; and
    an integration unit that integrates the plurality of contour line coordinate data corrected by the correction unit into one piece of wide-field contour line coordinate data,
    wherein the correction unit calculates a movement amount based on an amount of deviation between the contour line coordinate data and design data, and executes an expansion process or a contraction process by moving a line segment or a vertex constituting the pattern shape in the plurality of contour line coordinate data by the movement amount.
  2. An image processing apparatus comprising:
    a storage unit for storing a plurality of contour line coordinate data obtained from an SEM image of a pattern shape to be inspected or measured;
    a moving unit that adds, to each of the plurality of contour line coordinate data, position information of the field of view in which the SEM image was captured;
    a correction unit that corrects a deviation between the plurality of contour line coordinate data to which the position information has been added; and
    an integration unit that integrates the plurality of contour line coordinate data corrected by the correction unit into one piece of wide-field contour line coordinate data,
    wherein the pattern shape is a pattern shape formed by a plurality of exposure processes,
    the plurality of contour line coordinate data include contour line coordinate data of each of the plurality of exposure processes, and
    the correction unit corrects the deviation between the plurality of contour line coordinate data by executing an expansion process or a contraction process on the contour line coordinate data that deviates from design data among the contour line coordinate data of the plurality of exposure processes.
  3. An image processing apparatus comprising:
    a storage unit for storing a plurality of contour line coordinate data obtained from an SEM image of a pattern shape to be inspected or measured;
    a moving unit that adds, to each of the plurality of contour line coordinate data, position information of the field of view in which the SEM image was captured;
    a correction unit that corrects a deviation between the plurality of contour line coordinate data to which the position information has been added; and
    an integration unit that integrates the plurality of contour line coordinate data corrected by the correction unit into one piece of wide-field contour line coordinate data,
    wherein the pattern shape is a hole array including a plurality of hole shapes, and
    the correction unit corrects a deviation between hole shapes in the plurality of contour line coordinate data by an expansion process or a contraction process.
  4. The image processing apparatus according to claim 1, wherein, when a plurality of pattern shapes in the plurality of contour line coordinate data come to overlap as a result of executing the expansion process, the correction unit corrects the plurality of overlapping pattern shapes into one pattern shape.
  5. The image processing apparatus according to claim 1, wherein the correction unit obtains, in the contraction process, a coordinate value at which the area of the pattern shape becomes minimum, and, when the movement amount of the line segment or vertex constituting the pattern shape exceeds that coordinate value, replaces the movement destination of the line segment or vertex with the coordinate value.
  6. The image processing apparatus according to claim 1, wherein the correction unit corrects the movement amount when an angle between line segments constituting the pattern shape in the plurality of contour line coordinate data is acute.
  7. The image processing apparatus according to any one of claims 1 to 3, wherein the correction unit corrects a line segment constituting the pattern shape in an overlapping region of the plurality of contour line coordinate data based on an amount of deviation between line segments constituting the pattern shape in that overlapping region.
  8. The image processing apparatus according to any one of claims 1 to 3, further comprising an inspection / measurement unit that compares the wide-field contour line coordinate data with design data.
  9. The image processing apparatus according to claim 8, wherein the inspection / measurement unit outputs an amount of deviation between the wide-field contour line coordinate data and the design data to the correction unit, and the correction unit further corrects the deviation between the plurality of contour line coordinate data based on the deviation amount from the inspection / measurement unit.
  10. The image processing apparatus according to claim 1, wherein the pattern shape is a pattern shape formed by a plurality of exposure processes, the plurality of contour line coordinate data include contour line coordinate data of each of the plurality of exposure processes, and the correction unit corrects the deviation between the plurality of contour line coordinate data by executing an expansion process or a contraction process on the contour line coordinate data that deviates from the design data among the contour line coordinate data of the plurality of exposure processes.
  11. The image processing apparatus according to claim 1, wherein the pattern shape is a hole array including a plurality of hole shapes, and the correction unit corrects a deviation between hole shapes in the plurality of contour line coordinate data by an expansion process or a contraction process.
  12. The image processing apparatus according to any one of claims 1 to 3, further comprising a display unit that displays, superimposed or individually, the wide-field contour line coordinate data, design data, and grid lines indicating the fields of view in which the SEM images were captured.
  13. A computer program for causing an information processing apparatus including a storage unit and a calculation unit to execute processing for creating one piece of wide-field contour line coordinate data from a plurality of contour line coordinate data obtained from SEM images of a pattern shape to be inspected or measured, wherein
    the storage unit stores the plurality of contour line coordinate data, and
    the program causes the calculation unit to execute:
    a process of adding, to each of the plurality of contour line coordinate data, position information of the field of view in which the SEM image was captured;
    a process of correcting a deviation between the plurality of contour line coordinate data to which the position information has been added; and
    a process of integrating the plurality of contour line coordinate data corrected by the correcting process into one piece of wide-field contour line coordinate data,
    wherein the correcting process calculates a movement amount based on an amount of deviation between the contour line coordinate data and design data, and executes an expansion process or a contraction process by moving a line segment or a vertex constituting the pattern shape in the plurality of contour line coordinate data by the movement amount.
JP2012166060A 2012-07-26 2012-07-26 Image processing apparatus for widening visual field of outline data of semiconductor and computer program Active JP6027362B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012166060A JP6027362B2 (en) 2012-07-26 2012-07-26 Image processing apparatus for widening visual field of outline data of semiconductor and computer program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012166060A JP6027362B2 (en) 2012-07-26 2012-07-26 Image processing apparatus for widening visual field of outline data of semiconductor and computer program

Publications (2)

Publication Number Publication Date
JP2014026452A JP2014026452A (en) 2014-02-06
JP6027362B2 true JP6027362B2 (en) 2016-11-16

Family

ID=50200039

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012166060A Active JP6027362B2 (en) 2012-07-26 2012-07-26 Image processing apparatus for widening visual field of outline data of semiconductor and computer program

Country Status (1)

Country Link
JP (1) JP6027362B2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5408852B2 (en) * 2007-08-09 2014-02-05 株式会社日立ハイテクノロジーズ Pattern measuring device
JP5030906B2 (en) * 2008-09-11 2012-09-19 株式会社日立ハイテクノロジーズ Panorama image synthesis method and apparatus using scanning charged particle microscope
US8071943B2 (en) * 2009-02-04 2011-12-06 Advantest Corp. Mask inspection apparatus and image creation method

Also Published As

Publication number Publication date
JP2014026452A (en) 2014-02-06


Legal Events

Date Code Title Description
2015-04-22 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2016-03-07 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2016-03-15 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2016-05-12 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
2016-09-20 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2016-10-14 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 6027362; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)