WO2021140591A1 - Image processing device, microscope system, and image processing method - Google Patents

Image processing device, microscope system, and image processing method

Info

Publication number
WO2021140591A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
magnification
region
unit
objective lens
Prior art date
Application number
PCT/JP2020/000318
Other languages
French (fr)
Japanese (ja)
Inventor
渡辺 伸之 (Nobuyuki Watanabe)
洋子 阿部 (Yoko Abe)
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date
Filing date
Publication date
Application filed by Olympus Corporation
Priority to PCT/JP2020/000318
Publication of WO2021140591A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an image processing apparatus, a microscope system, and an image processing method.
  • A semiconductor inspection device that inspects a semiconductor circuit board using a microscope is known (see, for example, Patent Document 1).
  • The semiconductor inspection device of Patent Document 1 has a function of navigating to a defective position on a semiconductor circuit board by using CAD data of the board, and displays a microscope image of the board superimposed on the CAD data.
  • A visual inspection is also performed in which an operator observes the semiconductor circuit board at high magnification while manually operating a microscope. In the visual inspection, the operator manually moves the stage of the microscope to shift the observation area, and observes the entire board or a wide area of it.
  • On a semiconductor circuit board, the same or similar regions are regularly arranged, and in visual inspection a narrow range of the sample is magnified using a high-magnification objective lens. Therefore, when a high-magnification microscope image contains only identical or similar regions, alignment between the CAD data and the image may fail. Further, for a sample such as a glass substrate, the contrast of a high-magnification microscope image is low, and alignment may fail for the same reason. For these reasons, it is difficult to obtain an accurate position of the imaging range of a high-magnification microscope image.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an image processing apparatus, a microscope system, and an image processing method capable of accurately acquiring the position of the imaging range of a high-magnification image.
  • To achieve the above object, the present invention provides the following means.
  • One aspect of the present invention is an image processing device including: a sample image input unit to which a first image and a second image of a sample are input, the first image and the second image being microscope images captured through objective lenses having a first magnification and a second magnification, respectively, the first magnification being lower than the second magnification, and the optical axis of the objective lens of the first magnification and the optical axis of the objective lens of the second magnification being the same as or substantially the same as each other; a reference image input unit to which a reference image corresponding to the first image and the second image is input, the reference image being an image of the sample covering a range wider than the imaging range of the first image and the imaging range of the second image; an alignment unit that estimates, by collating the first image with the reference image, a first region of the reference image corresponding to the imaging range of the first image; and a position acquisition unit that acquires, based on the estimated first region, the position of a second region of the reference image corresponding to the imaging range of the second image.
  • In this image processing device, the first image and the second image of the sample are input to the sample image input unit, and the reference image of the sample is input to the reference image input unit.
  • The alignment unit estimates the first region of the reference image corresponding to the imaging range of the first image. Since the first image and the second image are microscope images captured through objective lenses having the same or substantially the same optical axis, the center position of the first region and the center position of the second region corresponding to the imaging range of the second image coincide with or substantially coincide with each other in the reference image. Therefore, the position acquisition unit can acquire the position of the second region in the reference image by referring to the estimated first region.
  • Because the imaging range of the low-magnification first image is wider than that of the high-magnification second image, the first image is highly likely to contain many feature points effective for alignment with the reference image. Therefore, the first region can be estimated with high positional accuracy, and the accurate position of the second region in the reference image can then be obtained from the relationship that the center positions of the first and second regions coincide with or substantially coincide with each other.
  • The position acquisition unit may acquire, as the center position of the second region, a position deviated from the center position of the first region by a predetermined distance, the predetermined distance being a distance corresponding to the amount of displacement of the optical axis between the objective lens of the first magnification and the objective lens of the second magnification.
  • When the magnification is switched, the optical axis of the objective lens of the first magnification and the optical axis of the objective lens of the second magnification can be slightly displaced from each other in the direction perpendicular to the optical axis.
  • The amount of displacement of the optical axis is substantially constant, so the amount of deviation between the center positions of the first region and the second region is also substantially constant. Therefore, the deviation can be estimated in advance from the displacement of the optical axis, and the position deviated by the predetermined distance from the center position of the first region can be acquired as the center position of the second region.
  • The position acquisition unit may estimate the second region by collating the second image with the first region, and may acquire the position of the estimated second region.
  • A more accurate position of the second region can be obtained by collating the second image with the first region and searching for the region corresponding to the second image near the center of the first region.
  • Since the search range is limited to the first region, the time and amount of calculation required for estimating the second region can be reduced compared with the case where the second image is collated with the entire reference image.
  • In the collation of the second image with the first region, the position acquisition unit may search for the second region corresponding to the second image within a predetermined distance from the center position of the first region, the predetermined distance being a distance corresponding to the guaranteed range of the amount of displacement of the optical axis between the objective lens of the first magnification and the objective lens of the second magnification.
  • The amount of deviation between the center positions of the imaging range of the first image and the imaging range of the second image falls within the guaranteed range of the optical-axis displacement between the two objective lenses. Therefore, by limiting the search area for the second image to the range corresponding to the guaranteed range, the second region can be estimated efficiently and reliably.
  • The sample may be a semiconductor circuit board, and the reference image may be an image of a design drawing of the semiconductor circuit board.
  • Another aspect of the present invention is a microscope system including: an optical microscope having an objective lens arranged to face the sample, a magnification changing unit that changes the magnification of the objective lens between a first magnification and a second magnification, the first magnification being lower than the second magnification, and an imaging unit that images the sample through the objective lens; and any one of the image processing devices described above.
  • The image processing device may further include an image acquisition unit that acquires images captured by the imaging unit, and the image acquisition unit may, in conjunction with the magnification-changing operation of the magnification changing unit, select the first image and the second image from the images acquired from the imaging unit and input the selected first and second images to the sample image input unit. According to this configuration, the process of selecting the first image and the second image from the images captured by the imaging unit and inputting them to the sample image input unit can be automated.
  • The microscope system may further include a control unit that controls the optical microscope and the image acquisition unit, and the image processing device may further include an evaluation unit that evaluates whether or not the position of the second region has been accurately acquired by the position acquisition unit.
  • When the evaluation unit evaluates that the position of the second region has not been accurately acquired, the control unit may control the optical microscope and the image acquisition unit so that the first image is captured again and input to the sample image input unit, and the alignment unit may estimate the first region using the re-captured first image.
  • According to this configuration, when the position of the second region is not accurately acquired due to, for example, a failure in estimating the first region, this is detected by the evaluation unit, the first image is captured again, the first region is re-estimated, and the position of the second region is re-acquired based on the re-estimated first region. In this way, the evaluation and the re-acquisition of the position of the second region can be automated.
  • Another aspect of the present invention is an image processing method for processing a first image and a second image of a sample, the first image and the second image being microscope images captured through objective lenses having a first magnification and a second magnification, respectively, the first magnification being lower than the second magnification, and the optical axes of the two objective lenses being the same as or substantially the same as each other. The method includes: inputting the first image and the second image; inputting a reference image corresponding to the first image and the second image, the reference image being an image of the sample covering a range wider than the imaging ranges of the first and second images; estimating, by collating the first image with the reference image, a first region of the reference image corresponding to the imaging range of the first image; and acquiring, based on the estimated first region, the position of a second region of the reference image corresponding to the imaging range of the second image.
  • The image processing device 1 processes microscope images A and B (see FIGS. 2B and 2C) of a sample S (see FIG. 2A) acquired during observation of the sample S with an optical microscope, for example in the visual inspection of a semiconductor circuit board S, and presents to the operator, or notifies the system of, the position of the imaging range of the microscope image B.
  • the image processing device 1 includes a sample image input unit 2, a reference image input unit 3, an alignment unit 4, and a position acquisition unit 5.
  • An example of an image processing device 1 is a computer including a processor and a storage unit having a RAM, a ROM, and any other storage device.
  • An image processing program is stored in the storage unit. When the processor executes the process according to the image processing program, the functions described later in each part 2, 3, 4, and 5 are realized.
  • the sample image input unit 2 is connected to, for example, a camera attached to an optical microscope, and the first image A and the second image B are input from the camera.
  • the camera captures an optical image of the sample S formed by the objective lens of the optical microscope, and acquires a first image A and a second image B which are digital images of the sample S.
  • the sample S is a semiconductor circuit board such as a DRAM or an imager. A large number of identical or similar circuit patterns are regularly arranged on the semiconductor circuit board S.
  • the first image A is an image captured through the objective lens of the first magnification
  • the second image B is an image captured through the objective lens of the second magnification.
  • the second magnification is a high magnification for visual inspection of the sample S, and the first magnification is lower than the second magnification. Therefore, the shooting range of the first image A is wider than the shooting range of the second image B.
  • the area R1 is the first area corresponding to the shooting range of the first image A
  • the area R2 is the second area corresponding to the shooting range of the second image B.
  • FIG. 3 illustrates a method of determining the first magnification and the photographing region R1.
  • the sample S includes a repeating region P1 in which the same or similar circuit pattern is regularly repeated, and a characteristic region P2 located around the repeating region P1 and having a characteristic circuit pattern.
  • the first magnification and the imaging region R1 are determined so that the imaging range of the first image A includes the boundary region P3 between the repeating region P1 and the feature region P2.
  • an optical microscope includes a first objective lens having a first magnification, a second objective lens having a second magnification, and a revolver holding the first objective lens and the second objective lens. The rotation of the revolver causes the first objective lens and the second objective lens to be selectively arranged on the optical path.
  • Alternatively, the optical microscope may include an objective lens having a zoom function capable of changing the magnification between the first magnification and the second magnification. In either case, the optical axis of the objective lens of the first magnification and the optical axis of the objective lens of the second magnification are the same as or substantially the same as each other; therefore, as shown in FIG. 2A, the center position of the imaging range of the first image A and the center position of the imaging range of the second image B coincide with or substantially coincide with each other.
  • the reference image input unit 3 is connected to, for example, an external device of the image processing device 1, and the reference image C corresponding to the first image A and the second image B is input from the external device.
  • the image processing device 1 may include a memory for storing the reference image C, and the reference image C may be input from the memory to the reference image input unit 3.
  • the reference image C is an image of the sample S having a range wider than the shooting range of the first image A and the shooting range of the second image B.
  • the reference image C may be an image of the design drawing of the semiconductor circuit board, and the CAD data of the design drawing may be input to the reference image input unit 3.
  • the reference image C may be a microscope image of the sample S taken through an objective lens having a magnification lower than the first magnification.
  • the alignment unit 4 receives the first image A and the reference image C from the image input units 2 and 3, and aligns the first image A with respect to the reference image C. That is, the alignment unit 4 collates the first image A with the reference image C, and searches and finds a region having the highest degree of similarity with the first image A from within the reference image C. Then, the alignment unit 4 estimates that the region in which the first image A is aligned is the first region R1.
  • An existing alignment algorithm is used to align the first image A with respect to the reference image C.
  • First, an edge image is created from the first image A by detecting the edges of the first image A.
  • For edge detection, classical filtering or deep-learning-based edge estimation (see, for example, Ruohui Wang, "Edge Detection Using Convolutional Neural Network," Advances in Neural Networks – ISNN 2016, 13th International Symposium on Neural Networks, pp. 12-20, 2016) is used.
  • Next, a kernel of a predetermined size (m pixels × n pixels) taken from the edge image is moved and rotated over the reference image C, and the SAD (Sum of Absolute Differences), which is the sum of the absolute values of the differences between pixel values, is calculated at each position.
  • the size of the kernel is adjusted based on the size of the first image A obtained from the magnification of the optical system of the optical microscope and the actual size information of the reference image C.
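The edge-image-plus-SAD matching described above can be sketched in a few lines of Python. This is only an illustration, not code from the patent: `edge_image` and `sad_match` are hypothetical names, the edge operator is a simple gradient magnitude rather than the filtering or CNN-based estimation mentioned above, and the kernel rotation step is omitted for brevity.

```python
import numpy as np

def edge_image(img):
    # Approximate edge strength as the gradient magnitude (stand-in for
    # the classical filtering or CNN edge estimation in the text).
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def sad_match(template, reference):
    # Slide the template over the reference image and return the top-left
    # position with the smallest SAD (sum of absolute differences).
    th, tw = template.shape
    rh, rw = reference.shape
    best_pos, best_sad = None, np.inf
    for y in range(rh - th + 1):
        for x in range(rw - tw + 1):
            sad = np.abs(reference[y:y + th, x:x + tw] - template).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos, best_sad
```

In practice the exhaustive double loop would be replaced by an optimized template-matching routine; the sketch only shows the SAD criterion itself.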
  • The position acquisition unit 5 acquires the position of the second region R2 in the reference image C based on the first region R1 estimated by the alignment unit 4. Specifically, the position acquisition unit 5 calculates the size of the second region R2 from the size of the first region R1 based on the size ratio between the imaging range of the first image A and the imaging range of the second image B. Further, the position acquisition unit 5 takes the center position of the first region R1 as the center position of the second region R2. As a result, the position acquisition unit 5 obtains the position of the second region R2 in the reference image C.
  • the size ratio of the photographing range is calculated from, for example, information on the actual field of view of the objective lens of the first magnification and the actual field of view of the objective lens of the second magnification.
  • the information regarding the actual field of view is, for example, numerical values of the first magnification and the second magnification, and is input from the optical microscope to the image processing device 1.
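The size-ratio computation above can be illustrated as follows. This is a hypothetical sketch: `second_region`, `fov1`, and `fov2` (the actual fields of view of the first- and second-magnification objective lenses, e.g. in micrometres) are assumed names, not identifiers from the patent.

```python
def second_region(r1_center, r1_size, fov1, fov2):
    """Derive the second region R2 in the reference image from the
    estimated first region R1.  r1_center and r1_size are in
    reference-image pixels; fov1 and fov2 are the actual fields of view
    of the two objective lenses in the same physical unit."""
    ratio = fov2 / fov1                       # imaging-range size ratio
    r2_size = (r1_size[0] * ratio, r1_size[1] * ratio)
    # Centers coincide because the optical axes are the same or
    # substantially the same.
    return r1_center, r2_size
```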
  • the position acquisition unit 5 outputs information on the position of the second image B and the second region R2.
  • the output information on the positions of the second image B and the second region R2 is presented to the operator, for example, by being displayed on a display device connected to the image processing device 1.
  • the operator observes a wide range of the sample S at a low magnification through the objective lens of the first magnification, and determines the observation area of the sample S. Subsequently, the operator switches the magnification of the objective lens to the second magnification while maintaining the position of the sample S, and observes the observation region at a high magnification.
  • the first image A and the second image B are acquired by the camera before and after the magnification is switched, and the first image A and the second image B are input from the camera to the image processing device 1 (step S1). Further, the reference image C is input to the image processing device 1 (step S2).
  • the first image A is sent to the alignment unit 4 via the sample image input unit 2
  • the second image B is sent to the position acquisition unit 5 via the sample image input unit 2.
  • the reference image C is sent to the alignment unit 4 and the position acquisition unit 5 via the reference image input unit 3.
  • the first image A is aligned with the reference image C (step S3), and the first region R1 in the reference image C is estimated (step S4).
  • the position acquisition unit 5 acquires the position of the second region R2 in the reference image C with reference to the estimated first region R1 (step S5).
  • the estimation results of the positions of the second image B and the second region R2 are output from the image processing device 1 and presented to the operator (step S6).
  • By collating the design drawing, which is the reference image C, with the second image B, which is an actual image of the sample S, at the acquired position, the operator can detect defects and the like through strict comparison.
  • the high magnification second image B may include only identical or similar regions.
  • Also, depending on the sample, the contrast of the high-magnification second image B is low. Even when such a second image B is aligned with the reference image C, misalignment is likely to occur.
  • the low-magnification first image A includes, in addition to the same or similar regions, characteristic regions that are effective for alignment with respect to the reference image C.
  • Further, owing to the relationship between the optical resolution and the pixel resolution of the image sensor, the contrast of the low-magnification first image A is often higher than that of the second image B.
  • As a result, the evaluation function used for alignment varies sharply with misalignment. Therefore, the first image A can be aligned with the reference image C with high positional accuracy.
  • An optical microscope used for visual inspection employs multiple objective lenses or zoom optics of different magnifications whose optical axes coincide, or substantially coincide within the guaranteed accuracy of the microscope, so that images can be acquired at the two magnifications. That is, the positional relationship between the imaging ranges of the low-magnification first image A and the high-magnification second image B is guaranteed. Further, the size relationship between the first region R1 and the second region R2 in the reference image C is defined by the actual fields of view of the objective lenses of the first and second magnifications. By utilizing this positional relationship and size relationship, the accurate position of the imaging range of the high-magnification second image B can be obtained from the position of the first region R1 estimated with high accuracy.
  • In the above embodiment, the position acquisition unit 5 acquires the center position of the first region R1 as the center position of the second region R2; instead, a position deviated from the center position of the first region R1 by a predetermined distance may be acquired as the center position of the second region R2.
  • The predetermined distance is a distance according to the amount of displacement between the optical axes of the objective lens of the first magnification and the objective lens of the second magnification.
  • When the magnification is switched, the optical axis may be slightly displaced in the direction orthogonal to itself, producing a deviation between the center position of the first region R1 and the center position of the second region R2 corresponding to the displacement amount. Since the displacement amount and direction of the optical axis of the objective lens are substantially constant, the deviation amount and direction between the two center positions are substantially constant and can be obtained in advance.
  • The predetermined distance may be a fixed value, or it may be corrected each time.
  • The amount of displacement of the optical axis of the objective lens may differ each time the magnification is changed. Therefore, for example, the displacement of the optical axis may be measured at each magnification change, and the predetermined distance corrected based on the measured displacement.
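The correction by a predetermined distance might look like the following sketch; the function name, the micrometre units, and the pixel scale `um_per_px` are illustrative assumptions rather than anything specified in the patent.

```python
def r2_center_with_offset(r1_center, axis_shift_um, um_per_px):
    """Shift the R1 center by the pre-measured optical-axis displacement
    (converted from micrometres to reference-image pixels) to obtain
    the R2 center."""
    dy, dx = axis_shift_um
    return (r1_center[0] + dy / um_per_px,
            r1_center[1] + dx / um_per_px)
```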
  • the image processing apparatus 10 and the image processing method according to the second embodiment of the present invention will be described with reference to the drawings.
  • the image processing device 10 includes a sample image input unit 2, a reference image input unit 3, a first alignment unit 41, and a second alignment unit 51.
  • The operation of the image processing device 10, corresponding to the image processing method according to the present embodiment, differs from that of the image processing device 1 of the first embodiment in that step S7 is performed instead of step S5.
  • the first alignment unit 41 is the same as the alignment unit 4 of the first embodiment.
  • The second alignment unit 51 is a position acquisition unit that acquires the position of the second region R2 in the reference image C, but it acquires the position of the second region R2 by a method different from that of the position acquisition unit 5 of the first embodiment.
  • The second alignment unit 51 aligns the second image B with respect to the first region R1 in the reference image C by the same method as the alignment unit 4 (step S7). That is, the second alignment unit 51 collates the second image B with the first region R1, and searches for and finds the region within the first region R1 having the highest degree of similarity to the second image B. Then, the second alignment unit 51 estimates that the region with which the second image B is aligned is the second region R2, and acquires the position of the estimated second region R2 (step S6).
  • Compared with the case where the second image B is collated with the entire reference image C to estimate the second region R2, the search range is restricted to within the first region R1. Therefore, the time required for the alignment of the second image B can be shortened. In addition, misalignment due to the repeating pattern of identical or similar regions in the sample S can be prevented, and the second region R2 can be estimated accurately.
  • The search range in the alignment of the second image B may be the entire first region R1, or it may be a range narrower than the first region R1 centered on the center of the first region R1.
  • For example, in the collation of the second image B with the first region R1, the second alignment unit 51 may search for the second region R2 within a predetermined distance from the center position of the first region R1.
  • the predetermined distance is a distance corresponding to the guaranteed range of the displacement amount of the optical axis of the objective lens 62 between the first magnification and the second magnification.
  • The amount of deviation between the center positions of the imaging range of the first image A and the imaging range of the second image B falls within the guaranteed range of the optical-axis displacement of the objective lens between the first and second magnifications. Therefore, by limiting the search area for the second image B to the range corresponding to the guaranteed range, the calculation time and load required for the alignment can be suppressed, and the second region R2 can be estimated efficiently and reliably.
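The restricted search of this embodiment can be sketched as a SAD search confined to a window around the center of the first region R1. The function name and the pixel-based `max_shift` parameter (standing in for the distance derived from the guaranteed optical-axis displacement) are illustrative assumptions, and rotation is again omitted.

```python
import numpy as np

def sad_match_restricted(template, region, max_shift):
    """Search for the template (second image B) only within max_shift
    pixels of the centered placement inside the first region R1,
    instead of over the whole region."""
    th, tw = template.shape
    rh, rw = region.shape
    cy, cx = (rh - th) // 2, (rw - tw) // 2   # centered placement of B in R1
    best_pos, best_sad = None, np.inf
    for y in range(max(0, cy - max_shift), min(rh - th, cy + max_shift) + 1):
        for x in range(max(0, cx - max_shift), min(rw - tw, cx + max_shift) + 1):
            sad = np.abs(region[y:y + th, x:x + tw] - template).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos
```

Limiting the loops to the guaranteed-range window is what reduces both the computation and the risk of locking onto a distant repeat of the same circuit pattern.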
  • the microscope system 100 and the image processing method according to the third embodiment of the present invention will be described with reference to the drawings.
  • The microscope system 100 includes an optical microscope 60, an image processing device 20, and a control unit 80 that is connected to the optical microscope 60 and the image processing device 20 and controls them.
  • The operation of the image processing device 20, corresponding to the image processing method according to the present embodiment, differs from that of the image processing device 1 of the first embodiment in that steps S8 to S10 are performed instead of step S1.
  • The optical microscope 60 includes a stage 61, an objective lens 62 arranged to face the sample on the stage 61, a magnification changing unit 63 that changes the magnification of the objective lens 62 between the first magnification and the second magnification, and a digital camera (imaging unit) 64 that images the sample S through the objective lens 62.
  • The optical microscope 60 notifies the control unit 80 of the timing of a magnification change by the magnification changing unit 63.
  • The magnification changing unit 63 is a revolver that holds the objective lens 62 of the first magnification and the objective lens 62 of the second magnification; by rotating the revolver 63, the magnification of the objective lens 62 facing the sample and used for observation is changed.
  • Alternatively, the magnification changing unit 63 may be a zoom mechanism of an objective lens 62 having a zoom function.
  • a mechanism for detecting the state of the objective lens 62 may be provided in order to obtain information on the magnification.
  • the image processing device 20 includes an image acquisition unit 6 in addition to the sample image input unit 2, the reference image input unit 3, the alignment unit 4, and the position acquisition unit 5.
  • the image processing device 20 may include a first alignment unit 41 and a second alignment unit 51 in place of the alignment unit 4 and the position acquisition unit 5.
  • The image acquisition unit 6 acquires images from the camera 64 (step S8), selects the first image A and the second image B from the acquired images in conjunction with the magnification-changing operation of the magnification changing unit 63 (step S9), and transfers and inputs the selected first image A and second image B to the sample image input unit 2 (step S10).
  • the image acquisition unit 6 has a FIFO (first in, first out) memory 6a.
  • the camera 64 captures the sample S at regular time intervals and transmits the acquired image to the FIFO memory 6a.
  • The FIFO memory 6a holds the most recent images spanning a certain period of time.
  • The main controller 80 causes the image acquisition unit 6 to select the first image A and the second image B in response to the notification from the optical microscope 60. For example, the operator observes a wide area of the sample S at the first magnification to determine the observation area, switches from the first magnification to the second magnification, and observes the observation area at the higher magnification. In this case, based on the notification that the magnification was changed from the first magnification to the second magnification at time T, the image acquisition unit 6 selects, from the images stored in the FIFO memory 6a, the image acquired at time T-Δ1 immediately before time T as the first image A, and the image acquired at time T+Δ2 immediately after time T as the second image B. In this way, a pair of images A and B whose imaging-range centers coincide or substantially coincide with each other can be identified and selected based on the timing at which the magnification changing unit 63 changes the magnification.
  • The selected images A and B do not necessarily have to be the images immediately before and after time T; any images acquired within the periods in which the first-magnification objective lens 62 and the second-magnification objective lens 62 are respectively in use may be used.
  • In this way, the first image A and the second image B are automatically selected, and the pair is automatically input to the sample image input unit 2. This eliminates the need for the operator to select the first image A and the second image B, improving the efficiency of observing the sample S.
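The timed selection from the FIFO memory described above can be sketched as follows. This is a minimal illustration assuming a Python buffer of timestamped frames; the guard intervals Δ1 and Δ2 (here `dt_before` and `dt_after`) are hypothetical parameters whose values are not specified by the document:

```python
from collections import deque

class FrameBuffer:
    """Ring buffer of the most recent (timestamp, frame) pairs,
    mimicking the FIFO memory 6a."""

    def __init__(self, maxlen=100):
        self.frames = deque(maxlen=maxlen)

    def push(self, timestamp, frame):
        self.frames.append((timestamp, frame))

    def select_pair(self, t_change, dt_before=0.1, dt_after=0.1):
        """Select image A (acquired just before the magnification change
        at t_change) and image B (acquired just after it)."""
        before = [(t, f) for t, f in self.frames if t <= t_change - dt_before]
        after = [(t, f) for t, f in self.frames if t >= t_change + dt_after]
        if not before or not after:
            return None  # no suitable frame on one side of the change
        image_a = max(before, key=lambda tf: tf[0])[1]  # latest frame before T - Δ1
        image_b = min(after, key=lambda tf: tf[0])[1]   # earliest frame after T + Δ2
        return image_a, image_b
```

Keeping only timestamps and frames in the buffer is enough here because the magnification-change notification, not image content, drives the selection.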
  • The image processing device 20 does not necessarily have to include the image acquisition unit 6.
  • For example, the operator may operate the camera 64 to capture the first image A and the second image B immediately before and immediately after switching from the first magnification to the second magnification, and the captured images A and B may be input to the sample image input unit 2.
  • Alternatively, the system may be configured so that the operator selects a pair of the first image A and the second image B from the images acquired by the camera 64, and the selected images A and B are input to the sample image input unit 2.
  • The microscope system 200 and the image processing method according to the fourth embodiment of the present invention will be described with reference to the drawings.
  • Only configurations different from those of the first to third embodiments will be described; configurations common to the first to third embodiments are designated by the same reference numerals, and their description is omitted.
  • The microscope system 200 differs from the third embodiment in that it evaluates whether the position of the second region R2 has been accurately acquired by the second alignment unit 51 and, if not, automatically reacquires the first image A.
  • The microscope system 200 includes an optical microscope 60, an image processing device 30, and a main controller (control unit) 80 connected to the optical microscope 60 and the image processing device 30.
  • The operation of the image processing apparatus 30, corresponding to the image processing method according to the present embodiment, differs from that of the third embodiment in that steps S11 to S16 are added after the alignment of the second image in step S7.
  • The image processing device 30 includes an evaluation unit 7 in addition to the sample image input unit 2, the reference image input unit 3, the first alignment unit 41, the second alignment unit 51, and the image acquisition unit 6.
  • If the alignment of the first image A with respect to the reference image C fails, the second image B is not properly aligned with respect to the first region R1, and as a result the position of the second region R2 may not be acquired accurately.
  • The evaluation unit 7 evaluates whether the position of the second region R2 has been accurately acquired by the second alignment unit 51 (step S11). Specifically, the evaluation unit 7 evaluates whether the alignment of the second image B with respect to the first region R1 was performed accurately, based on the degree of similarity between the second image B and the second region R2 estimated by the second alignment unit 51.
  • As the similarity measure, the evaluation unit 7 calculates the SAD (sum of absolute differences) between the second image B and the second region R2.
  • When the SAD is sufficiently small, the evaluation unit 7 evaluates that the alignment of the second image B with respect to the first region R1 has been performed accurately (YES in step S11).
  • In this case, the evaluation unit 7 notifies the second alignment unit 51 of the evaluation result, and the second alignment unit 51 outputs information on the position of the second region R2 (step S6).
  • When the SAD is large, the evaluation unit 7 evaluates that the alignment of the second image B with respect to the first region R1 was not performed accurately (NO in step S11).
  • In this case, the evaluation unit 7 notifies the main controller 80 of the evaluation result indicating inaccuracy (step S12), and the second image B is held in the memory 8 (step S13).
  • In response to this notification, the main controller 80 controls the magnification changing unit 63 and the camera 64 of the optical microscope 60 so that the first image A is re-imaged and input to the sample image input unit 2.
  • The main controller 80 transmits, to the optical microscope 60, a magnification change signal for changing the magnification of the objective lens 62 to the first magnification.
  • The magnification changing unit 63 changes the magnification of the objective lens 62 to the first magnification in response to the magnification change signal, and the camera 64 then re-images the first image A.
  • The image acquisition unit 6 selects the first image A in conjunction with the change to the first magnification and inputs it to the sample image input unit 2 (steps S14 to S16).
  • The first alignment unit 41 re-estimates the first region R1 using the re-imaged first image A (steps S3 and S4), and the second alignment unit 51 acquires the position of the second region R2 again (step S7).
  • In this way, the estimation accuracy of the second region R2 is evaluated, and when the accuracy is low, the imaging of the first image, the estimation of the first region, and the estimation of the second region are re-executed. As a result, the exact position of the second region R2 can be reliably obtained.
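The SAD-based accept/reject evaluation used in step S11 can be illustrated with the following sketch, assuming grayscale images as NumPy arrays; the per-pixel threshold is a hypothetical parameter (the document does not state how the threshold is chosen):

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized grayscale patches."""
    return float(np.abs(a.astype(np.int64) - b.astype(np.int64)).sum())

def alignment_is_accurate(second_image, estimated_region, threshold):
    """Accept the alignment when the mean per-pixel SAD between the second
    image and the estimated second region is at or below the threshold."""
    score = sad(second_image, estimated_region) / second_image.size
    return score <= threshold
```

Normalizing by the pixel count makes the threshold independent of the image size, which is one simple way to keep a single threshold usable across magnifications.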
  • In the present embodiment, the optical microscope 60 automatically switches the magnification of the objective lens 62; instead, the operator may manually switch the magnification of the objective lens 62 based on, for example, an evaluation result of inaccuracy displayed on a display device.
  • The image processing device 30 includes the first alignment unit 41 and the second alignment unit 51, but may instead include the alignment unit 4 and the position acquisition unit 5.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Microscopes, Condensers (AREA)

Abstract

An image processing device (1) comprising: a sample image input unit (2) to which first and second images of a sample are inputted, the first image and the second image being microscope images captured respectively through an objective lens having a first magnification and an objective lens having a second magnification that is higher than the first magnification; a reference image input unit (3) to which a reference image corresponding to the first and second images is inputted, the reference image being an image of the sample in a wider range than the imaging range of the first and second images; a positioning unit (4) for estimating a first region in the reference image, the first region corresponding to the imaging range of the first image, by comparing the first image with the reference image; and a position acquisition unit (5) for acquiring the position of a second region in the reference image, the second region corresponding to the imaging range of the second image, on the basis of the estimated first region.

Description

Image processing device, microscope system, and image processing method
The present invention relates to an image processing apparatus, a microscope system, and an image processing method.
Conventionally, a semiconductor inspection device that inspects a semiconductor circuit board using a microscope is known (see, for example, Patent Document 1). The semiconductor inspection device of Patent Document 1 has a function of navigating to defect positions on the semiconductor circuit board using CAD data of the board, and displays a microscope image of the board superimposed on the CAD data.
On the other hand, semiconductor circuit boards are also inspected visually: an operator observes the board at high magnification while manually operating a microscope. In this visual inspection, the operator manually moves the stage of the microscope to move the observation area over the board, and thereby observes the entire board or a wide area of it.
Patent Document 1: JP-A-2007-115991
In order to display a high-magnification microscope image superimposed on CAD data, information on the position of the imaging range of the microscope image is required. When the position of the observation area of the microscope is controlled by a motorized stage as in Patent Document 1, the imaging range of the microscope image can be calculated from the stage position. In contrast, when the operator manually moves the observation area of the sample, the position of the imaging range of the microscope image must be acquired by aligning the high-magnification microscope image with the CAD data through image processing.
In semiconductor circuit boards such as DRAMs or imagers, identical or similar regions are regularly arranged. Furthermore, in visual inspection, a narrow area of the sample is magnified using a high-magnification objective lens. Therefore, when a high-magnification microscope image contains only identical or similar regions, alignment between the CAD data and the image may fail. In addition, for samples such as glass substrates, the contrast of a high-magnification microscope image is low, and alignment may likewise fail. For these reasons, it is difficult to obtain the accurate position of the imaging range of a high-magnification microscope image.
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an image processing apparatus, a microscope system, and an image processing method capable of accurately acquiring the position of the imaging range of a high-magnification image.
In order to achieve the above object, the present invention provides the following means.
One aspect of the present invention is an image processing device comprising: a sample image input unit to which a first image and a second image of a sample are input, the first image and the second image being microscope images captured through an objective lens of a first magnification and an objective lens of a second magnification, respectively, the first magnification being lower than the second magnification, and the optical axes of the first-magnification objective lens and the second-magnification objective lens being identical or substantially identical to each other; a reference image input unit to which a reference image corresponding to the first image and the second image is input, the reference image being an image of the sample covering a range wider than the imaging ranges of the first image and the second image; an alignment unit that estimates a first region of the reference image corresponding to the imaging range of the first image by collating the first image with the reference image; and a position acquisition unit that acquires the position of a second region of the reference image corresponding to the imaging range of the second image, based on the first region estimated by the alignment unit.
According to this aspect, the first image and the second image of the sample are input to the sample image input unit, and the reference image of the sample is input to the reference image input unit. The alignment unit then estimates the first region of the reference image corresponding to the imaging range of the first image.
Since the first image and the second image are microscope images captured through objective lenses having the same or substantially the same optical axis, the center position of the first region and the center position of the second region of the reference image, corresponding to the imaging range of the second image, coincide or substantially coincide with each other. Therefore, the position acquisition unit can acquire the position of the second region in the reference image by referring to the estimated first region.
Here, since the imaging range of the low-magnification first image is wider than that of the high-magnification second image, the first image is highly likely to contain many feature points effective for alignment with the reference image. Therefore, the first region can be estimated with high positional accuracy. Then, based on the relationship that the center positions of the first region and the second region coincide or substantially coincide with each other, the accurate position of the second region in the reference image can be acquired.
In the above aspect, the position acquisition unit may acquire, as the center position of the second region, a position shifted from the center position of the first region by a predetermined distance, the predetermined distance corresponding to the displacement of the optical axis of the objective lens between the first magnification and the second magnification.
For example, when the magnification of the objective lens is changed by rotating the revolver of the optical microscope or by the zoom function of the objective lens, the optical axis of the first-magnification objective lens and that of the second-magnification objective lens may be slightly displaced from each other in the direction perpendicular to the optical axis. In this case, the displacement of the optical axis is substantially constant, and so is the resulting shift between the center positions of the first region and the second region. Therefore, the shift of the center position can be estimated in advance from the displacement of the optical axis, and the position shifted by the predetermined distance from the center position of the first region can be acquired as the center position of the second region.
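The center-offset computation in this aspect reduces to simple arithmetic. The following sketch is illustrative only: the function name, the micrometre units, and the pixels-per-micrometre scale factor are assumptions, standing in for whatever calibration convention an actual system uses:

```python
def second_region_center(first_center, axis_offset, scale):
    """Shift the first region's center by the pre-calibrated optical-axis
    displacement, converted into reference-image pixels.

    first_center: (x, y) center of the first region, in reference-image pixels.
    axis_offset:  (dx, dy) optical-axis displacement, in micrometres
                  (calibrated offline for the lens pair).
    scale:        reference-image pixels per micrometre.
    """
    cx, cy = first_center
    dx, dy = axis_offset
    return (cx + dx * scale, cy + dy * scale)
```

Because the displacement between a given pair of objective lenses is substantially constant, the offset can be measured once and then reused for every image pair.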
In the above aspect, the position acquisition unit may estimate the second region by collating the second image with the first region, and acquire the position of the estimated second region.
With this configuration, a more accurate position of the second region can be obtained by collating the second image with the first region and searching for the region corresponding to the second image from near the center of the first region. In addition, since the search range is limited to the first region, the time and amount of computation required to estimate the second region can be reduced compared with collating the second image against the entire reference image.
In the above aspect, in collating the second image with the first region, the position acquisition unit may search for the second region corresponding to the second image within a predetermined distance from the center position of the first region, the predetermined distance corresponding to the guaranteed range of the optical-axis displacement between the first-magnification objective lens and the second-magnification objective lens.
Normally, the shift of the center position between the imaging range of the first image and that of the second image falls within the guaranteed range of the optical-axis displacement between the two objective lenses. Therefore, by limiting the search area for the second image within the first region to the range corresponding to this guaranteed range, the second region can be estimated efficiently and reliably.
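A radius-limited template search of this kind could be sketched as follows, assuming grayscale NumPy arrays and SAD as the similarity measure (one possible choice; the document does not prescribe a specific measure for this search):

```python
import numpy as np

def match_within_radius(region, template, center, radius):
    """Exhaustive SAD template matching over `region`, restricted to candidate
    positions whose centers lie within `radius` pixels of `center`
    (the first region's center, in (row, col) order)."""
    rh, rw = region.shape
    th, tw = template.shape
    best_score, best_pos = None, None
    for y in range(rh - th + 1):
        for x in range(rw - tw + 1):
            cy, cx = y + th / 2, x + tw / 2
            if (cy - center[0]) ** 2 + (cx - center[1]) ** 2 > radius ** 2:
                continue  # outside the guaranteed axis-displacement range
            score = np.abs(region[y:y+th, x:x+tw].astype(np.int64)
                           - template.astype(np.int64)).sum()
            if best_score is None or score < best_score:
                best_score, best_pos = score, (y, x)
    return best_pos  # top-left corner of the best match, or None
```

Skipping candidates outside the radius both avoids spurious matches in repetitive patterns and cuts the number of SAD evaluations, which is the efficiency gain the aspect describes.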
In the above aspect, the sample may be a semiconductor circuit board, and the reference image may be an image of a design drawing of the semiconductor circuit board.
Another aspect of the present invention is a microscope system comprising: an optical microscope having an objective lens arranged to face the sample, a magnification changing unit that changes the magnification of the objective lens between a first magnification and a second magnification, the first magnification being lower than the second magnification, and an imaging unit that images the sample through the objective lens; and any one of the image processing devices described above.
In the above aspect, the microscope system may include an image acquisition unit that acquires images captured by the imaging unit, and the image acquisition unit may, in conjunction with the magnification-changing operation of the magnification changing unit, select the first image and the second image from the images acquired from the imaging unit and input the selected first and second images to the sample image input unit.
With this configuration, the process of selecting the first image and the second image from the images captured by the imaging unit and inputting them to the sample image input unit can be automated.
In the above aspect, the microscope system may further include a control unit that controls the optical microscope and the image acquisition unit, and the image processing device may further include an evaluation unit that evaluates whether the position of the second region has been accurately acquired by the position acquisition unit. When the evaluation unit evaluates that the position of the second region has not been accurately acquired, the control unit may control the optical microscope and the image acquisition unit so that the first image is captured again and input to the sample image input unit, and the alignment unit may estimate the first region using the re-captured first image.
With this configuration, when the position of the second region is not accurately acquired, for example because the estimation of the first region failed, this is detected by the evaluation, the first image is captured again, the first region is re-estimated, and the position of the second region is re-acquired based on the re-estimated first region. In this way, the evaluation and re-acquisition of the position of the second region can be automated.
Another aspect of the present invention is an image processing method for processing a first image and a second image of a sample, wherein the first image and the second image are microscope images captured through an objective lens of a first magnification and an objective lens of a second magnification, respectively, the first magnification is lower than the second magnification, and the optical axes of the two objective lenses are identical or substantially identical to each other. The first image and the second image are input; a reference image corresponding to the first image and the second image is input, the reference image being an image of the sample covering a range wider than the imaging ranges of the first image and the second image; a first region of the reference image corresponding to the imaging range of the first image is estimated by collating the first image with the reference image; and the position of a second region of the reference image corresponding to the imaging range of the second image is acquired based on the estimated first region.
According to the present invention, the position of the imaging range of a high-magnification image can be accurately acquired.
FIG. 1 is an overall configuration diagram of the image processing apparatus according to the first embodiment of the present invention.
FIG. 2A is a diagram showing an example of a reference image of a sample.
FIG. 2B is a diagram showing an example of a first image of the sample.
FIG. 2C is a diagram showing an example of a second image of the sample.
FIG. 3 is a diagram explaining a method of determining the first magnification.
FIG. 4 is a flowchart showing the operation of the image processing apparatus of FIG. 1.
FIG. 5 is an overall configuration diagram of the image processing apparatus according to the second embodiment of the present invention.
FIG. 6 is a flowchart showing the operation of the image processing apparatus of FIG. 5.
FIG. 7 is an overall configuration diagram of the microscope system according to the third embodiment of the present invention.
FIG. 8 is a flowchart showing the operation of the image processing apparatus of FIG. 7.
FIG. 9 is an overall configuration diagram of the microscope system according to the fourth embodiment of the present invention.
FIG. 10A is a flowchart showing the operation of the image processing apparatus of FIG. 9.
FIG. 10B is a continuation of the flowchart of FIG. 10A.
(First Embodiment)
The image processing apparatus 1 and the image processing method according to the first embodiment of the present invention will be described with reference to the drawings.
In the observation of a sample S using an optical microscope, for example in the visual inspection of a semiconductor circuit board S, the image processing apparatus 1 processes microscope images A and B of the sample S (see FIGS. 2A to 2C) and presents the position of the imaging range of the microscope image B to the operator or to the system.
As shown in FIG. 1, the image processing device 1 includes a sample image input unit 2, a reference image input unit 3, an alignment unit 4, and a position acquisition unit 5.
One example of the image processing device 1 is a computer including a processor and a storage unit having a RAM, a ROM, and any other storage device. An image processing program is stored in the storage unit, and the functions of the units 2, 3, 4, and 5 described below are realized by the processor executing processing in accordance with the image processing program.
The sample image input unit 2 is connected to, for example, a camera attached to the optical microscope, and the first image A and the second image B are input from the camera.
The camera captures optical images of the sample S formed by the objective lens of the optical microscope and thereby acquires the first image A and the second image B, which are digital images of the sample S.
FIGS. 2A to 2C show a reference image C (described later), the first image A, and the second image B, respectively. As shown in FIG. 2A, the sample S is a semiconductor circuit board such as a DRAM or an imager, on which a large number of identical or similar circuit patterns are regularly arranged. The first image A is an image captured through the objective lens of the first magnification, and the second image B is an image captured through the objective lens of the second magnification. The second magnification is a high magnification for visual inspection of the sample S, and the first magnification is lower than the second magnification. Therefore, the imaging range of the first image A is wider than that of the second image B. In FIG. 2A, the region R1 is the first region corresponding to the imaging range of the first image A, and the region R2 is the second region corresponding to the imaging range of the second image B.
When the sample S has a distinctive region, the first magnification and the imaging region R1 are set so that the first image A includes that region. FIG. 3 illustrates a method of determining the first magnification and the imaging region R1. In the example of FIG. 3, the sample S includes a repeating region P1, in which identical or similar circuit patterns are regularly repeated, and a feature region P2, which is located around the repeating region P1 and has a distinctive circuit pattern. The first magnification and the imaging region R1 are determined so that the imaging range of the first image A includes the boundary region P3 between the repeating region P1 and the feature region P2.
For example, the optical microscope includes a first objective lens of the first magnification, a second objective lens of the second magnification, and a revolver holding the first and second objective lenses; rotating the revolver selectively places the first objective lens or the second objective lens on the optical path. Alternatively, the optical microscope may include an objective lens having a zoom function capable of changing the magnification between the first magnification and the second magnification. In either case, the optical axis of the first-magnification objective lens and that of the second-magnification objective lens are identical or substantially identical to each other, and therefore, as shown in FIG. 2A, the center positions of the imaging ranges of the first image A and the second image B are identical or substantially identical to each other.
The reference image input unit 3 is connected to, for example, an external device of the image processing device 1, and a reference image C corresponding to the first image A and the second image B is input from the external device. Alternatively, the image processing device 1 may include a memory that stores the reference image C, and the reference image C may be input from the memory to the reference image input unit 3.
As shown in FIG. 2A, the reference image C is an image of the sample S covering a range wider than the imaging range of the first image A and the imaging range of the second image B. When the sample S is a semiconductor circuit board, the reference image C may be an image of a design drawing of the semiconductor circuit board, and CAD data of the design drawing may be input to the reference image input unit 3. Alternatively, the reference image C may be a microscope image of the sample S captured through an objective lens having a magnification lower than the first magnification.
The alignment unit 4 receives the first image A and the reference image C from the image input units 2 and 3 and aligns the first image A with the reference image C. That is, the alignment unit 4 matches the first image A against the reference image C and searches the reference image C for the region having the highest similarity to the first image A. The alignment unit 4 then estimates that the region to which the first image A has been aligned is the first region R1.
An existing alignment algorithm is used to align the first image A with the reference image C.
For example, an edge image is created from the first image A by edge detection. For the edge detection, classical filtering or edge estimation by deep learning (see, for example, Ruohui Wang, "Edge Detection Using Convolutional Neural Network", Advances in Neural Networks - ISNN 2016, 13th International Symposium on Neural Networks, pp. 12-20, 2016) may be used. Next, a kernel of a predetermined size (m pixels × n pixels) taken from the edge image or the reference image C is translated and rotated, the SAD (Sum of Absolute Differences), i.e., the sum of the absolute values of the pixel-value differences, is calculated, and the edge image is aligned based on the position at which the SAD is minimal. When the kernel needs to be enlarged or reduced, the kernel size is adjusted based on the size of the first image A obtained from the magnification of the optical system of the optical microscope and the actual-size information of the reference image C.
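The SAD-based matching step described above can be sketched as follows. This is a minimal illustration with hypothetical function and variable names; the rotation and kernel resizing mentioned in the text are omitted for brevity, and only the translation search is shown.

```python
# Minimal sketch of SAD-based alignment: an m x n kernel is slid over the
# reference image, and the position with the smallest Sum of Absolute
# Differences is taken as the alignment result. Names are illustrative.
import numpy as np

def sad_match(reference, kernel):
    """Return ((row, col), sad) where the SAD against `kernel` is minimal."""
    rh, rw = reference.shape
    kh, kw = kernel.shape
    best_pos, best_sad = None, float("inf")
    for y in range(rh - kh + 1):
        for x in range(rw - kw + 1):
            window = reference[y:y + kh, x:x + kw]
            sad = np.abs(window.astype(int) - kernel.astype(int)).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos, best_sad

# Toy example: embed a 3x3 pattern into a larger, otherwise empty image.
reference = np.zeros((8, 8), dtype=np.uint8)
pattern = np.array([[9, 1, 9], [1, 9, 1], [9, 1, 9]], dtype=np.uint8)
reference[2:5, 4:7] = pattern
pos, sad = sad_match(reference, pattern)
print(pos, sad)  # the pattern is uniquely matched at (2, 4) with SAD 0
```

In practice the inner loop would also be repeated over a set of rotation angles and scale factors, keeping the global minimum over all poses.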
The position acquisition unit 5 acquires the position of the second region R2 in the reference image C based on the first region R1 estimated by the alignment unit 4.
Specifically, the position acquisition unit 5 calculates the size of the second region R2 from the size of the first region R1 based on the size ratio between the imaging range of the first image A and the imaging range of the second image B. In addition, the position acquisition unit 5 sets the center position of the first region R1 as the center position of the second region R2. The position acquisition unit 5 can thereby obtain the position of the second region R2 in the reference image C.
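The size and center computation above can be sketched as follows. The field of view scales inversely with magnification, so the second region R2 has the size of R1 multiplied by (first magnification / second magnification) and, in this embodiment, shares its center with R1. Function names and the example values are illustrative, not from the document.

```python
# Sketch of the computation in the position acquisition unit 5:
# R2 is centered on R1 and scaled by the ratio of the two magnifications.

def second_region(r1_center, r1_size, mag1, mag2):
    """r1_center=(cx, cy) and r1_size=(w, h) describe the first region R1
    in reference-image coordinates. Returns (center, size) of R2."""
    ratio = mag1 / mag2  # e.g. a 5x -> 20x switch gives a ratio of 0.25
    w, h = r1_size
    return r1_center, (w * ratio, h * ratio)

# Example: R1 is 400 x 300 reference-image pixels, centered at (100, 80).
center, size = second_region((100.0, 80.0), (400.0, 300.0), 5, 20)
print(center, size)  # -> (100.0, 80.0) (100.0, 75.0)
```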
The size ratio of the imaging ranges is calculated, for example, from information on the actual field of view of the objective lens at the first magnification and the actual field of view of the objective lens at the second magnification. The information on the actual fields of view is, for example, the numerical values of the first and second magnifications, and is input from the optical microscope to the image processing device 1.
The position acquisition unit 5 outputs the second image B and information on the position of the second region R2. The output second image B and position information of the second region R2 are presented to the operator, for example, by being displayed on a display device connected to the image processing device 1.
Next, the operation of the image processing device 1, corresponding to the image processing method according to the present embodiment, will be described with reference to FIG. 4.
The operator observes a wide area of the sample S at the low first magnification through the objective lens and determines an observation region of the sample S. Subsequently, the operator switches the magnification of the objective lens to the second magnification while maintaining the position of the sample S and observes the observation region at the high magnification. The first image A and the second image B are acquired by the camera before and after the magnification is switched, and are input from the camera to the image processing device 1 (step S1). The reference image C is also input to the image processing device 1 (step S2).
Within the image processing device 1, the first image A is sent to the alignment unit 4 via the sample image input unit 2, the second image B is sent to the position acquisition unit 5 via the sample image input unit 2, and the reference image C is sent to the alignment unit 4 and the position acquisition unit 5 via the reference image input unit 3.
In the alignment unit 4, the first image A is aligned with the reference image C (step S3), and the first region R1 in the reference image C is estimated (step S4). Next, the position acquisition unit 5 acquires the position of the second region R2 in the reference image C with reference to the estimated first region R1 (step S5). Then, the second image B and the estimation result of the position of the second region R2 are output from the image processing device 1 and presented to the operator (step S6).
The operator can, for example, compare the second region R2 in the reference image C with the second image B, thereby strictly comparing the design drawing, which is the reference image C, with the second image B, which is an actual image of the sample S, and performing defect detection and the like.
As shown in FIGS. 2A to 2C, in the case of a sample S containing many regularly arranged identical or similar regions, the high-magnification second image B may contain only identical or similar regions. When such a second image B is aligned with the reference image C, alignment failures are likely to occur. Likewise, in the case of a sample S such as glass, the contrast of the high-magnification second image B is low, and aligning such a second image B with the reference image C is also prone to failure.
On the other hand, the low-magnification first image A contains, in addition to the identical or similar regions, characteristic regions that are effective for alignment with the reference image C. Furthermore, particularly in a microscope, owing to the relationship between the optical resolution and the resolution of the imager, the contrast of the low-magnification first image A is often higher than that of the second image B. For a high-contrast image, the evaluation function of the above-described alignment algorithm varies strongly with positional deviation. Therefore, the first image A can be aligned with the reference image C with high positional accuracy.
An optical microscope used for visual inspection can acquire images at two magnifications, using a plurality of objective lenses of different magnifications or a zoom optical system, with the optical axes coinciding or substantially coinciding within the guaranteed accuracy of the optical microscope. That is, the positional relationship between the imaging ranges of the low-magnification first image A and the high-magnification second image B is guaranteed. Furthermore, the size relationship between the first region R1 and the second region R2 in the reference image C is defined by the actual fields of view of the objective lenses at the first and second magnifications. By utilizing these positional and size relationships, the accurate position of the imaging range of the high-magnification second image B can be obtained from the position of the first region R1 estimated with high accuracy.
In the present embodiment, the position acquisition unit 5 acquires the center position of the first region R1 as the center position of the second region R2; alternatively, a position shifted from the center position of the first region R1 by a predetermined distance may be acquired as the center position of the second region R2.
The predetermined distance corresponds to the amount of displacement between the optical axes of the objective lens at the first magnification and the objective lens at the second magnification. When the magnification of the objective lens is changed between the first magnification and the second magnification, the optical axis may be displaced slightly in a direction orthogonal to the optical axis. A shift corresponding to this displacement of the optical axis arises between the center position of the first region R1 and the center position of the second region R2. Since the amount and direction of displacement of the optical axis of the objective lens are substantially constant, the amount and direction of the shift between the center positions of the first region R1 and the second region R2 are also substantially constant and can be acquired in advance.
Therefore, based on the shift amount and shift direction acquired in advance, a position shifted from the center position of the first region R1 by the predetermined distance in the predetermined direction can be acquired as the center position of the second region R2. A more accurate position of the second region R2 can thereby be obtained.
The predetermined distance may be fixed, or may be corrected each time. The displacement of the optical axis of the objective lens when the magnification is changed may differ from one magnification change to the next. Therefore, for example, the displacement of the optical axis may be measured each time the magnification is changed, and the predetermined distance may be corrected based on the measured displacement.
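The center-offset variant above amounts to adding a pre-calibrated optical-axis shift vector to the center of R1. A small sketch, with hypothetical names and example values:

```python
# Sketch of the offset variant: the center of R2 is displaced from the
# center of R1 by a pre-calibrated optical-axis shift (dx, dy).
# Values are illustrative, not from the document.

def r2_center_with_offset(r1_center, axis_shift):
    """Apply the calibrated axis shift to the center of R1."""
    cx, cy = r1_center
    dx, dy = axis_shift
    return (cx + dx, cy + dy)

c2 = r2_center_with_offset((100.0, 80.0), (1.5, -0.5))
print(c2)  # (101.5, 79.5)
```

When the per-change correction described above is used, `axis_shift` would be replaced by the freshly measured displacement instead of a fixed calibration value.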
(Second Embodiment)
Next, an image processing device 10 and an image processing method according to a second embodiment of the present invention will be described with reference to the drawings.
In the present embodiment, configurations different from those of the first embodiment are described; configurations common to the first embodiment are given the same reference numerals and their description is omitted.
As shown in FIG. 5, the image processing device 10 includes the sample image input unit 2, the reference image input unit 3, a first alignment unit 41, and a second alignment unit 51. As shown in FIG. 6, the operation of the image processing device 10, corresponding to the image processing method according to the present embodiment, differs from that of the image processing device 1 of the first embodiment in that step S7 is performed instead of step S5.
The first alignment unit 41 is identical to the alignment unit 4 of the first embodiment.
The second alignment unit 51 is a position acquisition unit that acquires the position of the second region R2 in the reference image C, but it acquires the position of the second region R2 by a method different from that of the position acquisition unit 5 of the first embodiment. Using the same method as the alignment unit 4, the second alignment unit 51 aligns the second image B with the first region R1 in the reference image C (step S7). That is, the second alignment unit 51 matches the second image B against the first region R1 and searches the first region R1 for the region having the highest similarity to the second image B. The second alignment unit 51 then estimates that the region to which the second image B has been aligned is the second region R2, and acquires the position of the estimated second region R2 (step S6).
According to the present embodiment, compared with the case in which the second image B is matched against the entire reference image C to estimate the second region R2, the search range for the second region R2 is restricted to within the first region R1. Therefore, the time required for aligning the second image B can be shortened. In addition, alignment failures caused by the repeating pattern of identical or similar regions in the sample S can be prevented, and the second region R2 can be estimated accurately.
In the present embodiment, the search range in the alignment of the second image B may be the entire first region R1, or may be a range narrower than the first region R1 and centered on the center of the first region R1.
For example, when matching the second image B against the first region R1, the second alignment unit 51 may search for the second region R2 within a predetermined distance from the center position of the first region R1. The predetermined distance corresponds to the guaranteed range of the displacement of the optical axis of the objective lens 62 between the first magnification and the second magnification.
Normally, the shift of the center position between the imaging range of the first image A and the imaging range of the second image B falls within the guaranteed range of the displacement of the optical axis of the objective lens between the first and second magnifications. Therefore, by limiting the search area for the second image B during alignment to the range corresponding to this guaranteed range, the computation time and load required for the alignment can be suppressed, and the second region R2 can be estimated efficiently and reliably.
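The restricted search described above can be sketched by bounding the SAD scan to a window of `max_shift` pixels around the center of the first region R1. All names are illustrative; the rotation handling of the full algorithm is again omitted.

```python
# Sketch of the restricted search in the second alignment unit 51: the SAD
# search for the second image B is limited to positions whose offset from
# the center of the first region R1 is at most `max_shift` pixels,
# corresponding to the guaranteed optical-axis displacement range.
import numpy as np

def sad_match_restricted(region_r1, template, max_shift):
    rh, rw = region_r1.shape
    th, tw = template.shape
    # position that would center the template on the center of R1
    cy, cx = (rh - th) // 2, (rw - tw) // 2
    best_pos, best_sad = None, float("inf")
    for y in range(max(0, cy - max_shift), min(rh - th, cy + max_shift) + 1):
        for x in range(max(0, cx - max_shift), min(rw - tw, cx + max_shift) + 1):
            window = region_r1[y:y + th, x:x + tw]
            sad = np.abs(window.astype(int) - template.astype(int)).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos, best_sad

r1 = np.zeros((9, 9), dtype=np.uint8)
template = np.array([[5, 0], [0, 5]], dtype=np.uint8)
r1[4:6, 3:5] = template  # true position (4, 3), slightly off-center
pos, sad = sad_match_restricted(r1, template, max_shift=2)
print(pos, sad)  # (4, 3) 0
```

Compared with a full scan of R1, the number of candidate positions shrinks from roughly (rh - th)(rw - tw) to (2·max_shift + 1)², which is the source of the time saving noted above.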
(Third Embodiment)
Next, a microscope system 100 and an image processing method according to a third embodiment of the present invention will be described with reference to the drawings.
In the present embodiment, configurations different from those of the first and second embodiments are described; configurations common to the first and second embodiments are given the same reference numerals and their description is omitted.
As shown in FIG. 7, the microscope system 100 includes an optical microscope 60, an image processing device 20, and a main controller (control unit) 80 that is connected to the optical microscope 60 and the image processing device 20 and controls them. As shown in FIG. 8, the operation of the image processing device 20, corresponding to the image processing method according to the present embodiment, differs from that of the image processing device 1 of the first embodiment in that steps S8 to S10 are performed instead of step S1.
The optical microscope 60 includes a stage 61, an objective lens 62 arranged to face the sample on the stage 61, a magnification changing unit 63 that changes the magnification of the objective lens 62 between the first magnification and the second magnification, and a digital camera (imaging unit) 64 that images the sample S through the objective lens 62.
The optical microscope 60 notifies the main controller 80 of the timing of magnification changes by the magnification changing unit 63.
In the example of FIG. 7, the magnification changing unit 63 is a revolver that holds an objective lens 62 of the first magnification and an objective lens 62 of the second magnification; rotating the revolver 63 changes the magnification of the objective lens 62 that faces the sample and is used for observation.
When the objective lens 62 is a zoom optical system having a zoom function, the magnification changing unit 63 may be the objective lens 62 itself. In this case, a mechanism that detects the state of the objective lens 62 may be provided in order to obtain the magnification information.
The image processing device 20 includes an image acquisition unit 6 in addition to the sample image input unit 2, the reference image input unit 3, the alignment unit 4, and the position acquisition unit 5. The image processing device 20 may include the first alignment unit 41 and the second alignment unit 51 in place of the alignment unit 4 and the position acquisition unit 5.
As shown in FIG. 8, the image acquisition unit 6 acquires images from the camera 64 (step S8), selects the first image A and the second image B from among the images in conjunction with the magnification-changing operation of the magnification changing unit 63 on the objective lens 62 (step S9), and transfers the selected first image A and second image B to the sample image input unit 2 for input (step S10).
For example, the image acquisition unit 6 has a FIFO (first in, first out) memory 6a. The camera 64 images the sample S at regular time intervals and sends the acquired images to the FIFO memory 6a. The FIFO memory 6a holds the most recent images for a fixed period.
The main controller 80 causes the image acquisition unit 6 to execute the selection of the first image A and the second image B in response to the notification from the optical microscope 60.
For example, the operator observes a wide area of the sample S at the first magnification to determine the observation region, switches from the first magnification to the second magnification, and observes the observation region at the high magnification. In this case, based on the notification that the magnification was changed from the first magnification to the second magnification at time T, the image acquisition unit 6 selects, from among the images held in the FIFO memory 6a, the image acquired at time T-τ1 immediately before time T as the first image A and the image acquired at time T+τ2 immediately after time T as the second image B. In this way, based on the timing of the magnification change by the magnification changing unit 63, a pair of images A and B whose imaging-range centers coincide or substantially coincide with each other can be identified and selected.
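The frame selection around the switching time T can be sketched as follows. The data layout (a FIFO of timestamped frames stored as tuples) and the function name are assumptions for illustration.

```python
# Sketch of how the image acquisition unit 6 could pick the image pair out
# of a FIFO of timestamped frames, given the time T at which the
# magnification was switched: the last frame before T becomes the first
# image A, and the first frame after T becomes the second image B.

def select_pair(fifo, t_switch):
    before = [f for f in fifo if f[0] < t_switch]
    after = [f for f in fifo if f[0] > t_switch]
    image_a = max(before, key=lambda f: f[0])  # acquired at T - tau1
    image_b = min(after, key=lambda f: f[0])   # acquired at T + tau2
    return image_a, image_b

# Frames captured at a fixed interval; the switch happens at T = 1.2.
fifo = [(0.0, "f0"), (0.5, "f1"), (1.0, "f2"), (1.5, "f3"), (2.0, "f4")]
a, b = select_pair(fifo, t_switch=1.2)
print(a, b)  # (1.0, 'f2') (1.5, 'f3')
```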
The selected images A and B do not necessarily have to be the images immediately before and after time T; any images acquired respectively during the periods in which the objective lens 62 of the first magnification and the objective lens 62 of the second magnification are in use may be used.
Thus, according to the present embodiment, in conjunction with the magnification change by the magnification changing unit 63, the pair of first and second images A and B required for processing by the image processing device 20 is automatically selected from among the images acquired by the camera 64 and automatically input to the sample image input unit 2. This eliminates the operator's effort of selecting the first image A and the second image B and improves the efficiency of observing the sample S.
In the present embodiment, the image processing device 20 does not necessarily have to include the image acquisition unit 6.
For example, the device may be configured such that images A and B are input to the sample image input unit 2 when the operator operates the camera 64 to acquire the first image A and the second image B immediately before and immediately after the switch from the first magnification to the second magnification. Alternatively, the device may be configured such that the operator selects a pair of first and second images A and B from the images acquired by the camera 64, and the selected images A and B are input to the sample image input unit 2.
(Fourth Embodiment)
Next, a microscope system 200 and an image processing method according to a fourth embodiment of the present invention will be described with reference to the drawings.
In the present embodiment, configurations different from those of the first to third embodiments are described; configurations common to the first to third embodiments are given the same reference numerals and their description is omitted.
The microscope system 200 differs from the third embodiment in that it evaluates whether the position of the second region R2 has been accurately acquired by the second alignment unit 51 and, if not, automatically reacquires the first image A.
As shown in FIG. 9, the microscope system 200 includes the optical microscope 60, an image processing device 30, and the main controller (control unit) 80 connected to the optical microscope 60 and the image processing device 30. As shown in FIGS. 10A and 10B, the operation of the image processing device 30, corresponding to the image processing method according to the present embodiment, differs from that of the third embodiment in that steps S11 to S16 are added after the alignment of the second image in step S7.
The image processing device 30 includes an evaluation unit 7 in addition to the sample image input unit 2, the reference image input unit 3, the first alignment unit 41, the second alignment unit 51, and the image acquisition unit 6.
The second image B may fail to be properly aligned with the first region R1, for example because the alignment of the first image A with the reference image C has failed, and as a result the position of the second region R2 may not be acquired accurately. As shown in FIG. 10A, the evaluation unit 7 evaluates whether the position of the second region R2 has been accurately acquired by the second alignment unit 51 (step S11). Specifically, the evaluation unit 7 evaluates whether the alignment of the second image B with the first region R1 was performed accurately, based on the similarity between the second image B and the second region R2 estimated by the second alignment unit 51.
For example, the evaluation unit 7 calculates the SAD between the second image B and the second region R2.
When the SAD is less than a predetermined value, the evaluation unit 7 judges that the alignment of the second image B with the first region R1 was performed accurately (YES in step S11). In this case, as shown in FIG. 10A, the evaluation unit 7 notifies the second alignment unit 51 of the positive evaluation result, and the second alignment unit 51 outputs the information on the position of the second region R2 (step S6).
On the other hand, when the SAD is equal to or greater than the predetermined value, the evaluation unit 7 judges that the alignment of the second image B with the first region R1 was not performed accurately (NO in step S11). In this case, as shown in FIG. 10B, the evaluation unit 7 notifies the main controller 80 of the negative evaluation result (step S12), and the second image B is held in the memory 8 (step S13).
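The threshold test performed by the evaluation unit 7 can be sketched as follows. The function name and the threshold value are illustrative assumptions; in practice the threshold would be tuned to the imaging conditions.

```python
# Sketch of the evaluation in unit 7: the SAD between the second image B
# and the estimated region R2 is compared against a threshold. Below the
# threshold the alignment is judged accurate; otherwise re-acquisition of
# the first image A would be requested.
import numpy as np

def alignment_is_accurate(image_b, region_r2, threshold):
    sad = np.abs(image_b.astype(int) - region_r2.astype(int)).sum()
    return bool(sad < threshold)

b = np.array([[10, 20], [30, 40]], dtype=np.uint8)
good = np.array([[11, 19], [30, 41]], dtype=np.uint8)  # SAD = 3
bad = np.array([[60, 70], [80, 90]], dtype=np.uint8)   # SAD = 200
ok_good = alignment_is_accurate(b, good, threshold=10)
ok_bad = alignment_is_accurate(b, bad, threshold=10)
print(ok_good, ok_bad)  # True False
```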
In response to the notification of the negative evaluation result, the main controller 80 controls the magnification changing unit 63 and the camera 64 of the optical microscope 60 so that the first image A is captured again and input to the sample image input unit 2. That is, the main controller 80 transmits to the optical microscope 60 a magnification change signal for changing the magnification of the objective lens 62 to the first magnification. The magnification changing unit 63 changes the magnification of the objective lens 62 to the first magnification in response to the magnification change signal, and the camera 64 then captures the first image A again. The image acquisition unit 6 selects the first image A in conjunction with the change to the first magnification and inputs it to the sample image input unit 2 (steps S14 to S16).
The first alignment unit 41 re-estimates the first region R1 using the recaptured first image A (steps S3 and S4), and the second alignment unit 51 acquires the position of the second region R2 again using the re-estimated first region R1 and the second image B held in the memory 8 (step S7).
As described above, according to the present embodiment, the estimation accuracy of the second region R2 is evaluated, and when the estimation accuracy is low, the image capture, the estimation of the first region, and the estimation of the second region are re-executed. As a result, the exact position of the second region R2 can be reliably obtained.
In the present embodiment, the optical microscope 60 automatically switches the magnification of the objective lens 62 after it is evaluated that the position of the second region R2 was not accurately acquired. Alternatively, an operator may manually switch the magnification of the objective lens 62 based on, for example, the inaccurate evaluation result displayed on a display device.
In the present embodiment, only the first image A is captured again. Alternatively, both the first image A and the second image B may be captured again; in this case, the second region R2 is re-estimated using the recaptured pair of the first image A and the second image B.
In the present embodiment, the image processing device 30 includes the first alignment unit 41 and the second alignment unit 51. Alternatively, it may include the alignment unit 4 and the position acquisition unit 5.
1, 10, 20, 30 Image processing device
2 Sample image input unit
3 Reference image input unit
4 Alignment unit
5 Position acquisition unit
41 First alignment unit (alignment unit)
51 Second alignment unit (position acquisition unit)
6 Image acquisition unit
7 Evaluation unit
60 Optical microscope
64 Camera (imaging unit)
80 Main controller (control unit)
100, 200 Microscope system
A First image
B Second image
C Reference image
R1 First region
R2 Second region
S Sample

Claims (9)

  1.  An image processing device comprising:
     a sample image input unit to which a first image and a second image of a sample are input, the first image and the second image being microscope images captured through an objective lens at a first magnification and an objective lens at a second magnification, respectively, the first magnification being lower than the second magnification, and an optical axis of the objective lens at the first magnification and an optical axis of the objective lens at the second magnification being identical or substantially identical to each other;
     a reference image input unit to which a reference image corresponding to the first image and the second image is input, the reference image being an image of the sample over a range wider than a shooting range of the first image and a shooting range of the second image;
     an alignment unit that estimates a first region of the reference image corresponding to the shooting range of the first image by matching the first image against the reference image; and
     a position acquisition unit that acquires, based on the first region estimated by the alignment unit, a position of a second region of the reference image corresponding to the shooting range of the second image.
  2.  The image processing device according to claim 1, wherein the position acquisition unit acquires, as a center position of the second region, a position shifted by a predetermined distance from a center position of the first region, the predetermined distance corresponding to an amount of displacement of the optical axis of the objective lens between the first magnification and the second magnification.
  3.  The image processing device according to claim 1, wherein the position acquisition unit estimates the second region by matching the second image against the first region and acquires a position of the estimated second region.
  4.  The image processing device according to claim 3, wherein, in matching the second image against the first region, the position acquisition unit searches for the second region corresponding to the second image within a predetermined distance from the center position of the first region, the predetermined distance corresponding to a guaranteed range of the amount of displacement of the optical axis between the objective lens at the first magnification and the objective lens at the second magnification.
  5.  The image processing device according to any one of claims 1 to 4, wherein the sample is a semiconductor circuit board, and the reference image is an image of a design drawing of the semiconductor circuit board.
  6.  A microscope system comprising:
     an optical microscope including an objective lens arranged to face a sample, a magnification changing unit that changes a magnification of the objective lens between a first magnification and a second magnification, and an imaging unit that images the sample through the objective lens, the first magnification being lower than the second magnification; and
     the image processing device according to any one of claims 1 to 5.
  7.  The microscope system according to claim 6, further comprising an image acquisition unit that acquires, from the imaging unit, images captured by the imaging unit,
     wherein the image acquisition unit selects the first image and the second image from the images acquired from the imaging unit, in conjunction with an operation of the magnification changing unit changing the magnification of the objective lens, and
     inputs the selected first image and second image to the sample image input unit.
  8.  The microscope system according to claim 7, further comprising a control unit that controls the optical microscope and the image acquisition unit,
     wherein the image processing device further comprises an evaluation unit that evaluates whether the position of the second region has been accurately acquired by the position acquisition unit,
     when the evaluation unit evaluates that the position of the second region has not been accurately acquired, the control unit controls the optical microscope and the image acquisition unit to capture the first image again and input the recaptured first image to the sample image input unit, and
     the alignment unit estimates the first region using the recaptured first image.
  9.  An image processing method for processing a first image and a second image of a sample, the first image and the second image being microscope images captured through an objective lens at a first magnification and an objective lens at a second magnification, respectively, the first magnification being lower than the second magnification, and an optical axis of the objective lens at the first magnification and an optical axis of the objective lens at the second magnification being identical or substantially identical to each other, the method comprising:
     receiving the first image and the second image as input;
     receiving a reference image corresponding to the first image and the second image, the reference image being an image of the sample over a range wider than a shooting range of the first image and a shooting range of the second image;
     estimating a first region of the reference image corresponding to the shooting range of the first image by matching the first image against the reference image; and
     acquiring, based on the estimated first region, a position of a second region of the reference image corresponding to the shooting range of the second image.
PCT/JP2020/000318 2020-01-08 2020-01-08 Image processing device, microscope system, and image processing method WO2021140591A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/000318 WO2021140591A1 (en) 2020-01-08 2020-01-08 Image processing device, microscope system, and image processing method


Publications (1)

Publication Number Publication Date
WO2021140591A1 true WO2021140591A1 (en) 2021-07-15

Family

ID=76788161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/000318 WO2021140591A1 (en) 2020-01-08 2020-01-08 Image processing device, microscope system, and image processing method

Country Status (1)

Country Link
WO (1) WO2021140591A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1021389A (en) * 1996-07-03 1998-01-23 Mitsubishi Electric Corp Template matching method and device for the method
JP2003109003A (en) * 2001-09-28 2003-04-11 Keyence Corp Pattern matching method using pyramidal structure search, image detection circuit, image processing program and computer readable storage medium
US6603882B2 (en) * 2001-04-12 2003-08-05 Seho Oh Automatic template generation and searching method
JP2007115991A (en) * 2005-10-21 2007-05-10 Hitachi High-Technologies Corp Semiconductor test equipment and semiconductor inspection method
JP2008232933A (en) * 2007-03-22 2008-10-02 Hitachi High-Technologies Corp Image processing system and scanning electron microscope system
JP2009258187A (en) * 2008-04-11 2009-11-05 Keyence Corp Optical microscope apparatus and data processor for optical microscope
JP2010130408A (en) * 2008-11-28 2010-06-10 Keyence Corp Imaging apparatus
WO2018096639A1 (en) * 2016-11-24 2018-05-31 株式会社ニコン Image processing device, microscope system, image processing method, and program


Similar Documents

Publication Publication Date Title
JP5096301B2 (en) Imaging device
CA2507174C (en) Method of registering and aligning multiple images
JP5096302B2 (en) Imaging device
JP2010141699A (en) Imaging apparatus
TWI474363B (en) Pattern evaluation device and pattern evaluation method
US9341465B2 (en) Dimension measuring apparatus, dimension measuring method, and program for dimension measuring apparatus
EP3379333B1 (en) Detection apparatus, pattern forming apparatus, obtaining method, detection method, and article manufacturing method
JP6980631B2 (en) Inspection method and inspection equipment
CN113594076A (en) Method for aligning patterned wafer and semiconductor device
WO2021140591A1 (en) Image processing device, microscope system, and image processing method
JP5209099B2 (en) Imaging device
KR100597026B1 (en) A method of detecting a pattern and an apparatus thereof
CN115628685B (en) Method and equipment for measuring critical dimension and method for classifying and positioning critical dimension
US10018826B2 (en) Microscope system
US7387859B2 (en) Method for measuring overlay shift
JP5209100B2 (en) Imaging device
JP2018146492A (en) Defect inspection system and inspection device and review device used for the same
JP5209138B2 (en) Imaging device
US11887327B2 (en) Microscopic examination device and navigation method
JP5610579B2 (en) 3D dimension measuring device
KR100543468B1 (en) Systems and methods for measuring distance of semiconductor patterns
KR100287319B1 (en) Rotation direction detection method, measurement position determination method and apparatus therefor
JP2000251824A (en) Electron beam apparatus and stage movement positioning method thereof
JP5209097B2 (en) Imaging device
JP2004184411A (en) Position recognition method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20912223

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20912223

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP