WO2005026802A1 - Autofocus control method, autofocus control apparatus and image processing apparatus - Google Patents

Autofocus control method, autofocus control apparatus and image processing apparatus Download PDF

Info

Publication number
WO2005026802A1
WO2005026802A1 PCT/JP2004/012609 JP2004012609W
Authority
WO
WIPO (PCT)
Prior art keywords
focus
evaluation value
image
image data
calculating
Prior art date
Application number
PCT/JP2004/012609
Other languages
English (en)
French (fr)
Japanese (ja)
Other versions
WO2005026802B1 (ja)
Inventor
Hiroki Ebe
Masaya Yamauchi
Kiyoyuki Kikuchi
Kiyotaka Kuroda
Junichi Takahashi
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation filed Critical Sony Corporation
Priority to US10/569,480 priority Critical patent/US20070187571A1/en
Publication of WO2005026802A1 publication Critical patent/WO2005026802A1/ja
Publication of WO2005026802B1 publication Critical patent/WO2005026802B1/ja

Links

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Definitions

  • the present invention is suitably used, for example, in an apparatus for photographing, observing, and inspecting an object sample with a video camera.
  • the present invention relates to an autofocus control method, an autofocus control device, and an image processing device. Background art
  • image autofocus control has been performed by using a focus evaluation value that has been quantified by evaluating the degree of focus from image data of a subject sample (work).
  • the image data of the sample is collected while changing the distance between the lens and the work, and the focus evaluation value is calculated for each of them to search for a suitable focus position.
  • Fig. 21 shows an example of the relationship between the lens-work distance (horizontal axis) and the focus evaluation value (vertical axis).
  • images are captured by changing the distance between the lens and the workpiece at regular intervals, and the focus evaluation value of each image is calculated and plotted.
  • the position giving the maximum focus evaluation value in the graph is the optimum focus position (in-focus position).
  • the plot of the focus evaluation value with respect to the lens-work distance is referred to as a “focus curve”.
  • the distance between the lens and the workpiece is changed within a predetermined search range.
  • conventionally, the position of the maximum focus evaluation value in this graph was used as the optimum focus position, or the optimum focus position was calculated from the focus evaluation values before and after the maximum.
  • as the focus evaluation value, the maximum value of the brightness, the differential value of the brightness, the variance of the brightness, the variance of the differential value of the brightness, and the like are used.
  • As an algorithm for finding an optimum focus position from the maximum focus evaluation value, there is the hill-climbing method, and a method of dividing a search operation into several stages has been put to practical use in order to reduce search time (Japanese Patent Laid-Open Publication No. 6-2171800, Japanese Unexamined Patent Application Publication No. 2000-333351, and Japanese Patent No. 2971892).
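  • in code form, the basic search underlying these methods can be sketched as follows (a minimal illustration only, not any patentee's implementation; capture_image and focus_evaluation are hypothetical stand-ins for the camera read-out and one of the evaluation measures listed above):

```python
import numpy as np

def scan_focus(positions, capture_image, focus_evaluation):
    """Sweep the lens-work distance over the search range, evaluate each
    image, and return the position giving the maximum focus evaluation value."""
    values = [focus_evaluation(capture_image(z)) for z in positions]
    return positions[int(np.argmax(values))], values
```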
  • the size of the target work has been miniaturized, and an improvement in resolution has been demanded for inspection instruments to which this focusing technique is applied.
  • This improvement in resolution can be accommodated by shortening the wavelength of the illumination light source and using a single wavelength.
  • the optical resolution is increased by shortening the wavelength, and the effects of chromatic aberration and the like are avoided by using a single wavelength. On the other hand, a single-wavelength light source gives rise to speckle on the captured image.
  • speckle refers to a state in which the brightness of the screen is distributed in a patchy manner, and a unique pattern of light and shade is obtained depending on the wavelength of the light source and the configuration of the optical system.
  • as shown in FIG. 22, the focus curve may take a larger value in a portion affected by the optical system than the focus evaluation value at the optimum focus position. Since the shape and numerical range of the focus curve are not uniquely determined, varying with surface conditions such as the reflectivity of the object, the conventional technique that determines the focus position from the maximum focus evaluation value cannot stably find the optimal focus position in this state.
  • the present invention has been made in view of the above-described problems, and has as its object to provide an autofocus control method, an autofocus control device, and an image processing device capable of realizing a stable autofocus operation by eliminating the influence caused by the optical system. Disclosure of the invention
  • the autofocus control method of the present invention includes an image acquisition step of acquiring an image of a subject at a plurality of focus positions having different distances between a lens and a subject.
  • an evaluation value calculation step of calculating a focus evaluation value for each of the plurality of focus positions based on each of the acquired image data; a focus position calculation step of calculating, as the focus position, the position at which the focus evaluation value becomes maximum; and a movement step of moving the lens relative to the subject to the calculated focus position. A smoothing process is performed on the image data obtained in the image acquisition step, and
  • the focus evaluation value is calculated based on the smoothed image data.
  • an image smoothing process is added in the present invention. This smoothing process enables the focus evaluation value to be calculated appropriately by capturing the characteristics of the target sample (subject) while reducing the density distribution pattern of speckles.
  • the processing conditions, such as the number of pixels to be processed (unit processing range), the filtering coefficients, the number of times of processing, and the presence or absence of weighting, can be set appropriately according to the type of optical system applied, the size of the sample, its surface properties, and the like.
  • for calculating the focus evaluation value, it is preferable to use edge enhancement processing that extracts differences in luminance data between adjacent pixels in the acquired image data.
  • the calculated evaluation value is divided by the average luminance of the entire screen to normalize the focus evaluation value by the average luminance of the screen.
  • the autofocus control device of the present invention includes evaluation value calculation means for calculating a focus evaluation value for each of a plurality of focus positions based on each image data acquired at the plurality of focus positions having different lens-subject distances,
  • focus position calculation means for calculating a focus position based on the calculated maximum focus evaluation value, and image smoothing means for smoothing the acquired image data. The focus evaluation value of each image data is calculated based on the image data smoothed by the image smoothing means.
  • the autofocus control device of the present invention is configured as one image processing device in combination with image acquisition means for acquiring image data of a subject at a plurality of focus positions, drive means for adjusting the distance between a lens and a subject, and the like.
  • the image acquisition unit and the driving unit may be configured as separate, independent members.
  • FIG. 1 is a schematic configuration diagram of an image processing apparatus 1 according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating the configuration of the controller 7.
  • FIG. 3 is a flowchart illustrating the operation of the image processing apparatus 1.
  • FIG. 4 is a flowchart illustrating another operation example of the image processing apparatus 1.
  • FIG. 5 is an example of a focus curve for explaining an operation of the present invention.
  • FC1 is an example in which both the image smoothing process and the luminance normalization process of the focus evaluation value are performed,
  • FC2 is an example in which only the image smoothing process is performed, and
  • FC3 shows a conventional example.
  • FIG. 6 is a diagram for explaining a method of calculating a focus position by approximating a curve near the maximum focus evaluation value.
  • FIG. 7 is a diagram showing the relationship between the command voltage for the lens driving unit 4 and the actual movement voltage of the lens.
  • FIG. 8 is a diagram for explaining a method of performing parallel processing of capturing a sample image and calculating a focus evaluation value.
  • FIG. 9 is a diagram showing the second embodiment of the present invention, and is a diagram for explaining a method of dividing a screen into a plurality of parts and detecting a focus position in each divided region.
  • FIG. 10 is a process flow chart according to the third embodiment of the present invention.
  • FIG. 11 is a memory configuration diagram applied to the third embodiment of the present invention.
  • FIG. 12 is a flowchart illustrating an all-focus image acquiring step.
  • FIG. 13 is a diagram showing the fourth embodiment of the present invention, and is a diagram for explaining a method of acquiring a stereoscopic image by combining the focus positions of the sample images in the focus axis direction.
  • FIG. 14 is a flowchart illustrating a method of synthesizing the above-mentioned stereoscopic image.
  • FIG. 15 is a functional block diagram showing a first configuration example of an autofocus control device according to a fifth embodiment of the present invention.
  • FIG. 16 is a functional block diagram showing a second configuration example of the autofocus control device according to the fifth embodiment of the present invention.
  • FIG. 17 is a functional block diagram showing a third configuration example of the autofocus control device according to the fifth embodiment of the present invention.
  • FIG. 18 is a functional block diagram showing a fourth configuration example of the autofocus control device according to the fifth embodiment of the present invention.
  • FIG. 19 is a diagram showing a fifth configuration example of the autofocus control device according to the fifth embodiment of the present invention.
  • FIGS. 20A to 20B are block diagrams showing modified examples of the configuration of the drive system of the image processing apparatus 1.
  • FIG. 21 is an example of a focus curve showing a relationship between a lens-work distance (focus position) and a focus evaluation value.
  • FIG. 22 is a diagram for explaining the problems of the prior art. BEST MODE FOR CARRYING OUT THE INVENTION
  • FIG. 1 is a schematic configuration diagram of an image processing apparatus to which an autofocus control method and an autofocus control device according to an embodiment of the present invention are applied.
  • the image processing apparatus 1 is used for observing the surface of an object sample (work), and is configured as a microscope used in particular for detecting defects of element structures formed by fine processing on the surface of a semiconductor device or the like.
  • the image processing device 1 includes a measurement stage 2, an objective lens 3, a lens driving unit 4, a lens barrel 5, a CCD (Charge Coupled Device) camera 6, a controller 7, a driver 8, a monitor 9, and an illumination light source 10.
  • the measurement stage 2 supports a subject sample W (for example, a semiconductor wafer) and is configured to move in the X-Y directions (the left-right direction in the figure and the direction perpendicular to the paper).
  • the lens driving unit 4 moves the objective lens 3 relative to the subject sample W on the measurement stage 2 in the focus axis direction (the vertical direction in the figure) over a predetermined focus position search range, and variably adjusts the lens-workpiece distance.
  • the lens driving section 4 corresponds to the “driving means” of the present invention.
  • the lens driving section 4 is constituted by a piezo element, but other than this, a precision feed mechanism such as a pulse motor can be employed.
  • although in the present embodiment the objective lens 3 is moved in the focus axis direction to adjust the distance between the lens and the work, the measurement stage 2 may be moved in the focus axis direction instead.
  • the CCD camera 6 functions as a video camera that takes an image of a specific area on the surface of the subject sample W on the measurement stage 2 via the objective lens 3 moving within the focus position search range, and outputs the acquired image data to the controller 7.
  • another solid-state imaging device such as a CMOS imager may be applied.
  • the controller 7 is composed of a computer, controls the operation of the entire image processing apparatus 1, and includes an autofocus control unit 11 that detects the optimum focus position (in-focus position) in a specific area on the surface of the sample W.
  • the autofocus controller 11 corresponds to the “autofocus controller” of the present invention.
  • the driver 8 receives a control signal from the autofocus control unit 11 and generates a drive signal for driving the lens drive unit 4.
  • the driver 8 is constituted by a piezo driver having a hysteresis compensation function. Note that this driver 8 may be incorporated in the autofocus control unit 11.
  • the autofocus control unit 11 drives the lens driving unit 4 via the driver 8 to change the distance (lens-work distance) between the objective lens 3 and the subject sample W at a fixed interval.
  • the image data of the subject sample W is acquired by the CCD camera 6, and various processes described later are performed to detect an optimal focus position, that is, a focus position, in the imaging region of the subject sample W.
  • the monitor 9 displays the contents of processing by the controller 7 and also displays an image of the subject sample W captured by the CCD camera 6, and the like.
  • a continuous laser or a pulse laser light source having a wavelength of 196 nm is used as the illumination light source 10.
  • the wavelength range of the illumination light source is not limited to the above-described ultraviolet light range, and it is of course possible to use another ultraviolet light having a different wavelength range depending on the application or a light source in the visible light range.
  • FIG. 2 is a block diagram of the configuration of the controller 7.
  • the analog image signal output from the CCD camera 6 is converted into a digital image signal by the A / D converter 13.
  • the output signal of the A / D converter 13 is supplied to the memory 14 and stored.
  • the autofocus controller 11 of the controller 7 reads the converted digital image signal from the memory 14 and performs the autofocus control described later. The driver 8 then generates a drive signal for the lens drive unit 4 based on a control signal from the controller 7 supplied via the D/A converter 17.
  • the autofocus control unit 11 includes a smoothing processing circuit 11A, an average luminance calculation circuit 11B, an evaluation value calculation circuit 11C, and a focus position calculation circuit 11D.
  • the smoothing processing circuit 11A is a circuit that smooths the autofocus target area (the entire screen or a partial area within the screen) of each image signal (sample image) of the object sample W acquired at the plurality of focus positions, and corresponds to the “image smoothing means” of the present invention.
  • the autofocus control unit 11 reduces the uneven brightness distribution (speckle) of each acquired sample image by means of the smoothing processing circuit 11A.
  • An example of the smoothing process is shown in [Equation 1].
  • the processing conditions for image smoothing can be set arbitrarily as long as the original features and contours of the surface of the sample W captured by the CCD camera 6 are not crushed; these processing conditions are set via an input device 16 such as a keyboard, a mouse, or a touch panel.
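  • [Equation 1] itself is not reproduced in this text; as a minimal sketch, assuming the smoothing is a simple moving-average (box) filter whose kernel size and repetition count are the adjustable processing conditions mentioned above:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def smooth_image(img: np.ndarray, size: int = 3, passes: int = 1) -> np.ndarray:
    """Reduce speckle by repeated local averaging.

    size (unit processing range) and passes (number of times of processing)
    correspond to the adjustable conditions described above; the box filter
    itself is an assumption about the form of [Equation 1].
    """
    out = img.astype(np.float64)
    for _ in range(passes):
        out = uniform_filter(out, size=size)
    return out
```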
  • the average luminance calculation circuit 11B is a circuit that calculates the screen average luminance of the autofocus target area of each sample image, and corresponds to “average luminance calculation means” of the present invention.
  • the screen average luminance at each focus position obtained by the average luminance calculation circuit 11B is used, as described later, in the evaluation value calculation circuit 11C to calculate the focus evaluation value Pv at that focus position.
  • the evaluation value calculation circuit 11C is a circuit that calculates the focus evaluation value Pv of each sample image, and corresponds to the “evaluation value calculation means” of the present invention.
  • the evaluation value calculation circuit 11C is configured to include an edge enhancement processing circuit.
  • the focus evaluation value is an index that numerically evaluates a state in which a characteristic portion and a contour portion of an image are clearly visible. Looking at the change in luminance between pixels in the feature-contour portion, a sharp change occurs in a clear image, and a gradual change occurs in a blurred image. Therefore, in the present embodiment, the focus evaluation value Pv is calculated by evaluating the luminance data difference between adjacent pixels using edge enhancement processing. In addition, the focus evaluation value may be calculated based on a differential value of brightness, variance of brightness, or the like.
  • the pixel area to be processed is 3 ⁇ 3, but it may be 5 ⁇ 5 or 7 ⁇ 7.
  • although the coefficients are weighted, the setting of the coefficients is arbitrary, and the processing may be performed without weighting.
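  • [Equation 2] is likewise not reproduced here; a minimal sketch, assuming the evaluation value is the sum of absolute responses of a weighted 3×3 difference kernel (the coefficients below are an assumption, not the patent's actual weights):

```python
import numpy as np
from scipy.ndimage import convolve

# Assumed 3x3 edge-enhancement kernel (8-neighbor Laplacian style); the
# actual coefficients of [Equation 2] may differ and are adjustable.
KERNEL = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=np.float64)

def edge_evaluation(img: np.ndarray) -> float:
    """Sum of absolute edge responses over the autofocus target area,
    used as the raw (pre-normalization) focus evaluation value Pvo."""
    response = convolve(img.astype(np.float64), KERNEL)
    return float(np.abs(response).sum())
```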
  • the division processing is executed using the screen average luminance at the corresponding focus position calculated by the average luminance calculation circuit 11B. That is, as shown in [Equation 3], the focus evaluation value Pv of each sample image is the value obtained by dividing the focus evaluation value Pvo obtained by the edge enhancement processing circuit by the screen average luminance Pave at the corresponding focus position:
  • [Equation 3] Pv(i) = Pvo(i) / Pave(i)
  • where Pv(i) is the luminance-normalized focus evaluation value at the i-th focus position, Pvo(i) is the focus evaluation value obtained by the edge enhancement processing at the i-th focus position, and Pave(i) is the screen average luminance at the i-th focus position.
  • as shown in [Equation 4], the focus evaluation value Pv may be calculated by multiplying the value obtained in [Equation 3] by the maximum value Pavemax of the screen average luminance. This compensates for the loss (decrease in magnitude) of the focus evaluation value caused by division by the average luminance, and makes it easier to see the quantitative change of the focus evaluation value when referring to the focus curve later.
  • the screen average luminance to be multiplied is not limited to the maximum value, and may be, for example, a minimum value.
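  • taken together, a minimal sketch of the luminance normalization (the names Pvo, Pave, and Pavemax follow the text above; combining [Equation 3] with the optional rescaling is an assumption about the form of [Equation 4]):

```python
def normalized_focus_value(pvo: float, pave: float, pave_max: float) -> float:
    """Normalize an edge-based evaluation value by the screen average
    luminance ([Equation 3]) and rescale by the maximum average luminance so
    that the magnitude of the focus curve is preserved ([Equation 4], assumed)."""
    return (pvo / pave) * pave_max
```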
  • the reason the focus evaluation value (Pv) is the value obtained by dividing the evaluation value calculated by the edge enhancement processing by the screen average luminance is as follows:
  • the focus evaluation value is calculated from the evaluation point (pixel) and its surroundings, and the brightness varies between the acquired images; the screen average luminance (the sum of the brightness of the individual pixels composing the screen divided by the total number of pixels) is used to prevent the absolute value of the calculated index from changing when the luminance itself changes.
  • the focus evaluation value calculated by the edge enhancement processing is normalized by the screen average luminance (Pave), thereby eliminating variation of the focus evaluation value due to changes in screen luminance.
  • for example, when the screen average luminance is 50 and the luminance difference is 20%, the focus evaluation value is 0.2 (10/50);
  • when the screen average luminance is 100, the focus evaluation value is likewise 0.2 (20/100), matching the former. The effect of screen luminance on the evaluation value is thus eliminated.
  • the focus position calculation circuit 11D is a circuit that calculates the focus position based on the maximum value of the focus evaluation values calculated by the evaluation value calculation circuit 11C, and corresponds to the “focus position calculation means” of the present invention.
  • image autofocus control involves acquiring sample images at multiple focus positions with different distances between the lens and the workpiece, and determining the focus position by detecting the position of the sample image giving the maximum focus evaluation value. Therefore, the greater the number of sample images (the smaller the amount of focus movement between samples), the more accurate the autofocus control that can be realized.
  • however, as the number of samples increases, the time required for processing also increases, and high-speed autofocus control cannot be ensured. Therefore, in the present embodiment, the focus position is determined as shown in FIG. 6.
  • the vicinity of the focus position is close to a quadratic curve convex upward. Therefore, the approximate quadratic curve is calculated by the least-squares method using the points near the focus position, the vertices are obtained, and this is set as the focus position.
  • in FIG. 6, the solid line is an approximation using the 3 points Pv(m), Pv(m-1), and Pv(m+1), and the broken line is an approximation using the 5 points Pv(m), Pv(m-1), Pv(m+1), Pv(m-2), and Pv(m+2).
  • alternatively, the intersection of a straight line passing through the two points Pv(m) and Pv(m+1) and a straight line passing through the two points Pv(m-1) and Pv(m-2) may be calculated and used as the focus position (linear approximation method), or another approximation method such as normal distribution curve approximation may be used to detect the focus position.
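  • a minimal sketch of the quadratic approximation described above (a least-squares parabola through the points around the maximum; width=1 uses 3 points and width=2 uses 5 points, as in FIG. 6, assuming the maximum lies inside the search range):

```python
import numpy as np

def focus_from_parabola(z: np.ndarray, pv: np.ndarray, width: int = 1) -> float:
    """Fit an upward-convex parabola through the points around the maximum
    focus evaluation value and return the z coordinate of its vertex."""
    m = int(np.argmax(pv))
    lo, hi = max(m - width, 0), min(m + width + 1, len(pv))
    a, b, _c = np.polyfit(z[lo:hi], pv[lo:hi], 2)  # least-squares fit
    return -b / (2.0 * a)  # vertex of a*z**2 + b*z + c
```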
  • memory 15 is used for various calculations of the CPU of controller 7.
  • a first memory unit 15A and a second memory unit 15B that are used for various operations in the autofocus control unit 11 are allocated to the memory space of the memory 15.
  • sample images are respectively obtained at multiple focus positions while continuously changing the distance between the lens and the work. As a result, the autofocus control can be made faster than when the lens is stopped at each focus position to acquire an image.
  • FIG. 7 shows the relationship between the command voltage of the driver 8 for the lens driving unit 4 and the actual moving voltage of the lens driving unit 4.
  • the lens driving section 4, composed of a piezo element, has a movement amount detection sensor for position control.
  • the actual moving voltage in FIG. 7 is this sensor monitor signal.
  • the command voltage is changed by a predetermined amount for each video signal frame of the CCD camera 6 after the lens is moved to the autofocus control start position. Comparing the command voltage and the actual moving voltage, although the response is delayed, the movement is smooth: the steps of the command voltage are smoothed out while the slopes of both graphs in the gradually increasing region are almost the same. From this, it can be seen that the lens operates at a constant speed in response to a command voltage corresponding to constant speed. Therefore, if a sample image is acquired in synchronization with the image synchronization signal, the focus evaluation value can be calculated and acquired at fixed intervals of the focus axis coordinate.
  • the sample image obtaining step and the focus evaluation value calculating step are performed in parallel.
  • as shown in FIG. 8, the focus evaluation value Pv is calculated by processing the image data already captured into the second memory unit 15B while new image data is being loaded into the first memory unit 15A; that is, the processing is configured with double buffering.
  • the first memory unit 15A processes image data captured in even-numbered frames
  • the second memory unit 15B processes image data captured in odd-numbered frames.
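  • a minimal sequential sketch of this double buffering (on the actual device the capture and the calculation run concurrently; capture_frame and compute_focus_value are hypothetical stand-ins, and at least one frame is assumed):

```python
def autofocus_loop(capture_frame, compute_focus_value, n_frames: int):
    """Capture frame i into one buffer while evaluating frame i-1 from the
    other, alternating buffers by frame parity (even: 15A, odd: 15B)."""
    buffers = [None, None]
    values = []
    for i in range(n_frames):
        buffers[i % 2] = capture_frame(i)       # load the current frame
        if i > 0:
            values.append(compute_focus_value(buffers[(i - 1) % 2]))
    values.append(compute_focus_value(buffers[(n_frames - 1) % 2]))
    return values
```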
  • FIG. 3 is a process flow chart in the autofocus controller 11.
  • in step S1, initial settings such as the autofocus processing area of the subject sample W, the focus position search range, the amount of focus movement between acquired image samples (focus axis step length), the image smoothing processing conditions, and the edge enhancement processing conditions are input.
  • after step S1, the autofocus control is executed.
  • the objective lens 3 starts moving along the focus axis direction (in the present embodiment, the direction approaching the subject sample W) from the autofocus control start position by driving the lens driving unit 4, and a sample image of the subject sample W is acquired in synchronization with the image synchronization signal (steps S2 and S3). Next, the focus axis coordinates (coordinates of the distance between the lens and the work) of the obtained sample image are obtained (step S4).
  • focus evaluation processing including screen average luminance calculation processing, image smoothing processing, edge enhancement processing, and luminance normalization processing is performed on the acquired sample image (steps S5 to S8).
  • the screen average luminance calculation step (step S5) is executed by the average luminance calculation circuit 11B.
  • the calculated average screen brightness is later used for calculating the focus evaluation value.
  • This screen average luminance calculation step may be performed after the smoothing processing step (step S6).
  • the image smoothing processing step (step S6) is processed by the smoothing processing circuit 11A.
  • the image smoothing process is performed using, for example, the arithmetic expression represented by [Equation 1]. As a result, the influence of the speckle caused by the single-wavelength light source is reduced in the acquired sample image.
  • the edge enhancement processing step (step S7) is executed by the evaluation value calculation circuit 11C.
  • the difference between the luminance data of the pixels in the feature and contour portions is calculated by the edge enhancement processing equation shown in the above [Equation 2], and this is used as the basic data for the focus evaluation value.
  • in step S8, a luminance normalization process for normalizing the focus evaluation value calculated in step S7 by the screen average luminance is performed.
  • This step is executed by the evaluation value calculation circuit 11C.
  • the focus evaluation value Pvo(i) obtained in the preceding edge enhancement processing step (step S7) is divided by the screen average luminance Pave(i) obtained in the screen average luminance calculation step (step S5), and
  • the luminance-normalized focus evaluation value Pv(i) of [Equation 3] is thereby calculated.
  • steps S2 to S8 constitute the autofocus loop (AF loop).
  • the sample image acquisition in step S3 and the focus evaluation value calculation process are processed in parallel (FIGS. 7 and 8). Therefore, while the focus evaluation value of the previously captured sample image is being calculated, the next sample image can be acquired. As a result, the focus evaluation value can be calculated within one frame cycle of the video signal, and a faster autofocus operation is realized.
  • when the focus position search range has been covered, the AF loop ends, and a process of multiplying the focus evaluation value of each obtained sample image by the maximum value of the screen average luminance (Pavemax) is executed (steps S9 and S10). As a result, the focus evaluation value Pv of each sample image becomes the same as that obtained by the arithmetic expression shown in the above [Equation 4].
  • alternatively, the AF loop may be completed with the focus evaluation value calculated by the edge enhancement processing, and the luminance normalization performed collectively afterwards, as shown by step S10A in FIG. 4; processing equivalent to the example of FIG. 3 can thereby be realized.
  • in FIG. 5, the focus curve (FC1) obtained by performing the smoothing process (step S6 in FIG. 3) and the luminance normalization process (step S8 in FIG. 3) is represented by a solid line, and
  • the focus curve (FC2) obtained by performing only the smoothing process, without normalizing by the screen average luminance, is indicated by a dashed line.
  • the conventional focus curve (FC 3) shown in FIG. 22 is indicated by a dotted line.
  • as FIG. 5 shows, the influence of the optical system is greatly reduced, and the peak of the focus evaluation value to be detected as the optimal focus position (in-focus position) becomes obvious. As a result, a stable and accurate autofocus operation can be realized even in a short-wavelength, single-wavelength optical system.
  • the luminance normalization processing step (step S8 in FIG. 3) may be omitted as necessary; performing it, however, further reduces the influence of the optical system, so that the focus position can be detected more accurately.
  • next, a focus position calculation process is performed (step S11).
  • This focus position calculation processing is executed by the focus position calculation circuit 11D.
  • to calculate the focus position, as described with reference to FIG. 6, an approximate curve passing through the maximum focus evaluation value and a plurality of neighboring focus evaluation values is obtained, and its vertex is detected as the focus position.
  • the focus position can thus be detected more efficiently and with higher precision than with the hill-climbing method widely used in the past, so that the autofocus operation can be made significantly faster.
  • the autofocus control according to the present embodiment is completed through a movement step of moving the objective lens 3 to the focus position (step S12).
  • in the first embodiment described above, the focus evaluation value is calculated for the entire acquired sample screen (or a partial target area). In the second embodiment, as shown in FIG. 9, the screen is divided into a plurality of regions, and a focus position is detected in each divided region.
  • in each divided region, the same image smoothing processing and normalization processing based on the screen average luminance as in the first embodiment are executed. Thereby,
  • the focus position can be detected with high accuracy without being affected by the optical system.
  • the divided screens may overlap each other, and the number of screen divisions may be changed dynamically according to the use situation.
  • conventionally, in order to obtain an all-focus image that is entirely in focus, a special optical system such as a confocal optical system is used, or an all-focus image is obtained from images taken at different angles based on trigonometry.
  • in the third embodiment, by contrast, an all-focus image of the subject sample W is obtained in the course of executing the autofocus control method described in the first embodiment.
  • the control flow is shown in FIG. 10. After the process of normalizing the focus evaluation value of the acquired image (sample point) by the screen average luminance (step S8), an image synthesis process (step S8M) is added.
  • the acquired sample screen is divided into a plurality of regions (FIG. 9) as described in the second embodiment, and processing is performed with each divided region Wij as an image unit.
  • the number of screen divisions is not particularly limited; the greater the number of divisions, the finer the processing, and the divided area can be reduced down to one pixel unit.
  • the shape of the divided area is not limited to a square, but can be changed to a circular shape or the like.
  • a memory 15 (FIG. 2) includes a first memory unit 15A for processing image data captured in even frames and a second memory unit 15B for processing image data captured in odd frames.
  • in addition, a third memory section 15C for all-focus processing is prepared.
  • the third memory unit 15C includes a composite image data storage area 15C1, a height (lens-work distance) information storage area 15C2 for each divided area Wij constituting the composite image, and
  • a focus evaluation value information storage area 15C3 for each divided area Wij.
  • sample images are obtained at a plurality of focus positions with different lens-work distances, and the focus evaluation value is calculated for each divided area Wij of each sample image. Then, after extracting, independently for each divided area Wij, the image having the highest focus evaluation value, a process of synthesizing the entire image is performed.
  • by these processes, the “all-focus image synthesizing means” of the present invention is configured. Referring to the process flow chart shown in FIG. 10, steps S1 to S8 are executed in the same manner as in the first embodiment, after which the flow moves to the image synthesis process of step S8M.
  • FIG. 12 shows the details of step S8M.
  • first, the third memory unit 15C is initialized using the first captured image (steps a and b). That is, in step b, the first image is copied to the composite image data storage area 15C1,
  • the height information storage area 15C2 is filled with the first height data, and the focus evaluation value is copied to the focus evaluation value information storage area 15C3 of each divided area Wij for initialization.
  • thereafter, the focus evaluation value of the acquired image and that of the composite image are compared for each divided region Wij (step c). If the focus evaluation value of the acquired image is larger, the image is copied and the corresponding height information and focus evaluation value information are updated (step d); conversely, if the focus evaluation value of the acquired image is smaller, no processing is performed. This is repeated for the number of divisions (step e), which completes the processing of one frame (33.3 msec).
  • the above-described processing is performed, for example, while fetching even-numbered frame image data into the first memory unit 15A: the previous odd-numbered frame image, already captured into the second memory unit 15B,
  • is processed for each divided area Wij, and the necessary data and information are copied or updated in the corresponding storage areas of the third memory unit 15C.
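  • a minimal sketch of this per-region compositing step (the arrays composite, heights, and best_pv mirror the storage areas 15C1, 15C2, and 15C3, with best_pv holding one value per region initialized to -inf; regions and eval_region are hypothetical stand-ins for the division into areas Wij and the evaluation chain described above):

```python
def update_composite(frame, z, composite, heights, best_pv, regions, eval_region):
    """Update the all-focus composite with one frame captured at height z."""
    for k, (rs, cs) in enumerate(regions):     # regions: (row_slice, col_slice)
        pv = eval_region(frame[rs, cs])
        if pv > best_pv[k]:                    # step c: compare evaluation values
            composite[rs, cs] = frame[rs, cs]  # step d: copy the image region,
            heights[k] = z                     #   update the height information
            best_pv[k] = pv                    #   and the evaluation information
    return composite, heights, best_pv
```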
  • the above-described processing is performed along with the autofocus control of the subject sample W described in the first embodiment, but it goes without saying that the processing can also be performed alone.
  • while the objective lens 3 moves over the entire search range, the in-focus state can be observed for each divided area, so the height distribution of the subject sample W can be easily grasped on the display during the autofocus operation.
  • since the all-focus image of the object sample is synthesized using the autofocus control method according to the present invention, high-precision autofocus control is ensured while the influence caused by the short-wavelength, single-wavelength optical system is eliminated, so that an all-in-focus image of the surface of a hierarchically developed structure such as a semiconductor wafer can be acquired with high resolution.
  • a three-dimensional image can be synthesized by extracting the in-focus part from the acquired sample images and combining it with information in the height direction. For example, as shown in FIG. 13, after performing focus position detection on each of the sample images Ra, Rb, Rc, and Rd acquired during the autofocus operation, the in-focus portions are extracted and combined in the height direction (focus axis direction), whereby a stereoscopic image of the structure R can be synthesized.
  • an example of a method of synthesizing a stereoscopic image according to the present embodiment is shown in the flowchart of FIG. 14. In the figure, steps corresponding to those in the above-described first embodiment (FIG. 3) are denoted by the same reference numerals, and detailed description thereof will be omitted.
  • in this method, a stereoscopic screen buffer clearing step (step S1A) is provided.
  • the memory area for storing the stereoscopic screen acquired in the past is initialized.
  • sample images of the subject sample are acquired at a plurality of focus positions, and for each of them a focus evaluation value is calculated by smoothing processing and edge enhancement processing, and
  • the calculated focus evaluation value is normalized by the screen average luminance (steps S2 to S8).
  • after calculating the focus evaluation value, each point in the screen is compared between the data captured so far and the newly captured data to see which is more in focus; if the newly captured data is more in focus, the data is updated (step S8A). This process is performed for each sample image.
  • by these processes, the “stereoscopic image synthesizing means” of the present invention is configured.
  • the screen is divided into a plurality of regions Wij as in the second embodiment described above, and the above-described processing is performed for each divided region.
  • the unit of processing is not limited to this, however, and the processing may be performed in units of one pixel.
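  • a minimal per-pixel sketch of this synthesis (a height map is kept alongside the best evaluation value of each pixel; pixel_focus_map is a hypothetical stand-in for a per-pixel sharpness measure such as the local edge response above, and best_pv starts filled with -inf):

```python
import numpy as np

def update_height_map(frame, z, best_pv, height_map, pixel_focus_map):
    """Keep, for every pixel, the height z at which it appeared sharpest."""
    pv = pixel_focus_map(frame)   # per-pixel sharpness of this frame
    mask = pv > best_pv           # pixels more in focus than ever before
    best_pv[mask] = pv[mask]      # update the best evaluation values
    height_map[mask] = z          # record the in-focus height
    return best_pv, height_map
```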
  • since the stereoscopic image of the subject sample is synthesized using the autofocus control method according to the present invention, high-precision autofocus control is ensured while the effects caused by the short-wavelength, single-wavelength optical system are eliminated, so that
  • a three-dimensional image of the surface of a hierarchically developed structure such as a semiconductor wafer can be acquired with high resolution.
  • this autofocus control device can be configured with a video signal decoder, an arithmetic element represented by an FPGA (Field Programmable Gate Array), a memory for storing settings, and the like; integrated circuits such as a CPU (Central Processing Unit), a PMC (Pulse Motor Controller), and an external memory are also used. These elements are mounted on a common printed circuit board and used as a single board unit, or as a package component housing it.
  • FIG. 15 shows a functional block diagram according to a first configuration example of the autofocus control device of the present invention.
  • the illustrated autofocus control device 31 is composed of a video signal decoder 41, an FPGA 42, a field memory 43, a CPU 44, a ROM/RAM 45, a PMC 46, and an I/F circuit 47.
  • the video signal used for the focus operation is an analog image signal encoded in the NTSC system. It is converted by the video signal decoder 41 into a horizontal/vertical sync signal, EVEN (even)/ODD (odd) field information, and a digital image signal of luminance information.
  • the FPGA 42 is an arithmetic element that performs the predetermined arithmetic processing in the autofocus control flow (FIG. 3) according to the present invention described in the first embodiment, and corresponds to the “image smoothing means”, “edge enhancement processing means”, and “evaluation value calculation means”.
  • the FPGA 42 extracts effective portion information in the screen from the synchronization signal and the field information digitized by the video signal decoder 41, and stores the luminance information in the field memory 43. At the same time, the data is sequentially read from the field memory 43, and arithmetic processing such as filtering (image smoothing processing), average luminance calculation, and focus evaluation value calculation is performed. Note that, depending on the degree of integration of the FPGA 42, it is possible to incorporate the functions of the field memory 43, the CPU 44, and the PMC 46 into the FPGA 42.
  • the field memory 43 is used for temporarily storing the above-mentioned field information in order to handle a video signal which is output in an interlaced manner and is composed of even and odd fields.
  • the CPU 44 changes the distance between the lens and the work by moving the stage that supports the object sample via the PMC 46 and the I/F circuit 47, and manages the operation of the entire system, for example calculating the optimal focus position (in-focus position) from the focus evaluation value of each sample image that is acquired at each focus position and calculated by the FPGA 42.
  • CPU 44 corresponds to the “focus position calculating means” of the present invention.
  • the ROM / RAM 45 is used for storing the operating software (program) of the CPU 44 and the parameters required for calculating the focus position.
  • the ROM / RAM 45 may be built in the CPU.
  • the PMC 46 is a drive control element for a pulse motor (not shown) for moving the stage, and controls the stage via an interface circuit (I/F circuit) 47. Further, the output of the sensor that detects the stage position is supplied to the PMC 46 through the I/F circuit 47.
  • a video signal of a sample image is supplied from a CCD camera (not shown).
  • This video signal is input to the FPGA 42 through the video signal decoder 41, where the input image is smoothed, the average luminance is calculated, and the focus evaluation value is calculated.
  • the FPGA 42 transfers the focus evaluation value data to the CPU 44 at the timing of the synchronization signal at the end of each field.
  • the CPU 44 obtains the coordinates of the focus stage at the end of the field and uses them as the lens-work distance. After repeating the above processing the number of times necessary for the autofocus operation of the present invention, the CPU 44 calculates the focus position. The stage is then moved to the optimal focus position, and the autofocus operation ends. As necessary, the screen division function, the all-focus image synthesizing process of the subject sample, and/or the stereoscopic image synthesizing process are performed.
  • by organically connecting the autofocus control device of the present invention configured as described above to an existing CCD camera, monitor, and focus axis moving means such as a pulse motor, a function equivalent to that of the image processing device 1 described above is obtained. The autofocus control method of the present invention can therefore be implemented with a small and simple configuration, which is very advantageous in terms of cost and installation space.
  • FIG. 16 is a functional block diagram of a second example of the configuration of the autofocus control device according to the present embodiment. Parts corresponding to those in the first configuration example (FIG. 15) are denoted by the same reference numerals, and detailed description thereof will be omitted.
  • the autofocus control device 32 in this configuration example includes a video signal decoder 41, an FPGA 42, a CPU 44, a ROM/RAM 45, a PMC 46, and an I/F circuit 47.
  • the field memory 43 is used to process the interlaced image as a frame image, similar to a TV (television), and to perform control based on the frame information.
  • considering only the autofocus operation, however, there is no need to use the frame information; processing on a field-by-field basis is sufficient, and this can even be an advantage.
  • the autofocus control device 32 in the present configuration example has a configuration in which the field memory 43 is removed from the first configuration example. With this configuration, there is no need to perform a timing process for transferring information to the field memory, so that a configuration that is physically and logically simpler than that of the above-described first configuration example can be made. Also, since focus evaluation processing can be performed in field units, there is an advantage that the sampling interval of focus evaluation values is shorter than in the first configuration example in which processing is performed in frame units.
  • FIG. 17 is a functional block diagram of a third configuration example of the autofocus control device according to the present embodiment. Parts corresponding to those in the first configuration example (FIG. 15) are denoted by the same reference numerals, and detailed description thereof will be omitted.
  • the autofocus control device 33 in this configuration example includes a video signal decoder 41, an FPGA 42, a CPU 44, a ROM/RAM 45, a PMC 46, and an I/F circuit 47.
  • the autofocus control device 33 in this configuration example incorporates the PMC 46 logic block in the FPGA 42, so that, compared with the second configuration example described above, no independent logic circuit is required for the PMC 46. With this configuration, an independent IC chip for the PMC 46 is unnecessary, and the board size and mounting cost can be reduced. (Fourth configuration example)
  • FIG. 18 is a functional block diagram of a fourth configuration example of the autofocus control device according to the present embodiment. Parts corresponding to those in the first configuration example (FIG. 15) are denoted by the same reference numerals, and detailed description thereof will be omitted.
  • the autofocus control device 34 in this configuration example is composed of a video signal decoder 41, an FPGA 42, a CPU 44, a ROM/RAM 45, an AD (Analog to Digital)/DA (Digital to Analog) circuit 48, and an I/F circuit 47.
  • the autofocus control device 34 in this configuration example shows a case in which the driving source of the focus stage is changed from a pulse motor to a piezo stage controlled by analog signals; compared with the second configuration example described above, an AD/DA circuit 48 is used instead of the PMC 46. Note that the AD/DA circuit 48 can be incorporated into, for example, the CPU 44, in which case the circuit 48 need not be an external circuit.
  • the DA circuit part converts the command voltage from the CPU 44 into an analog signal, and the AD circuit part converts the signal from a sensor (not shown) that detects the moving position of the piezo stage into a digital signal and feeds it back to the CPU 44. When feedback control is not performed, the AD circuit part can be omitted.
  • FIG. 19 shows a specific configuration example of the autofocus control device 33 of the above-described third configuration example (FIG. 17) as a fifth configuration example of the present embodiment. Note that corresponding parts in the figure are denoted by the same reference numerals, and detailed description thereof is omitted.
  • the autofocus control device 35 in this configuration example has, mounted on a common wiring board 50, a video signal decoder 41, an FPGA 42, a CPU 44, a flash memory 45A, an SRAM (Static Random Access Memory) 45B, an RS driver 47A, a power supply monitoring circuit 51, an FPGA initialization ROM 52, and multiple connectors 53A, 53B, 53C, and 53D.
  • the flash memory 45A and the SRAM 45B correspond to the ROM/RAM 45 described above: the flash memory 45A stores the operation program of the CPU 44 and the initial
  • setting information of the autofocus operation (focus movement speed, smoothing processing conditions, etc.), while the SRAM 45B is used to temporarily store the various parameters necessary for the CPU 44 to calculate the focus position.
  • the RS driver 47A is an interface circuit necessary for communication with the external devices connected via the connectors 53A to 53D.
  • a CCD camera is connected to the connector 53A, and a host controller or a CPU is connected to the connector 53B.
  • a power supply circuit is connected to the connector 53C, and a focus stage is connected to the connector 53D.
  • the focus stage includes a pulse motor as a drive source, and its controller, PMC, is incorporated in the FPGA 42.
  • in the autofocus control device 35 of the present configuration example, the various elements capable of executing the algorithm realizing the autofocus control method of the present invention are mounted on one wiring board 50.
  • it can be configured as a board-mounted body with outer dimensions of, for example, 100 mm square, which reduces equipment costs and simplifies the equipment configuration.
  • since the degree of freedom of equipment installation is increased, it is possible to respond easily to on-site needs requiring autofocus operation in industrial fields where such devices could not be used until now.
  • alternatively, the measurement stage 2 may be moved.
  • the driving system for changing the distance between the lens and the sample is constituted by the lens driving unit 4 composed of a piezo element and its driver 8, but the present invention is not limited to this.
  • Other drive systems may be applied as long as the distance can be changed accurately and smoothly.
  • FIG. 20A shows an example in which a pulse motor 20 is used as a drive source.
  • the driver 21 generates a drive signal for the pulse motor 20 based on a control signal supplied from the pulse motor controller 22.
  • the lens driving unit 4 and the pulse motor 20 are driven by so-called feedforward control.
  • a configuration in which a sensor for detecting the lens position or the stage position is provided and the driving source is controlled by feedback control is also applicable.
  • FIG. 20B shows an example of the configuration of a drive system that controls the drive source by feedback control.
  • the driver 24 generates a drive signal for the drive system 23 based on a control signal supplied from the output instruction circuit 25.
  • a cylinder device, a motor, or the like can be applied as the drive system 23.
  • the position sensor 26 can be constituted by a strain gauge, a potentiometer, etc., and the output thereof is supplied to the acquisition circuit 27.
  • the acquisition circuit 27 supplies a position compensation signal to the output instruction circuit 25 based on the output of the position sensor 26, and performs position correction of the drive system 23.
  • the video signal supplied from the CCD camera has been described in the NTSC format.
  • the present invention is not limited to this.
  • the video signal can be processed in a PAL (Phase Alternation by Line) format.
  • the function of the video signal decoder circuit can be incorporated into the FPGA 42.
  • the focus evaluation value and the focus position of each sample image obtained by executing the auto focus control of the present invention can be displayed on the monitor 9 (FIG. 1) together with the sample image.
  • an encoder circuit for converting such information into NTSC or the like for display may be provided separately.
  • This encoder circuit may be, for example, one of the board mounted components of the autofocus control device having the configuration described in the fifth embodiment.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Microscopes, Condenser (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
PCT/JP2004/012609 2003-08-26 2004-08-25 Autofocus control method, autofocus control apparatus and image processing apparatus WO2005026802A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/569,480 US20070187571A1 (en) 2003-08-26 2004-08-25 Autofocus control method, autofocus control apparatus and image processing apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2003-301918 2003-08-26
JP2003301918 2003-08-26
JP2004-212119 2004-07-20
JP2004212119A JP4158750B2 (ja) 2003-08-26 2004-07-20 Autofocus control method, autofocus control apparatus and image processing apparatus

Publications (2)

Publication Number Publication Date
WO2005026802A1 true WO2005026802A1 (ja) 2005-03-24
WO2005026802B1 WO2005026802B1 (ja) 2005-05-26

Family

ID=34315613

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/012609 WO2005026802A1 (ja) 2003-08-26 2004-08-25 Autofocus control method, autofocus control apparatus and image processing apparatus

Country Status (4)

Country Link
JP (1) JP4158750B2 (zh)
KR (1) KR20060123708A (zh)
TW (1) TWI245556B (zh)
WO (1) WO2005026802A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100426843C (zh) * 2005-04-15 2008-10-15 Sony Corporation Control apparatus and control method thereof
CN100426842C (zh) * 2005-04-15 2008-10-15 Sony Corporation Control device and control method thereof, and camera

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007108455A (ja) * 2005-10-14 2007-04-26 Fujifilm Corp Automatic focusing control device and automatic focusing control method
JP2008234619A (ja) * 2007-02-20 2008-10-02 Toshiba Corp Face authentication device and face authentication method
KR101034282B1 (ko) 2009-07-31 2011-05-16 Korea Institute of Industrial Technology Method of adjusting focus in an image acquired from a multifocal object
JP5621325B2 (ja) * 2010-05-28 2014-11-12 Sony Corporation Focus control device, focus control method, lens device, focus lens driving method, and program
SG10201710388RA (en) * 2013-08-09 2018-01-30 Musashi Eng Inc Focus adjustment method and device therefor
JP6476977B2 (ja) * 2015-02-19 2019-03-06 Dai Nippon Printing Co., Ltd. Identification device, identification method, and program
JP6750194B2 (ja) 2015-06-19 2020-09-02 Sony Corporation Medical image processing device, medical image processing method, and medical observation system
KR102640848B1 (ko) 2016-03-03 2024-02-28 Samsung Electronics Co., Ltd. Sample inspection method, sample inspection system, and method of inspecting semiconductor devices using the same
CN109644242B (zh) 2016-06-30 2021-06-25 Nikon Corporation Imaging device
JP6793053B2 (ja) * 2017-02-09 2020-12-02 Ricoh Elemex Corporation Inspection device and focus adjustment support method
JP7037425B2 (ja) * 2018-04-23 2022-03-16 Disco Corporation Method of detecting focal position of laser beam
CN114257710B (zh) * 2020-09-23 2024-02-20 Beijing Xiaomi Mobile Software Co., Ltd. Optical image stabilization structure, camera module having the same, and terminal device

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0583614A (ja) * 1991-09-24 1993-04-02 Canon Inc Electronic still camera
JPH07284130A (ja) * 1994-04-14 1995-10-27 Rohm Co Ltd Stereoscopic vision camera
JPH099132A (ja) * 1995-06-23 1997-01-10 Canon Inc Automatic focusing device and camera
JPH09224259A (ja) * 1996-02-19 1997-08-26 Sanyo Electric Co Ltd Image synthesizing device
JP2000275019A (ja) * 1999-03-23 2000-10-06 Takaoka Electric Mfg Co Ltd Active confocal imaging device and three-dimensional measuring method using the same
JP2002048512A (ja) * 2000-07-31 2002-02-15 Nikon Corp Position detecting device, optical machine, and exposure apparatus
JP2003005088A (ja) * 2001-06-22 2003-01-08 Nikon Corp Focusing device for microscope and microscope provided with the same
JP2003029138A (ja) * 2001-07-19 2003-01-29 Olympus Optical Co Ltd Automatic focusing method and ultraviolet microscope
JP2003029130A (ja) * 2001-07-11 2003-01-29 Sony Corp Optical microscope
JP2003075713A (ja) * 2001-09-03 2003-03-12 Minolta Co Ltd Autofocus device and method, and camera
JP2003086498A (ja) * 2001-09-13 2003-03-20 Canon Inc Focus position detecting method and focus position detecting device
JP2003163827A (ja) * 2001-11-22 2003-06-06 Minolta Co Ltd Subject extraction device and photographing device
JP2003195157A (ja) * 2001-12-21 2003-07-09 Agilent Technol Inc Automatic focusing of an imaging system
JP2003264721A (ja) * 2002-03-11 2003-09-19 Fuji Photo Film Co Ltd Imaging device
JP2004101240A (ja) * 2002-09-05 2004-04-02 Mitsui Eng & Shipbuild Co Ltd Laminated belt ring inspection method and apparatus

Also Published As

Publication number Publication date
WO2005026802B1 (ja) 2005-05-26
TW200527907A (en) 2005-08-16
JP2005099736A (ja) 2005-04-14
KR20060123708A (ko) 2006-12-04
TWI245556B (en) 2005-12-11
JP4158750B2 (ja) 2008-10-01

Similar Documents

Publication Publication Date Title
WO2005026802A1 (ja) Autofocus control method, autofocus control apparatus and image processing apparatus
US20070187571A1 (en) Autofocus control method, autofocus control apparatus and image processing apparatus
JPS6398615A (ja) Automatic focusing method
JP2005210217A (ja) Stereo camera
CN105579880B (zh) Endoscope imaging system and method of operating endoscope imaging system
EP3035104B1 (en) Microscope system and setting value calculation method
JP2007159047A (ja) Camera system, camera control device, panoramic image creation method, and computer program
JPH09298682A (ja) Depth-of-focus extending device
JP2009145645A (ja) Optical apparatus
EP2136234B1 (en) Microscope imaging system, storage medium and exposure adjustment method
WO2017033346A1 (ja) Digital camera system, digital camera, interchangeable lens, distortion aberration correction processing method, and distortion aberration correction processing program
JPH11325819A (ja) Electronic camera for microscope
JP2010107866A (ja) Digital camera and optical apparatus
US8212865B2 (en) Microscope image pickup apparatus, microscope image pickup program product, microscope image pickup program transmission medium and microscope image pickup method
US7925149B2 (en) Photographing apparatus and method for fast photographing capability
CN100378487C (zh) Autofocus control method, autofocus controller, and image processor
JP2009069748A (ja) Imaging apparatus and automatic focusing method thereof
US20160275657A1 (en) Imaging apparatus, image processing apparatus and method of processing image
JP2013246052A (ja) Distance measuring device
JP2013076823A (ja) Image processing device, endoscope system, image processing method, and program
JP2006145793A (ja) Microscope image capturing system
JP2011166497A (ja) Imaging device
JPH11197097A (ja) Electronic endoscope apparatus forming perspective images
US20090168156A1 (en) Microscope system, microscope system control program and microscope system control method
JP5996462B2 (ja) Image processing device, microscope system, and image processing method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200480030003.4

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NA NI NO NZ OM PG PL PT RO RU SC SD SE SG SK SL SY TM TN TR TT TZ UA UG US UZ VC YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IT MC NL PL PT RO SE SI SK TR BF CF CG CI CM GA GN GQ GW ML MR SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
B Later publication of amended claims

Effective date: 20050331

WWE Wipo information: entry into national phase

Ref document number: 1020067004017

Country of ref document: KR

DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase
WWE Wipo information: entry into national phase

Ref document number: 10569480

Country of ref document: US

Ref document number: 2007187571

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 1020067004017

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 10569480

Country of ref document: US