CN117479020A - Gray correction method for image vignetting and corresponding microscope - Google Patents

Gray correction method for image vignetting and corresponding microscope

Info

Publication number
CN117479020A
CN117479020A (application CN202311171209.XA)
Authority
CN
China
Prior art keywords
image
vignetting
camera
sub
microscope
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311171209.XA
Other languages
Chinese (zh)
Inventor
任亮
李海文
全伟铭
郭泽媛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Chaoshiji Biotechnology Co ltd
Original Assignee
Guangzhou Chaoshiji Biotechnology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Chaoshiji Biotechnology Co ltd filed Critical Guangzhou Chaoshiji Biotechnology Co ltd
Priority to CN202311171209.XA priority Critical patent/CN117479020A/en
Publication of CN117479020A publication Critical patent/CN117479020A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • G06T2207/10061Microscopic image from scanning electron microscope

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Microscopes, Condensers (AREA)

Abstract

The invention discloses a gray-level correction method for image vignetting and a corresponding microscope. The method comprises the following steps: step 1, acquiring a vignetting distribution image G(u, v) of a camera for a set field of view based on a stitching scan of the camera's central region; step 2, performing image vignetting correction on an original image R(u, v) of the set field of view acquired by the camera based on the formula FFC(u, v) = R(u, v) - G(u, v), where R(u, v) is the original image without vignetting correction, FFC(u, v) is the corrected image after vignetting correction, and u and v are the row and column coordinates of the image, respectively. The invention does not require a uniform calibration standard; instead, the object to be imaged is photographed directly to acquire the vignetting distribution image. The calibration standard therefore need not be loaded and unloaded, and there is no need to ensure that the calibration standard and the object to be imaged occupy the same position. The method is thus efficient to operate and corrects image vignetting well.

Description

Gray correction method for image vignetting and corresponding microscope
Technical Field
The invention relates to the field of imaging technology, and in particular to a gray-level correction method for image vignetting and a microscope adopting the method.
Background
Image vignetting, particularly natural vignetting, is a phenomenon common in optical imaging systems. As shown in fig. 1, it manifests as a non-linear decrease in image brightness from center to edge. Natural vignetting arises because the light emitted by an object has different throughput in different areas of the imaging lens: the farther a region is from the lens center, the lower its light throughput. In application scenarios that require image uniformity, this brightness falloff significantly degrades image quality and the subsequent image recognition, segmentation, analysis, and statistical processes. The image must therefore be corrected to remove the effects of vignetting.
In the vignetting correction method of the related art, an object with a uniform surface (also referred to as a uniform calibration standard) must first be imaged, thereby calibrating the image vignetting distribution P(u, v). In addition, a camera dark-field image D(u, v) is obtained by blocking external light, and vignetting is then removed by formula (1):
FFC(u, v) = (R(u, v) - D(u, v)) * P̄ / (P(u, v) - D(u, v))    (1)
where R(u, v) is the pre-correction image, D(u, v) is the camera dark-field image, P(u, v) is the vignetting distribution image, P̄ is the mean of the vignetting distribution image, FFC(u, v) is the corrected image, and u, v are the pixel coordinates in the image.
In practice, therefore, the calibration standard must first be loaded and the vignetting distribution image and camera dark-field image calibrated in advance; the uniform calibration standard is then unloaded and replaced with the object to be imaged, the object is imaged, and the vignetting in the image is removed using formula (1). The calibration standard is, for example, a flat article of uniform material. This prior-art method is cumbersome to operate, since the uniform calibration standard must be loaded and unloaded. Moreover, the method requires the object to be imaged and the calibration standard to occupy the same position relative to the camera, which places high demands on the handling accuracy during loading and unloading, and even on the shape of the uniform calibration standard.
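A hedged NumPy sketch of prior-art formula (1) might look as follows. The normalization by the mean of P is an assumption, since the formula is only partly legible in this text, and the function name is illustrative, not the patent's:

```python
import numpy as np

def flat_field_correct(R, D, P):
    """Sketch of formula (1): FFC = (R - D) * mean(P) / (P - D).

    R: pre-correction image, D: camera dark-field image,
    P: vignetting distribution image from a uniform calibration standard.
    """
    flat = P.astype(np.float64) - D          # calibrated falloff profile
    flat = np.where(flat == 0, 1.0, flat)    # guard against division by zero
    FFC = (R.astype(np.float64) - D) * P.mean() / flat
    return np.clip(FFC, 0, 255).astype(np.uint8)
```

With a purely multiplicative falloff and zero dark field, the sketch restores a uniform scene to a constant gray level, which is the behavior the prior-art method relies on.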
Disclosure of Invention
It is an object of the present invention to provide a method which overcomes or at least alleviates at least one of the above-mentioned disadvantages of the prior art.
To achieve the above object, the present invention provides a gray-scale correction method for image vignetting, the method comprising:
step 1, acquiring a vignetting distribution image G(u, v) of a camera for a set field of view based on a stitching scan of the camera's central region;
step 2, performing image vignetting correction on the original image R(u, v) of the set field of view acquired by the camera based on the following formula,
FFC(u,v)=R(u,v)-G(u,v)
wherein the method comprises the steps of
R (u, v) is the original image without image vignetting correction,
FFC (u, v) is a corrected image after image vignetting correction,
u and v are the row and column coordinates of the image, respectively.
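A minimal sketch of step 2, assuming 8-bit gray images held as NumPy arrays (the function name is illustrative, not the patent's). Note that G = R0 - K can be negative where vignetting darkens the image, so it is kept as a signed array:

```python
import numpy as np

def vignetting_correct(R, G):
    """FFC(u, v) = R(u, v) - G(u, v), clipped back to the 8-bit gray range.

    R: raw single-field image (uint8); G: vignetting distribution image,
    a signed array since it may be negative toward the darkened edges.
    """
    FFC = R.astype(np.int32) - G.astype(np.int32)
    return np.clip(FFC, 0, 255).astype(np.uint8)

# Synthetic check: a flat scene of gray 100 darkened toward the left edge.
falloff = np.zeros((4, 4), dtype=np.int32)
falloff[:, 0] = 25                       # 25 gray levels lost at the edge
R = (100 - falloff).astype(np.uint8)     # what the camera records
G = -falloff                             # R0 - K for this synthetic case
corrected = vignetting_correct(R, G)     # restores the flat gray scene
```

Casting to a signed integer type before subtracting matters: subtracting uint8 arrays directly would wrap around instead of going negative.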
The method of the invention does not require a uniform calibration standard when calibrating the vignetting distribution image; instead, the set field of view (that is, the imaging region or the object to be imaged) is photographed directly to acquire the vignetting distribution image. The uniform calibration standard therefore need not be loaded and unloaded, there is no need to ensure consistent relative positions of the calibration standard, the object to be imaged, and the camera, and no camera dark-field image needs to be acquired. Operating efficiency and the accuracy of the image vignetting correction are thus both improved.
It should also be noted, regarding "ensuring consistent relative positions of the calibration standard, the object to be imaged, and the camera": the calibration standard is generally flat while the object to be imaged is three-dimensional, so its contour has relief. Exact positional consistency is therefore difficult to achieve in any case, and some deviation is inevitable; the method of the present invention overcomes this drawback of the prior art.
Preferably, step 1 comprises the following steps:
step 11, acquiring an initial image R0(u, v) of the camera for a set imaging region, wherein the initial image R0(u, v) is a single-field-of-view image whose width is M and height is N;
step 12, dividing the imaging region into m×n sub-regions, the u-direction (i.e. x-direction) size of each sub-region being M/m and the v-direction (i.e. y-direction) size being N/n; realizing relative movement of the camera and the imaging region in a plane parallel to the initial image, the relative movement being along the u-direction and/or the v-direction; acquiring a single-field test image centered on each sub-region as a stitched image to be cropped, giving m×n single-field test images in total; and cropping the central region of each of the m×n single-field test images, each central region corresponding to its sub-region, thereby obtaining m×n central-region sub-images;
step 13, sequentially stitching the m×n sub-images into one stitched image K(u, v), the size of the stitched image being M×N;
step 14, determining vignetting distribution image G (u, v) based on the following formula,
G(u,v)=R0(u,v)-K(u,v)。
Preferably, in step 12, the single-field test images for the m×n sub-regions are acquired in sequence starting from the upper-left corner. Specifically, the camera is first aimed at the upper-left sub-region and a single-field test image is acquired; single-field test images centered on each sub-region of the first row are then acquired in order from left to right; a single-field test image corresponding to the rightmost sub-region of the next row is acquired next; the single-field test images for the remaining sub-regions of that row are then acquired in the left-right order opposite to that of the previous row; and so on, until the single-field test images for all m×n sub-regions have been acquired.
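This serpentine acquisition order can be sketched as a small helper; the name serpentine_order is an assumption for illustration:

```python
def serpentine_order(m, n):
    """Yield (row, col) indices of the m x n sub-regions in the order
    described above: left to right on the first row, then right to left
    on the next, alternating so the stage never jumps across a full row."""
    for row in range(n):            # n rows of sub-regions (v direction)
        cols = range(m) if row % 2 == 0 else range(m - 1, -1, -1)
        for col in cols:
            yield (row, col)
```

For m = n = 3 this visits the sub-regions as (0,0), (0,1), (0,2), (1,2), (1,1), (1,0), (2,0), (2,1), (2,2), minimizing stage travel between consecutive exposures.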
Preferably, the camera is an electronic imaging unit of a microscope with a stage, and the relative movement of the camera and the imaging region is achieved by horizontal movement of the stage.
Preferably, the position of the object to be imaged relative to the microscope stage is fixed throughout the acquisition of the vignetting distribution image G(u, v) based on the camera central-region stitching scan and the acquisition of the original image R(u, v) of the object to be imaged.
Preferably, step 1 is performed once after each focusing operation of the camera to update the vignetting distribution image G (u, v).
Preferably, m=n, for example m=n=3, in which case the original image R(u, v) is the initial image R0(u, v) and is also the stitched image to be cropped that corresponds to the central sub-region; or, in an application scenario in which original images of the object to be imaged are acquired continuously, the stitched image to be cropped corresponding to the central sub-region is the first frame of the original images.
The present invention also provides a microscope that performs gray-scale correction of image vignetting by the method described above, wherein the vignetting distribution image G(u, v) is reacquired when the distance between the camera and the stage of the microscope changes.
Preferably, the microscope has an electronic imaging unit, a stage, and a control unit capable of controlling the stage to move in a horizontal plane so as to acquire single-field test images centered on each of the m×n sub-regions.
Preferably, the control unit is provided with a computer program that automatically performs image vignetting correction and outputs the corrected image FFC(u, v).
The method of the invention can obtain the vignetting distribution of the image in situ and remove the vignetting without using a calibration standard, so the object to be imaged need not be repeatedly loaded and unloaded. In the case of a microscope, for example, "in situ" means that once the object to be photographed is fixed relative to the stage it need not be moved relative to the stage again; only the relative distance between the object and the camera must be kept constant throughout the vignetting correction process.
Drawings
Fig. 1 is a schematic view of image vignetting.
Fig. 2 shows the result of stitching 5×5 fields of view with vignetting. Such a stitching result degrades the image and the subsequent image processing.
Fig. 3 is a schematic diagram of the vignetting distribution P obtained by photographing a flat uniform object (calibration standard).
FIG. 4 is a schematic diagram of the relative positions of a camera and a scan field of view when a single field of view test image is acquired centered on different sub-regions, according to one embodiment of the invention.
Fig. 5 is a schematic diagram of stitching single fields of view by moving the field of view and cropping the central region, taking m=n=3 as an example, according to one embodiment of the invention.
Fig. 6 is a schematic flow chart of a gray correction method of image vignetting according to an embodiment of the present invention.
Detailed Description
In the drawings, the same or similar reference numerals are used to denote the same or similar elements or elements having the same or similar functions. Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Embodiments of the present invention provide a method for obtaining an image vignetting distribution image G (u, v) in situ and removing vignetting therefrom without the use of calibration standards, as shown in fig. 4. FIG. 4 is a schematic diagram of the relative positions of a camera and a scan field of view when a single field of view test image is acquired centered on different sub-regions, according to one embodiment of the invention.
In one embodiment of the invention, the object is stationary, the relative position of the camera and the plane of its imaging field of view is unchanged, and the camera is movable along the X-axis and the Y-axis. It should be noted that the relative movement may also be achieved the other way around: the camera may be fixed while the object or the stage is moved.
From the vignetting distribution characteristics of the image, the central region of an acquired image exhibits no vignetting, or less of it. Assume that the width and height of the central region are each 1/n of those of a single-frame image, for example 1/3; see fig. 5. The camera is then moved along the X-axis and the Y-axis, 9 frames of images are acquired in sequence, and their central regions are C1, C2, ..., C9. These 9 central regions can be stitched to the size of one single-frame field of view, as shown in figs. 5 and 6.
The inventors found that when the shooting distance and the illumination conditions are unchanged, the vignetting distribution image G is unchanged.
Thus, while maintaining the shooting distance, the camera position is moved, an uncorrected image R (with vignetting) is acquired, and the corrected image FFC is obtained by subtracting the vignetting distribution image G.
Thus, as shown in fig. 6, the gray-scale correction method of image vignetting according to an embodiment of the present invention includes the steps of:
step 1, acquiring a vignetting distribution image G(u, v) of the camera for a set field of view;
step 2, performing image vignetting correction on the original image R(u, v) of the set field of view acquired by the camera based on the following formula,
FFC(u,v)=R(u,v)-G(u,v)
wherein the method comprises the steps of
R (u, v) is the original image without image vignetting correction,
FFC (u, v) is a corrected image after image vignetting correction,
u and v are the row coordinate index and the column coordinate index of the image, respectively.
The value of each pixel in the image is, for example, a gray value. In one embodiment of the invention, the image format is a single-channel gray-scale image; in another embodiment, it is a multi-channel gray-scale image. In the multi-channel case, the value of each pixel comprises a gray value for each of the channels.
The method of the embodiment of the invention does not require a uniform calibration standard when calibrating the vignetting distribution image; instead, the set field of view (that is, the imaging region or the object to be imaged) is photographed directly to acquire the vignetting distribution image. The uniform calibration standard therefore need not be loaded and unloaded, there is no need to ensure consistent relative positions of the calibration standard, the object to be imaged, and the camera, and no camera dark-field image needs to be acquired. Operating efficiency and the accuracy of the image vignetting correction are thus both improved. The camera dark-field image depends only on the camera itself: the camera's own electronics add a base value to the image, so the dark-field image can be obtained by capturing an image with external light blocked.
It should also be noted, regarding "ensuring consistent relative positions of the calibration standard, the object to be imaged, and the camera": the calibration standard is generally flat while the object to be imaged is three-dimensional, so its contour has relief. Exact positional consistency is therefore difficult to achieve in any case, and some deviation is inevitable; the method of the present invention overcomes this drawback of the prior art.
That is, the method of the invention can obtain the image vignetting distribution in situ and remove the vignetting without using a calibration standard, eliminating the need to repeatedly load and unload the object to be imaged. In the case of a microscope, for example, "in situ" means that once the object to be photographed is fixed relative to the stage it need not be moved relative to the stage again; only the relative distance between the object and the camera must be kept constant throughout the vignetting correction process.
More specifically, step 1 includes the following steps:
step 11, acquiring an initial image R0(u, v) of the camera for the set imaging region, wherein the initial image R0(u, v) is a single-field-of-view image whose width is M and height is N;
step 12, dividing the imaging region into m×n sub-regions, the u-direction size of each sub-region being M/m and the v-direction size being N/n. Relative movement of the camera and the imaging region is realized in a plane parallel to the initial image, the relative movement being along the u-direction and/or the v-direction, and a single-field test image centered on each sub-region is acquired as a stitched image to be cropped. In total, m×n single-field test images (stitched images to be cropped) are thus obtained. The central region of each of the m×n single-field test images is cropped, each central region corresponding to its sub-region, so that m×n central-region sub-images are obtained in total. Each sub-image has the same size as a single sub-region, that is, a u-direction size of M/m and a v-direction size of N/n.
step 13, sequentially stitching the m×n sub-images into one stitched image K(u, v), the size of the stitched image being M×N. Here "sequentially" means that the stitching is performed so that each sub-image is placed according to its corresponding sub-region.
Step 14, determining vignetting distribution image G (u, v) based on the following formula,
G(u,v)=R0(u,v)-K(u,v)。
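Steps 11 to 14 can be sketched as follows, assuming NumPy arrays and image dimensions divisible by m and n; the helper names center_crop and vignetting_distribution are illustrative, not the patent's:

```python
import numpy as np

def center_crop(img, m, n):
    """Crop the central (N/n) x (M/m) region of a single-field image."""
    N, M = img.shape                     # height N, width M, as in step 11
    h, w = N // n, M // m
    top, left = (N - h) // 2, (M - w) // 2
    return img[top:top + h, left:left + w]

def vignetting_distribution(R0, fields, m, n):
    """Stitch the m x n center crops into K (step 13) and return
    G = R0 - K (step 14).

    fields[i][j] is the single-field test image captured with sub-region
    (row i, column j) moved to the camera center; all images share R0's
    shape. The result is signed, since K can exceed R0 at the edges.
    """
    rows = [np.hstack([center_crop(fields[i][j], m, n) for j in range(m)])
            for i in range(n)]
    K = np.vstack(rows)                  # stitched, (largely) vignette-free
    return R0.astype(np.int32) - K.astype(np.int32)
```

Because each crop has sub-region size M/m × N/n, stacking m crops per row and n rows reproduces the full M × N frame, as required for the pixel-wise subtraction in step 14.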
It will be appreciated that during the relative movement, the relative position of the camera and the plane of its imaging field of view is unchanged; the camera is movable along the u-axis and the v-axis.
Preferably, in step 12, the single-field test images for the m×n sub-regions are acquired in sequence starting from the upper-left corner. Specifically, the camera is first aimed at the upper-left sub-region and a single-field test image is acquired; single-field test images centered on each sub-region of the first row are then acquired in order from left to right; a single-field test image corresponding to the rightmost sub-region of the next row is acquired next; the single-field test images for the remaining sub-regions of that row are then acquired in the left-right order opposite to that of the previous row; and so on, until the single-field test images for all m×n sub-regions have been acquired.
The camera is, for example, an electronic imaging unit of a microscope with a stage, and the relative movement of the camera and the imaging region is effected by a horizontal movement of the stage. The stage of the microscope has a precision drive unit, so the position and displacement of the stage can be precisely controlled.
In an alternative embodiment of the invention, the position of the object to be imaged relative to the microscope stage is fixed throughout the acquisition of the vignetting distribution image G(u, v) based on the camera central-region stitching scan and the acquisition of the original image R(u, v) of the object to be imaged.
Preferably, step 1 is performed once after each focusing operation of the camera to update the vignetting distribution image G (u, v).
Referring to fig. 5, in yet another alternative embodiment, m=n, for example m=n=3. The X-axis corresponds to the u-coordinate index and the Y-axis to the v-coordinate index. C1 to C9 are the 9 sub-regions. It will be appreciated that the original image R(u, v) is the initial image R0(u, v) and is also the stitched image to be cropped that corresponds to the central sub-region. Further, in an application scenario in which original images of the object to be imaged are acquired continuously, the stitched image to be cropped corresponding to the central sub-region is the first frame original image.
Embodiments of the present invention also provide a microscope that performs gray-scale correction of image vignetting by the method described above, in which the vignetting distribution image G(u, v) is reacquired if the distance between the camera and the stage of the microscope changes. The vignetting distribution image G(u, v) is likewise reacquired if the object to be imaged is reinstalled.
The microscope has an electronic imaging unit, a stage, and a control unit. The control unit can control the stage to move in a horizontal plane so as to acquire single-field test images centered on each of the m×n sub-regions.
The control unit is provided with a computer program that automatically performs image vignetting correction and outputs the corrected image FFC(u, v). The output corrected images FFC(u, v) may span multiple frames to reflect changes in the object to be imaged. If the object changes substantially enough to noticeably affect the vignetting distribution, the vignetting distribution image is reacquired and subsequent corrections are based on the newly acquired image. Alternatively, without changing the object, vignetting distribution images may be acquired periodically and the newest one used for correction; or successive vignetting distribution images may be compared, and when the change between them reaches a set threshold, correction proceeds with the new image.
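The threshold comparison between successive vignetting distribution images might be sketched as follows; the metric (mean absolute gray-level change) and the default threshold are assumptions for illustration, since the patent does not fix either:

```python
import numpy as np

def needs_reacquisition(G_old, G_new, threshold=5.0):
    """Return True when the mean absolute change in gray level between the
    old and new vignetting distribution images reaches the set threshold,
    indicating that corrections should switch to the new image."""
    diff = np.abs(G_new.astype(np.float64) - G_old.astype(np.float64))
    return bool(diff.mean() >= threshold)
```

A control unit could run this check on each periodically acquired G(u, v) and keep correcting with the previous image until the check fires.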
Finally, it should be pointed out that the above embodiments merely illustrate the technical solution of the present invention and are not limiting. Those of ordinary skill in the art will appreciate that the technical schemes described in the foregoing embodiments may be modified, or some of their technical features replaced equivalently; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A gray scale correction method for image vignetting, comprising:
step 1, acquiring a vignetting distribution image G(u, v) of a camera for a set field of view based on a stitching scan of the camera's central region;
step 2, performing image vignetting correction on the original image R(u, v) of the set field of view acquired by the camera based on the following formula,
FFC(u,v)=R(u,v)-G(u,v)
wherein the method comprises the steps of
R (u, v) is the original image without image vignetting correction,
FFC (u, v) is a corrected image after image vignetting correction,
u and v are the row and column coordinates of the image, respectively.
2. The gray-scale correction method for image vignetting as claimed in claim 1, wherein step 1 comprises the following steps:
step 11, acquiring an initial image R0(u, v) of the camera for a set imaging region, wherein the initial image R0(u, v) is a single-field-of-view image whose width is M and height is N;
step 12, dividing the imaging region into m×n sub-regions, the u-direction size of each sub-region being M/m and the v-direction size being N/n; realizing relative movement of the camera and the imaging region in a plane parallel to the initial image, the relative movement being along the u-direction and/or the v-direction; acquiring a single-field test image centered on each sub-region as a stitched image to be cropped, giving m×n single-field test images in total; and cropping the central region of each of the m×n single-field test images, each central region corresponding to its sub-region, thereby obtaining m×n central-region sub-images;
step 13, sequentially stitching the m×n sub-images into one stitched image K(u, v), wherein the size of the stitched image is M×N;
step 14, determining vignetting distribution image G (u, v) based on the following formula,
G(u,v)=R0(u,v)-K(u,v)。
3. The gray-scale correction method of image vignetting according to claim 2, wherein in step 12 the single-field test images for the m×n sub-regions are acquired in sequence starting from the upper-left corner; specifically, the camera is first aimed at the upper-left sub-region and a single-field test image is acquired; single-field test images centered on each sub-region of the first row are then acquired in order from left to right; a single-field test image corresponding to the rightmost sub-region of the next row is acquired next; the single-field test images for the remaining sub-regions of that row are then acquired in the left-right order opposite to that of the previous row; and so on, until the single-field test images for all m×n sub-regions have been acquired.
4. The gray scale correction method for image vignetting of claim 2, wherein the camera is an electronic imaging unit of a microscope, the microscope having a stage, and the relative movement of the camera and the imaging region is achieved by horizontal movement of the stage.
5. The gray-scale correction method of image vignetting as claimed in claim 4, wherein the position of the object to be imaged relative to the microscope stage is fixed throughout the acquisition of the vignetting distribution image G(u, v) based on the camera central-region stitching scan and the acquisition of the original image R(u, v) of the object to be imaged.
6. The gray-scale correction method for image vignetting as claimed in any one of claims 1-5, wherein step 1 is performed once after each focusing operation of the camera to update the vignetting distribution image G(u, v).
7. The gray-scale correction method of image vignetting according to any one of claims 1-5, wherein m=n, m=n=3, and the original image R(u, v) is the initial image R0(u, v) and is also the stitched image to be cropped corresponding to the central sub-region; or, in an application scenario in which original images of the object to be imaged are acquired continuously, the stitched image to be cropped corresponding to the central sub-region is the first frame original image.
8. A microscope employing the method of any one of claims 1-7 for gray scale correction of image vignetting, wherein vignetting distribution image G (u, v) is reacquired in the event of a change in the distance of the microscope's camera from the stage.
9. The microscope of claim 8, wherein the microscope has an electronic imaging unit, a stage, and a control unit capable of controlling movement of the stage in a horizontal plane to acquire single-field test images centered on each of the m×n sub-regions.
10. The microscope of claim 9, wherein the control unit is provided with a computer program to automatically perform image vignetting correction, outputting the corrected image FFC (u, v) after the image vignetting correction.
CN202311171209.XA 2023-09-12 2023-09-12 Gray correction method for image vignetting and corresponding microscope Pending CN117479020A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311171209.XA CN117479020A (en) 2023-09-12 2023-09-12 Gray correction method for image vignetting and corresponding microscope


Publications (1)

Publication Number Publication Date
CN117479020A true CN117479020A (en) 2024-01-30

Family

ID=89624538

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311171209.XA Pending CN117479020A (en) 2023-09-12 2023-09-12 Gray correction method for image vignetting and corresponding microscope

Country Status (1)

Country Link
CN (1) CN117479020A (en)

Similar Documents

Publication Publication Date Title
US10746980B2 (en) Calibration of microscopy systems
US9596416B2 (en) Microscope system
US20200013146A1 (en) Image processing method, image processor, image capturing device, and image capturing method
US9013572B2 (en) Method for observing sample and electronic microscope
CN112505910B (en) Method, system, apparatus and medium for taking image of specimen with microscope
US7969475B2 (en) Low memory auto-focus and exposure system for large multi-frame image acquisition
KR101627634B1 (en) Projector image correction device and method
US20170076481A1 (en) Image processing device, imaging device, microscope system, image processing method, and computer-readable recording medium
CN116912233B (en) Defect detection method, device, equipment and storage medium based on liquid crystal display screen
CN112415733A (en) Method, system, apparatus and medium for controlling microscope to take sample image
US10656406B2 (en) Image processing device, imaging device, microscope system, image processing method, and computer-readable recording medium
CN116342435B (en) Distortion correction method for line scanning camera, computing equipment and storage medium
CN112866689B (en) SFR algorithm-based optical focusing method
CN117479020A (en) Gray correction method for image vignetting and corresponding microscope
US20070053583A1 (en) Image correcting apparatus, pattern inspection apparatus, and image correcting method, and reticle
US20170243386A1 (en) Image processing apparatus, imaging apparatus, microscope system, image processing method, and computer-readable recording medium
KR20070090778A (en) Color filter defect amend apparatus and color filter defect amend method
CN115439477B (en) 24 color card positioning method and device, electronic equipment and storage medium
CN116313710A (en) Focusing control method and device, scanning electron microscope and storage medium
CN108520499B (en) Image offset correction method based on white-band microscopic imaging
CN115578291A (en) Image brightness correction method, storage medium and electronic device
CN112839168B (en) Method for automatically adjusting camera imaging resolution in AOI detection system
JP5609459B2 (en) Binarization processing method and image processing apparatus
CN111917971B (en) Image capturing parameter optimization and adjustment system and method
JP2008003503A (en) Device for correcting substrate defect, and method for correcting substrate defect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination