CN113640980A - Image measurement system, image measurement method, image measurement program, and image measurement storage medium - Google Patents

Info

Publication number
CN113640980A
CN113640980A (application CN202110510139.0A)
Authority
CN
China
Prior art keywords
image
full
measurement
focal length
field
Prior art date
Legal status
Pending
Application number
CN202110510139.0A
Other languages
Chinese (zh)
Inventor
卢存伟
辻野和广
Current Assignee
School Juridical Person of Fukuoka Kogyo Daigaku
Original Assignee
School Juridical Person of Fukuoka Kogyo Daigaku
Priority date
Filing date
Publication date
Application filed by School Juridical Person of Fukuoka Kogyo Daigaku filed Critical School Juridical Person of Fukuoka Kogyo Daigaku
Publication of CN113640980A publication Critical patent/CN113640980A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G02B21/24 Base structure
    • G02B21/241 Devices for focusing
    • G02B21/244 Devices for focusing using image analysis techniques

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An image measuring system, an image measuring method, an image measuring program, and an image measuring storage medium that solve the problem of the narrow depth of field of photographs taken with a microscope by using a plurality of micrographs to generate a full-field image that has a wide depth of field and in which every part of the measured object is sharply focused. The image measuring system includes a photographing unit, a measurement image generating unit, a full-field image generating unit, and the like. The photographing unit uses a camera device comprising an image sensor and a zoom lens whose focal length can be changed by an electrical signal; it adjusts the focal length of the lens so that the lens focuses on different parts of the measured object in turn, and takes one or more photographs of the measured object, each with a narrow depth of field, at each focal length. The measurement image generating unit generates a measurement image for each focal length from the one or more photographs taken by the photographing unit at that focal length. The full-field image generating unit generates, from the measurement images of the respective focal lengths, a full-field image that has a wide depth of field and in which every part of the measured object is sharply focused.

Description

Image measurement system, image measurement method, image measurement program, and image measurement storage medium
Technical Field
The present invention relates to an image measuring system, an image measuring method, an image measuring program, and an image measuring storage medium capable of obtaining, from microscope photographs having a narrow depth of field, a full-field image that has a wide depth of field and is sharp at every pixel.
Background
Optical microscopes are often used for observing small objects or specimens. An optical microscope has a narrow depth of field, which becomes even narrower at higher magnification. Therefore, in order to observe different portions of a sample, the focal length of the microscope lens must be adjusted frequently.
By adjusting the focal length of the lens, pictures of the sample can be taken at different focal lengths, and a three-dimensional image of the sample surface can be obtained on the principle of three-dimensional image measurement through focal length adjustment. However, this three-dimensional measurement technique requires high-precision adjustment of the focal length and is costly. In addition, a lens with an electric focal length adjuster is generally large and heavy, can only be installed in a laboratory or similar setting, and is difficult to carry to an industrial site as a portable measuring instrument.
Recently, a microscope three-dimensional image measurement technique by an interference method as described in patent document 1, a technique of synthesizing a plurality of photographs with different magnifications as described in patent document 2, and the like have been proposed.
Documents of the prior art
Patent document
[ patent document 1] Japanese patent laid-open publication No. 2018-40644
[ patent document 2] Japanese unexamined patent publication No. 2014-219623
Disclosure of the invention
Problems to be solved by the invention
The three-dimensional shape measuring method described in patent document 1 has a problem that an optical system is complicated. The synthesis method described in patent document 2 requires adjustment of the magnification of the lens and a spectroscopic system.
To this end, an object of the present invention is to provide an image measuring system, an image measuring method, an image measuring program, and an image measuring storage medium for acquiring a full-field image of a photographed object, that is, an image having a wide range of depth of field in which all portions of the surface of the object to be measured are sharply focused.
Means for solving the problems
The image measuring system of the present invention includes a photographing unit, a measurement image generating unit, a full-field image generating unit, and the like. The photographing unit uses a camera device comprising an image sensor and a zoom lens whose focal length can be changed by an electrical signal; it adjusts the focal length of the lens so that the lens focuses on different parts of the object to be measured in turn, and takes one or more pictures of the object, each with a narrow depth of field, at each focal length. The measurement image generating unit generates a measurement image for each focal length from the one or more photographs taken by the photographing unit at that focal length. The full-field image generating unit generates, from the measurement images of the respective focal lengths generated by the measurement image generating unit, a full-field image that has a wide depth of field and in which every part of the object to be measured is sharply focused.
The image measuring method of the present invention has the following features: using a camera device comprising an image sensor and a zoom lens whose focal length can be changed by an electrical signal, the focal length of the lens is adjusted so that the lens focuses on different parts of the measured object in turn, and one or more photographs of the measured object, each with a narrow depth of field, are taken at each focal length; a computer generates a measurement image for each focal length from the one or more photographs taken at that focal length; and the computer generates, from the measurement images of the respective focal lengths, a full-field image that has a wide depth of field and in which every part of the measured object is sharply focused.
With the technique of the present invention, a camera device comprising an image sensor and a zoom lens whose focal length can be changed by an electrical signal is used; the focal length of the lens is adjusted so that the lens focuses on different parts of the measured object in turn, and one or more photographs of the measured object, each with a narrow depth of field, are taken at each focal length; a measurement image is generated for each focal length from the one or more photographs taken at that focal length; and from the measurement images of the respective focal lengths, a full-field image that has a wide depth of field and in which every part of the measured object is sharply focused can be generated.
The image measuring system of the present invention preferably includes an illumination device for providing illumination necessary for photographing an object to be measured, and an illumination pattern generating unit for providing a continuous or intermittent annular illumination pattern necessary for photographing. Thus, the illumination device can be controlled in accordance with the illumination pattern generated by the illumination pattern generation unit, and the illumination pattern required for measurement and the illumination intensity required for measurement can be irradiated to the object to be measured.
The image measuring system of the present invention preferably includes a full-field image correction unit for correcting color intensity defects of a part of pixels in the full-field image generated using the full-field image generation unit, which are often caused by a composition error of the full-field image. This ensures that unnatural portions such as discontinuities and defects in the full-field image generated by the full-field image generating means are corrected.
The image measuring system of the present invention preferably includes a three-dimensional image generating unit for obtaining three-dimensional coordinates of the surface of the object to be measured and generating a three-dimensional image. This results in a three-dimensional image representing the surface of the object being measured.
The image measuring system of the present invention preferably includes a three-dimensional image correction unit for correcting a portion of the three-dimensional image generated by the three-dimensional image generation unit where the three-dimensional coordinate value is abnormal. This ensures that unnatural portions such as discontinuities and defects in the three-dimensional image generated by the three-dimensional image generation means are corrected.
The present invention includes an image measurement program that uses a computer to realize the functions of the units described above (the photographing unit, the measurement image generating unit, the full-field image generating unit, and so on): using a camera device comprising an image sensor and a zoom lens whose focal length can be changed by an electrical signal, the focal length of the lens is adjusted so that the lens focuses on different parts of the measured object in turn, and one or more photographs of the measured object, each with a very narrow depth of field, are taken at each focal length; a measurement image is generated for each focal length from the one or more photographs taken at that focal length; and from the measurement images of the respective focal lengths, a full-field image that has a wide depth of field and in which every part of the measured object is sharply focused is generated. The functions of the image measuring system of the present invention can thus be realized by a computer.
Advantageous Effects of the Invention
(1) The present invention uses a camera device comprising an image sensor and a zoom lens whose focal length can be changed by an electrical signal; the focal length of the lens is adjusted so that the lens focuses on different parts of the measured object in turn, and one or more photographs of the measured object, each with a narrow depth of field, are taken at each focal length; a computer generates a measurement image for each focal length from the one or more photographs taken at that focal length; and the computer generates, from the measurement images of the respective focal lengths, a full-field image that has a wide depth of field and in which every part of the measured object is sharply focused. A full-field image with a wide depth of field can thus be obtained using only an inexpensive image sensor and a variable-focus lens.
(2) The present invention uses the illumination device to provide the illumination needed for photographing the measured object, and the illumination pattern generating unit to provide the continuous or discontinuous annular illumination patterns needed for photographing. The required illumination pattern, at the required intensity, can therefore be projected onto the parts of the measured object that need illumination, solving the problems that often occur when photographing the measured object: bright spots caused by overly strong surface reflection, and insufficient light caused by weak surface reflection.
(3) The full-field image correction unit according to the present invention can correct a portion of the full-field image generated by the full-field image generation unit where the color intensity value is abnormal. This ensures that unnatural parts such as discontinuities and defects in the full-field image generated by the full-field image generating unit are corrected to obtain a more realistic and natural full-field image.
(4) The three-dimensional image generating unit can obtain the three-dimensional coordinates of the surface of the measured object and generate the three-dimensional image capable of representing the surface shape of the measured object.
(5) The three-dimensional image correction unit of the present invention corrects abnormal three-dimensional coordinate values in the three-dimensional image generated by the three-dimensional image generation unit, ensuring that unnatural parts such as discontinuities and defects are corrected, so that a more realistic and natural three-dimensional image is obtained.
Drawings
Fig. 1 is a schematic configuration diagram of an image measuring system according to an embodiment of the present invention.
Fig. 2 is a schematic view of a light emitting portion of the lower portion of the lighting device in fig. 1.
Fig. 3 is a block diagram of the configuration of the image measuring system shown in fig. 1.
Fig. 4 is an example of an illumination pattern.
Fig. 5 is an example of an illumination pattern.
Fig. 6 is an example of an illumination pattern.
Fig. 7 is an example of an illumination pattern.
Fig. 8 is a flowchart when measurement is performed using the image measuring system shown in fig. 1.
Fig. 9 is a detailed flowchart of the portion S101 in fig. 8.
Fig. 10 is a schematic diagram of generating a full field of view image using images of different focal lengths.
Fig. 11 is a schematic diagram of extracting pixels clearly photographed from images of different focal lengths.
Fig. 12 is a schematic diagram of full field image correction.
Fig. 13 is an example of a full-field image and a three-dimensional image generated by the image measuring system of the present embodiment.
[ notation ] to show
X object to be measured
1 Camera device
1A image sensor
1B zoom lens
2 Lighting device
3 computer
4 connecting wire
10 illumination pattern generation unit
11 photographing unit
12 measurement image generating unit
13 full-field image generating unit
14 full-field image correction unit
15 three-dimensional image generating unit
16 three-dimensional image correction unit
17 calibration unit
18 measurement result output unit
19 storage unit
Detailed Description
Fig. 1 is a schematic diagram of a configuration of an image measuring system in an embodiment of the present invention, fig. 2 is a schematic diagram of a light emitting portion at a lower portion of an illumination device in fig. 1, and fig. 3 is a block diagram of the configuration of the image measuring system shown in fig. 1.
As shown in fig. 1, the image measuring system of the present embodiment is composed of a camera device 1 for taking a photomicrograph of a measured object X, an illumination device 2 for illuminating the measured object, a computer 3, a connection line 4 for connecting the computer 3 and the camera device 1, and the like. Communication between the computer 3 and the camera device 1 may be performed by using the illustrated connection line 4, or may be performed by using a wireless communication method such as Wi-Fi (trademark), Bluetooth (trademark), or infrared communication.
The camera apparatus 1 includes an image sensor 1A and a zoom lens 1B. The image sensor 1A functions to take a picture of the object X to be measured illuminated by the illumination device 2 through the zoom lens 1B. The zoom lens 1B may be an electric lens whose focal length is adjusted by an electric signal, or may be a liquid lens. In the present embodiment, the camera device 1 is a microscope.
As shown in fig. 2, the illumination device 2 is a ring-shaped device comprising a plurality of light emitting members 2A to 2H, so that every surface of an uneven object to be measured can be illuminated. Each of the light emitting members 2A to 2H may be formed of a device such as an LED (Light Emitting Diode). Whether each of the light emitting parts 2A to 2H emits light, and its emission intensity, can be controlled independently.
The computer 3 realizes the functions of the illumination pattern generation unit 10, the photographing unit 11, the measurement image generating unit 12, the full-field image generating unit 13, the full-field image correction unit 14, the three-dimensional image generating unit 15, the three-dimensional image correction unit 16, the calibration unit 17, the measurement result output unit 18, and the storage unit 19 shown in fig. 3 by executing the image measurement program of the present invention. The image measurement program is stored in a computer-readable storage medium and is read and executed by the computer 3.
[ illumination pattern generating Unit 10]
The illumination pattern generation unit 10 controls whether each of the light emitting parts 2A to 2H of the illumination device 2 emits light, and at what luminance, to generate the illumination patterns required for the microscopic measurement of the present embodiment. Figs. 4 to 7 give examples of generated illumination patterns, including various continuous and discontinuous annular patterns. The default pattern generated by the illumination pattern generation unit 10 is the full illumination pattern, in which all the light emitting parts 2A to 2H emit light.
If the surface reflection of the object to be measured is strong, bright spots of high brightness appear under the default full illumination pattern. If a bright spot is present on the object when a photograph is taken, brightness saturation easily occurs in the captured image; at the same time, portions outside the bright spot where reflection is weak, or portions whose shape prevents the reflected light from reaching the image sensor 1A, suffer from underexposure. To solve these problems, the present embodiment uses illumination patterns such as those of figs. 4 to 7. In figs. 4 to 7, white indicates that a member emits light, and gray indicates that it does not.
In the example of fig. 4, (A) to (D) show four patterns in which one quarter of the ring emits light and the rest does not: of the light emitting components 2A to 2H, 2 emit light and 6 do not, and the lit pair changes in order. In fig. (A), components 2G and 2H emit light, and 2A to 2F do not. In fig. (B), components 2A and 2B emit light, and 2C to 2H do not. In fig. (C), components 2C and 2D emit light, and 2A, 2B, and 2E to 2H do not. In fig. (D), components 2E and 2F emit light, and 2A to 2D, 2G, and 2H do not.
In the example of fig. 5, (A) to (D) show four patterns in which three quarters of the ring emit light and the rest does not: of the light emitting components 2A to 2H, 6 emit light and 2 do not, and the unlit pair changes in order. In fig. (A), components 2A to 2F emit light, and 2G and 2H do not. In fig. (B), components 2C to 2H emit light, and 2A and 2B do not. In fig. (C), components 2A, 2B, and 2E to 2H emit light, and 2C and 2D do not. In fig. (D), components 2A to 2D, 2G, and 2H emit light, and 2E and 2F do not.
In the example of fig. 6, (A) and (B) show two patterns in which half the ring, formed of two facing pairs, emits light and the crossing half does not: of the light emitting components 2A to 2H, 4 emit light and 4 do not, and the lit and unlit groups are swapped between the two patterns. In fig. (A), components 2G and 2H and the facing components 2C and 2D (4 in total) emit light, while 2A and 2B and the facing 2E and 2F (4 in total) do not. In fig. (B), components 2A and 2B and the facing 2E and 2F (4 in total) emit light, while 2C and 2D and the facing 2G and 2H (4 in total) do not.
In the example of fig. 7, (A) to (H) show eight patterns in which a single component emits light and the other seven do not, with the lit component changing in order. In fig. (A), component 2H emits light, and 2A to 2G do not. In fig. (B), component 2A emits light, and 2B to 2H do not. And so on, until in fig. (H), component 2G emits light, and 2H and 2A to 2F do not.
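The four pattern families of figs. 4 to 7 can be represented compactly as sets over the eight light emitting parts 2A to 2H. The sketch below is illustrative only: the function names and the rotation order of the patterns are assumptions for demonstration, not taken from the patent.

```python
# Illustrative sketch of the four illumination pattern families (figs. 4-7).
# The ring's eight light emitting parts 2A..2H are written as letters A..H.
# Function names and pattern ordering are assumed, not specified by the patent.

PARTS = "ABCDEFGH"  # 2A..2H around the ring

def quarter_patterns():
    # Fig. 4 family: two adjacent parts lit (1/4 of the ring), rotated in order.
    return [{PARTS[i % 8], PARTS[(i + 1) % 8]} for i in range(0, 8, 2)]

def three_quarter_patterns():
    # Fig. 5 family: the complement of each quarter pattern (six parts lit).
    return [set(PARTS) - p for p in quarter_patterns()]

def opposing_half_patterns():
    # Fig. 6 family: two facing pairs lit (four parts), two variants that swap.
    return [{"G", "H", "C", "D"}, {"A", "B", "E", "F"}]

def single_patterns():
    # Fig. 7 family: one part lit at a time, eight variants.
    return [{c} for c in PARTS]
```

Representing a pattern as the set of lit parts makes the complement relation between figs. 4 and 5, and the swap relation inside fig. 6, explicit.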
[ photographing Unit 11]
The photographing unit 11 uses the camera device 1 to adjust the focal length of the lens with respect to the object X to be measured, focuses the lens on different parts of the object X in turn, and takes one or more photographs of the object X, each with a narrow depth of field, at each focal length. The photographing unit 11 changes the focal length of the zoom lens 1B and takes one picture, or M pictures, at each of N different focal lengths, using the illumination patterns generated by the illumination pattern generation unit 10.
First, the photographing unit 11 takes one picture of the object X to be measured using the default full illumination pattern. The color intensity distribution of the picture is then analyzed: the intensity distribution of each RGB channel of each pixel is calculated, and it is judged whether light saturation or underexposure is present. If there is essentially no saturation or underexposure and the intensity distribution of the image is essentially linear, the picture is judged usable for image measurement, and only this one picture needs to be taken at this focal length.
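This exposure check can be approximated by counting, per RGB channel, the fraction of pixels near the top or bottom of the intensity range. A minimal sketch follows; the threshold values and the 1% tolerance are illustrative assumptions, not values given in the patent.

```python
import numpy as np

def exposure_ok(img, high=250, low=5, max_fraction=0.01):
    """Rough check for light saturation / underexposure in an RGB photograph.

    img: uint8 array of shape (H, W, 3). Returns True if, in every channel,
    fewer than max_fraction of the pixels sit near the top (saturated) or
    the bottom (underexposed) of the intensity range. All numeric values
    here are assumptions for illustration.
    """
    for c in range(3):
        channel = img[..., c]
        n = channel.size
        if (channel >= high).sum() / n > max_fraction:
            return False  # bright spots: brightness saturation
        if (channel <= low).sum() / n > max_fraction:
            return False  # weakly reflecting regions: underexposure
    return True
```

A mid-gray photograph passes this check, while a clipped (all-white or all-black) photograph fails it, triggering the multi-pattern shooting described below.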
If the object X has a strong luster, bright spots are likely to appear in the photograph. When a bright spot is present on the object, the image taken with the default full illumination pattern tends to suffer from color intensity saturation or insufficient brightness, and such a picture is difficult to use for image measurement. In this case, the photographing unit 11 judges that the photograph taken under the default full illumination pattern is unusable, and, to obtain usable images, selects one of the illumination schemes of figs. 4 to 7 and takes photographs while changing the illumination pattern.
When one of the illumination schemes of figs. 4 to 7 is selected, the photographing unit 11 takes one picture under each of the patterns (A) to (H) (or (A) to (D), and so on) in that figure, M pictures in total. That is, M pictures are taken at each focal length, and N × M pictures are taken at the N focal lengths over the whole measurement. Because the M photographs at each focal length use different illumination patterns, the illumination light reaches the object X to be measured from different directions, and any given part of the surface of the object X is strongly lit in some photographs and not in others; hence, for every part of the surface, there exists a photograph free of light saturation and underexposure. In other words, the image measuring system of the present embodiment effectively combines the illumination patterns generated by the illumination pattern generating unit 10 with the photographing unit 11 to solve the problems of bright spots and underexposure in photographs of the object X.
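The N × M acquisition described above is a simple nested loop. In the sketch below, the camera and ring interfaces (`set_focal_length`, `set_pattern`, `capture`) are hypothetical stand-ins; the patent does not name concrete APIs, and the stub classes exist only to make the sketch runnable.

```python
# Illustrative sketch of the N x M capture loop (N focal lengths, M patterns).
# All interface names are hypothetical; the patent specifies no concrete API.

def capture_all(camera, ring, focal_lengths, patterns):
    """Take one photograph per illumination pattern at each focal length."""
    photos = {}
    for n, f in enumerate(focal_lengths):
        camera.set_focal_length(f)   # focus the lens at the n-th depth
        photos[n] = []
        for p in patterns:
            ring.set_pattern(p)      # light the ring with pattern p
            photos[n].append(camera.capture())
    return photos                    # photos[n] holds the M pictures at focal length n

# Minimal stubs standing in for the real hardware, for demonstration only.
class StubCamera:
    def set_focal_length(self, f):
        self.f = f
    def capture(self):
        return ("photo", self.f)

class StubRing:
    def set_pattern(self, p):
        self.p = p
```

Running the loop with three focal lengths and two patterns yields the 3 × 2 photographs that the measurement image generating unit consumes next.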
[ image generating unit for measurement 12]
The measurement image generating unit 12 generates a measurement image for a given focal length using the one or more photographs taken by the photographing unit 11 at that focal length. If only one picture was taken at a focal length n, the measurement image generating unit 12 uses that picture as the measurement image at focal length n. If M photographs were taken at focal length n, then, to prevent image saturation caused by bright spots, pixels whose color intensity values exceed a high threshold are not used from any photograph; similarly, to prevent insufficient color intensity, pixels whose color intensity values fall below a low threshold are not used.
That is, when M photographs are taken at a focal length n, the measurement image generating unit 12 obtains the measurement image by processing the M images as follows. For a given pixel: if none of the M images has a color intensity value above the high threshold or below the low threshold, the average of the M color intensity values is taken as the color intensity value of that pixel in the measurement image at focal length n; if the color intensity value of an RGB channel of that pixel in some image is above the high threshold or below the low threshold, that image's value is not used, and the average over the remaining images is taken as the color intensity value of that pixel in the measurement image at focal length n.
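The per-pixel averaging rule just described can be sketched with NumPy: samples whose RGB values fall outside the [low, high] band are excluded, and the surviving photographs are averaged pixel by pixel. The threshold values and the fallback for pixels rejected in all M photographs are assumptions for illustration.

```python
import numpy as np

def measurement_image(photos, low=5, high=250):
    """Combine M photographs taken at one focal length into one measurement image.

    photos: array of shape (M, H, W, 3). A pixel sample is rejected when any of
    its RGB channels is above `high` (bright-spot saturation) or below `low`
    (underexposure); the surviving samples are averaged per pixel. Threshold
    values are illustrative assumptions.
    """
    photos = np.asarray(photos, dtype=np.float64)
    # valid[m, y, x] is True when photo m's pixel (y, x) is in range in all channels
    valid = np.all((photos >= low) & (photos <= high), axis=-1)
    weights = valid[..., None].astype(np.float64)   # (M, H, W, 1)
    counts = weights.sum(axis=0)                    # (H, W, 1): surviving samples
    summed = (photos * weights).sum(axis=0)         # (H, W, 3)
    # Assumed fallback: if every sample was rejected, use the plain mean.
    fallback = photos.mean(axis=0)
    return np.where(counts > 0, summed / np.maximum(counts, 1), fallback)
```

For a pixel where one of two photographs is saturated, only the unsaturated value survives, exactly as the text prescribes.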
[ full-field image generating Unit 13]
The full-field image generating unit 13 uses the measurement images of the respective focal lengths generated by the measurement image generating unit 12 to generate a full-field image that has a wide depth of field and in which every captured part of the object to be measured is sharply focused. When a microscope photographs an object X whose surface is uneven, that is, whose depth varies, a single focal length can bring only one depth into focus; it cannot focus all depths at once. At a particular focal length, the focused portion appears sharp in the picture and the unfocused portions appear blurred.
The full-field image generating unit 13 processes the N measurement images at the N focal lengths generated by the measurement image generating unit 12. For each pixel, the sharpest of the N measurement images is found, and the RGB value of that pixel in that image is taken as the RGB value of the pixel in the full-field image. Combining all the extracted sharp pixels yields a full-field image in which every pixel is sharp; the depth of field of the full-field image is therefore very large.
Fig. 10 is a schematic diagram of generating a full-field image using measurement images at different focal lengths. As shown in the figure, 5 measurement images are obtained at different focal lengths n (N = 5 in the figure, n = 1, 2, 3, 4, 5); for convenience, they are called the focal length 1 image to the focal length 5 image. In the focal length 1 image of fig. 10, the 4 pixels shown in gray are sharply captured, while the other pixels, shown in white, are out of focus and blurred. When the full-field image generating unit 13 generates the full-field image, the 4 gray pixels of the focal length 1 image are extracted and used; the white pixels are not.

In the focal length 2 image of fig. 10, the 7 pixels shown hatched are sharply captured, while the other pixels, shown in white, are blurred. When the full-field image is generated, the 7 hatched pixels of the focal length 2 image are extracted and used; the white pixels are not. In the same manner, some of the pixels of each of the focal length 3, focal length 4, and focal length 5 images are extracted and used. Finally, the sharp pixels extracted from the 5 measurement images are combined into a full-field image in which every pixel is sharp.
Next, the specific method of extracting sharp pixels from the measurement images is described. In the measurement image at each focal length, different positions on the surface of the object X have different depths; the portions whose depth matches the focal length used when photographing are in focus and sharply captured, while the other portions are blurred because they are out of focus. A sharply captured pixel shows a large luminance gradient relative to its surrounding pixels, whereas a pixel in a blurred, unfocused portion shows a small luminance gradient. Using this property, sharply captured pixels can be extracted from the luminance gradient. The full-field image generating unit 13 first extracts feature points from the N measurement images and normalizes the feature intensity of each image to the range 0% to 100%, obtaining N feature intensity images. Then, for each pixel, the image with the maximum feature intensity value among the N feature intensity images is found; let its focal length be k. Finally, the color intensity value of that pixel in the measurement image at focal length k is used as the color intensity value of the pixel in the full-field image.
Fig. 11 is a schematic diagram of extracting sharp pixels from the measurement image at each focal length. In this diagram N is 5, that is, there are 5 measurement images with different focal lengths. Fig. 11 shows, from top to bottom, the normalized feature intensity distribution in the x direction at one y coordinate (for example, y1) of the measurement images of focal length 1 to focal length 5. As can be seen from fig. 11, the feature value of the focal length 3 image is maximum at the x1 coordinate; that is, the pixel at (x1, y1) of the focal length 3 image is clearly photographed, while the pixels at (x1, y1) of the images at the other focal lengths are less sharp than in the focal length 3 image. Thus, we select the color intensity value of the pixel at (x1, y1) of the focal length 3 image as the color intensity value of the pixel at (x1, y1) of the full-field image.
Likewise, the feature value of the focal length 4 image is maximum at the x2 coordinate; that is, the pixel at (x2, y1) of the focal length 4 image is clearly photographed, while the pixels at (x2, y1) of the images at the other focal lengths are less sharp. Thus, we select the color intensity value of the pixel at (x2, y1) of the focal length 4 image as the color intensity value of the pixel at (x2, y1) of the full-field image. Following the same principle, we select the color intensity value of the pixel at (x3, y1) of the focal length 1 image for the pixel at (x3, y1) of the full-field image, the pixel at (x4, y1) of the focal length 2 image for the pixel at (x4, y1) of the full-field image, and the pixel at (x5, y1) of the focal length 5 image for the pixel at (x5, y1) of the full-field image.
With this processing method, the full-field image generating unit 13 can find, for every pixel, the sharpest of the 5 measurement images taken at different focal lengths. By combining the color intensity information acquired for each pixel, a full-field image in which every pixel is the sharpest available can be obtained. That is, every pixel in the full-field image is equivalently in focus and sharp, which is equivalent to the full-field image having a very wide depth of field.
[ Full-field image correction unit 14]
In the full-field image generated by the full-field image generating unit 13, the color intensity information of adjacent pixels may have been acquired from measurement images at different focal lengths, which can cause discontinuities in the color intensity values of adjacent pixels. In addition, erroneous extractions may occur for various reasons. Both lead to errors in the generated full-field image. To mitigate these generation errors, the color intensity values of the erroneous pixels in the generated full-field image are corrected. In the present embodiment, the full-field image correction unit 14 performs this correction. Specifically, for a pixel with a generation error produced by the full-field image generating unit 13, the color intensity value of the pixel is corrected using the color intensity values of the surrounding pixels, thereby resolving problems such as discontinuous color intensity values, pixel defects, and an unnatural-looking generated image.
Fig. 12 is a schematic diagram of full-field image correction. As shown in fig. 12, the full-field image generated by the full-field image generating unit 13 has, for some reason, a pixel defect at row 3, column 2: the color intensity value there is zero and the image shows a black dot. Since no black dot exists at that point in the image at any focal length, it can be determined that the black dot at that point in the full-field image is a pixel defect. Exploiting the continuity of color intensity variation in an image, the full-field image correction unit 14 corrects the defect by computing a color intensity value from the 8 surrounding pixels, i.e., the upper-left, upper, upper-right, left, right, lower-left, lower, and lower-right pixels, and using it as the color intensity value of the defective pixel.
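A simplified sketch of this defect correction, under the assumptions that a defect is a zero-valued full-field pixel that is non-zero in every measurement image, and that the mean of the 8 neighbours is used as the replacement value (the text only requires that one color intensity value be calculated from them):

```python
import numpy as np

def correct_defects(full, stack):
    """Correct pixel defects ('black dots') in the full-field image.
    A pixel is treated as a defect when it is zero in the full-field
    image but non-zero in every measurement image, as in Fig. 12.
    The replacement is the mean of the 8 neighbours, exploiting the
    continuity of color intensity variation."""
    out = full.astype(float).copy()
    h, w = out.shape
    stack = np.stack(stack)
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            if out[r, c] == 0 and np.all(stack[:, r, c] > 0):
                win = out[r-1:r+2, c-1:c+2]
                out[r, c] = (win.sum() - out[r, c]) / 8.0  # mean of 8 neighbours
    return out
```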
[ Three-dimensional image generating unit 15]
The three-dimensional image generating unit 15 generates a three-dimensional image from which the three-dimensional coordinates of the points on the surface of the object X to be measured can be read. Using the relationship between the focal length value at which each pixel of the full-field image was photographed and the depth coordinate of the corresponding measurement point on the object to be measured, the three-dimensional image generating unit 15 calculates the depth information of the measurement point corresponding to each pixel from that pixel's focal length value, and then calculates the three-dimensional coordinates of the measurement point. The three-dimensional coordinates are calculated using the following formula:
(Formula 1)

X = X0 + kx · x
Y = Y0 + ky · y
Z = Z0 + kf · f
In the above formula, (X, Y, Z) represents the three-dimensional world coordinates of the measurement point on the surface of the object X to be measured, (x, y) is the image coordinates of the measurement point in the full-field image, (X0, Y0, Z0) is the initial value of the three-dimensional world coordinates of the measurement point, f is the focal length value used when the pixel in the full-field image was photographed, and kx, ky, kf are coefficients expressing the correspondence between the focal length of the lens, the image coordinates of the measurement image, and the three-dimensional coordinates of the measurement point.
As can be seen from (Formula 1) above, the X value of the three-dimensional world coordinates of the surface of the object X to be measured calculated by the three-dimensional image generating unit 15 is linear in the x value of the image coordinates of the pixel, the Y value is linear in the y value of the image coordinates, and the depth coordinate Z value is linear in the focal length f at the time of photographing. Thus, the three-dimensional world coordinates (X, Y, Z) of the measurement point on the surface of the object X corresponding to a pixel can be calculated easily from the image coordinates (x, y) of the pixel and the focal length value f at the time of photographing.
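A direct evaluation of (Formula 1); the parameter values passed in any call are illustrative and must come from the calibration performed by the calibration unit 17:

```python
def world_coords(x, y, f, params):
    """Evaluate (Formula 1): each world coordinate is linear in the
    image coordinate (x, y) and the focal length value f at capture
    time.  params holds (X0, Y0, Z0) and the coefficients kx, ky, kf,
    which are obtained by calibration."""
    X0, Y0, Z0, kx, ky, kf = params
    X = X0 + kx * x
    Y = Y0 + ky * y
    Z = Z0 + kf * f
    return X, Y, Z
```

For example, with illustrative parameters (X0, Y0, Z0) = (1, 1, 0), kx = ky = 0.5, kf = 2, the pixel (x, y) = (2, 3) photographed at f = 10 maps to (2.0, 2.5, 20.0).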
[ three-dimensional image correction Unit 16]
The three-dimensional image correction unit 16 corrects abnormal portions in the three-dimensional image generated by the three-dimensional image generating unit 15. As described above, errors such as pixel defects and abnormal color intensity values occur in the process of generating the full-field image, and corresponding errors occur during three-dimensional image generation. To eliminate or reduce them, the three-dimensional image needs to be corrected in a manner similar to the full-field image. The three-dimensional image correction unit detects pixels whose three-dimensional world coordinates deviate abnormally and calculates a new depth coordinate value for each such pixel from the depth coordinate values of the surrounding pixels, thereby correcting the three-dimensional image. For example, a deviating pixel may be corrected using the depth coordinate values of the 8 pixels around it.
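One possible sketch of this depth correction, assuming a deviation threshold and the median of the 8 neighbours as the replacement value (the text leaves the exact calculation open):

```python
import numpy as np

def correct_depth(Z, thresh=3.0):
    """Detect pixels whose depth deviates sharply from their 8
    neighbours and replace them with the neighbourhood median,
    mirroring the full-field image correction.  'thresh' (in depth
    units) is an assumed cutoff for what counts as abnormal."""
    out = Z.astype(float).copy()
    h, w = out.shape
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            win = out[r-1:r+2, c-1:c+2].ravel()
            nbrs = np.delete(win, 4)        # drop the centre pixel
            med = np.median(nbrs)
            if abs(out[r, c] - med) > thresh:
                out[r, c] = med
    return out
```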
[ calibration Unit 17]
The calibration unit 17 calibrates the parameters of the camera device to improve the accuracy of image measurement. The calibration here mainly consists of obtaining kx, ky, kf and (X0, Y0, Z0) in (Formula 1).
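Because each relation in (Formula 1) is linear, these coefficients can be estimated from a calibration target with known world coordinates by a per-axis least-squares fit; the following is a hypothetical sketch, not the specific procedure of the calibration unit 17:

```python
import numpy as np

def calibrate(xs, Xs, ys, Ys, fs, Zs):
    """Estimate the parameters of (Formula 1) from a calibration
    target: xs/ys/fs are image x, image y, and focal length values,
    Xs/Ys/Zs the corresponding known world coordinates.  Each
    relation is linear (e.g. X = X0 + kx*x), so an independent 1-D
    least-squares fit per axis suffices."""
    def fit(u, v):
        # Solve v ~= v0 + k*u in the least-squares sense.
        A = np.column_stack([np.ones(len(u)), np.asarray(u, float)])
        (v0, k), *_ = np.linalg.lstsq(A, np.asarray(v, float), rcond=None)
        return v0, k
    X0, kx = fit(xs, Xs)
    Y0, ky = fit(ys, Ys)
    Z0, kf = fit(fs, Zs)
    return X0, Y0, Z0, kx, ky, kf
```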
[ measurement result output unit 18]
The measurement result output unit 18 outputs the generated full-field image and three-dimensional image in a format such as an image format or a CG format. The measurement result output by the measurement result output unit 18 can be read not only by using the image measurement program of the present invention but also by using general software, application programs, and the like.
[ storage Unit 19]
The storage unit 19 is mainly constituted by the memory of a computer, an SSD (Solid State Drive), an HDD (Hard Disk Drive), or the like. As shown in fig. 3, the storage unit 19 mainly serves the illumination pattern generation unit 10, the photographing unit 11, the measurement image generation unit 12, the full-field image generation unit 13, the full-field image correction unit 14, the three-dimensional image generation unit 15, the three-dimensional image correction unit 16, the calibration unit 17, and the measurement result output unit 18, supporting the control of the illumination device 2 and the camera device 1, the reading of pictures taken by the camera device 1, image processing, and the saving and output of processing results.
Fig. 8 is a flowchart of a measurement performed using the image measurement system of fig. 1. In a measurement, the object X to be measured is first set below the zoom lens 1B of the camera device 1 (S100). The camera device 1 then performs pre-photographing using the photographing unit 11 to obtain information such as the surface reflection characteristics of the object X, and determines whether, at each focal length, only one full-illumination picture needs to be taken or a plurality of pictures need to be taken under different illumination environments. If a plurality of pictures are required, the necessary illumination patterns and the number of shots M are determined next (S101).
Next, the photographing unit 11 adjusts the focal length of the zoom lens in preparation for taking pictures at focal length n (S102). One full-illumination picture, or M pictures under different illumination patterns, is taken at focal length n (S103). The camera device 1 then sends the 1 or M pictures to the computer 3 (S104), and the computer 3 generates the measurement image for focal length n from these pictures (S105).
After that, it is judged whether photographing at all the focal lengths required for the measurement is complete (S106); if not, the focal length of the zoom lens 1B is changed, the focal length value n is updated (S107), and photographing is prepared again (S102). Once photographing at all the required focal lengths is complete, the system proceeds to the full-field image generating unit 13 and generates a full-field image with a wide depth of field from the N measurement images (S108). Problems such as pixel defects may occur when the full-field image is generated; in that case the full-field image must be corrected using the full-field image correction unit 14 (S109).
The computer 3 calculates the depth value of each pixel from the correspondence between the focal length value when each pixel is clearly photographed and the depth value of the measurement point on the surface of the object X to be measured corresponding to the pixel, and further calculates the three-dimensional world coordinates of each pixel, using the three-dimensional image generating unit 15, to generate a three-dimensional image (S110). In addition, as in the case of the correction of the full-field image, the generated three-dimensional image may need to be corrected by the three-dimensional image correction unit 16 (S111). Finally, the measurement result is output in a file format such as an image format or a text format (S112).
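The S100-S112 flow above can be sketched as a driver loop with the hardware-dependent steps injected as callables; all function names here are hypothetical stand-ins, not the program of the present invention:

```python
def run_measurement(set_focal, shoot, make_measurement_image,
                    compose, focal_values, patterns):
    """Driver for the capture part of the Fig. 8 flow.  For each focal
    length n it sets the lens (S102), captures one picture per
    illumination pattern (S103), and synthesizes the measurement image
    (S105); after the loop over all focal lengths (S106-S107) it
    composes the full-field image from the N results (S108)."""
    measurement_images = []
    for f in focal_values:
        set_focal(f)
        shots = [shoot(p) for p in patterns]                 # 1 or M shots
        measurement_images.append(make_measurement_image(shots))
    return compose(measurement_images)
```

With stub callables in place of real hardware, the loop structure can be exercised independently of any camera or illumination device.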
Fig. 9 is a detailed flowchart of a portion for determining the illumination pattern and the number of times of image capturing M in step 101 in fig. 8. First, a default full illumination pattern is generated, and the object X to be measured is illuminated using the pattern (S200). A full illumination picture is taken by the camera device 1 under illumination of this full illumination pattern (S201), and the taken picture is sent to the computer 3 (S202). The picture is processed in the computer 3 to determine whether it is suitable for measurement.
At this time, if the image contains no saturated bright spot and no area with insufficient light, it is regarded as suitable for measurement; the full illumination pattern is set as the default full illumination pattern, and the number of required shots at each focal length is set to one (S205). If the image does contain a bright spot or an area with insufficient light, it is not regarded as suitable for measurement; in that case the illumination pattern must be updated (S204), illumination patterns of different intensities are projected from different directions, pictures are taken under the respective illumination environments, and a measurement image is synthesized from them. These illumination patterns are recorded as the illumination patterns necessary for measurement, and their count is recorded as the number of shots M required at each focal length, i.e., the number of pictures taken at that focal length (S205).
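The suitability test of S203 can be sketched as a saturation and under-exposure check; the 8-bit thresholds below are illustrative assumptions, not values fixed by the text:

```python
import numpy as np

def needs_multiple_patterns(img, hi=250, lo=5):
    """Decide (S203) whether one full-illumination shot suffices.
    The image is unsuitable, and multiple illumination patterns are
    needed, if it contains saturated bright spots (>= hi) or
    under-lit areas (<= lo).  Assumes an 8-bit grayscale image."""
    img = np.asarray(img)
    return bool((img >= hi).any() or (img <= lo).any())
```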
Fig. 13 is an exemplary view of a full-field image and a three-dimensional image generated by the image measuring system according to the present embodiment. The object to be measured is a small section of the threaded portion of the screw shown in fig. 13(a), measuring about 2 mm.
Figs. 13(B) to (D) are measurement images at 3 different focal lengths generated by the measurement image generation unit: (B) is an image with a sharp background, (C) an image with a sharp middle portion, and (D) an image with a sharp foreground. Fig. 13(E) is the full-field image generated by the full-field image generating unit 13 from these three images of different focal lengths; it can be seen that every part of the photographed object is sharp. Fig. 13(F) is one example rendering of the three-dimensional image generated by the three-dimensional image generating unit 15.
As described above, the image measurement system of the present embodiment uses the camera device 1 having the image sensor 1A and the zoom lens 1B, adjusts the focal length of the lens so as to focus at different depth positions on the object X to be measured, and takes, at each focal length, one or more pictures of the object X with a narrow depth-of-field range; it generates a measurement image at each focal length from the one or M pictures taken there; and, using the measurement images at the different focal lengths, it generates a full-field image in which all pixels are sharp and the depth-of-field range is wide. A full-field image with a wide depth of field can thus be generated using only an inexpensive image sensor and zoom lens. For example, by using an inexpensive hand-held microscope as the camera device 1 and inexpensive LED illumination as the illumination device 2, the image measurement system of the present embodiment can be realized as a system that is small, lightweight, and easy to use on site.
[ possibility of Industrial application ]
The image measuring system, the image measuring method, the image measuring program, and the image measuring storage medium according to the present invention can be used in the fields of high-precision surface quality inspection, defect inspection, and the like of electric products, electronic products, ICT equipment, cultural relics, industrial products, and the like, using a small-sized hand-held microscope as a camera device. Examples of practical applications include detection of scratches, dents, uneven coating, etc. on automobile bodies, inspection of micro cracks of automobile tires, precision inspection of local areas on large objects, quality inspection of printed circuit boards, quality inspection of products on production lines, inspection of micro scratches and depths on surfaces of mobile phones, precision inspection of scalps or skins, identification of antiques, famous painting analysis, and the like.

Claims (13)

1. An image measuring system comprises an imaging unit, an image generating unit for measurement, and a full-field image generating unit,
the camera unit uses camera equipment comprising an image sensor and a zoom lens capable of changing focal length by using an electric signal, adjusts the focal length of the lens for a measured object, focuses the focal points of the lens on different places of the measured object respectively, and shoots one or more pictures of the measured object with narrow depth of field at each focal length;
the measurement image generation unit generates measurement images for each focal length using one or more photographs taken by the photographing unit at each focal length;
the full-field image generating unit generates a full-field image in which all parts of the object to be measured are sharply focused, the full-field image having a wide depth of field, based on the measurement images of the respective focal lengths generated by the measurement image generating unit.
2. The image measurement system of claim 1, wherein:
the camera device is a microscope.
3. The image measurement system according to claim 1 or 2, wherein:
the image measuring system includes an illumination device for providing illumination to the object to be measured, and an illumination pattern generating unit for generating a continuous or discontinuous annular illumination pattern required for providing illumination to the object to be measured using the illumination device.
4. The image measurement system of claim 3, wherein:
the photographing unit changes the focal length of the zoom lens N times, and at each focal length, sets the illumination device to a full bright mode, takes one picture using the full bright pattern, or takes M pictures using different illumination patterns, taking N × M pictures in total.
5. The image measurement system according to any one of claims 1 to 4, characterized in that:
if the photographing unit takes only one picture at a certain focal length, the measurement image generation unit uses that picture as the measurement image for that focal length; if the photographing unit takes M pictures at a certain focal length, it analyzes the color intensity values of each pixel across the M pictures, discards, for each pixel, the pictures whose color intensity value is greater than a high-range threshold or less than a low-range threshold, averages the color intensity values of that pixel over the remaining pictures, and uses the average value as the color intensity value of that pixel in the measurement image for that focal length.
6. The image measurement system according to any one of claims 1 to 5, characterized in that:
the image measuring system includes a full-field image correction unit that corrects a color intensity value of a pixel containing a generation error in the full-field image generated by the full-field image generation unit.
7. The image measurement system according to claim 6, wherein:
the full-field image correction means corrects the color intensity value of the pixel including the generation error by using the color intensity value of the pixel around the pixel.
8. The image measurement system according to any one of claims 1 to 7, characterized in that:
the image measuring system comprises a three-dimensional image generating unit which can give out the three-dimensional coordinates of the surface of the measured object and generate the three-dimensional image of the surface of the measured object.
9. The image measurement system according to claim 8, characterized in that:
the three-dimensional image generating unit calculates, based on the correspondence between the focal length used when a pixel is sharply photographed and the depth distance of the measured point corresponding to that pixel, the depth distance value of the measured point corresponding to each pixel from the focal length value at which that pixel of the full-field image was photographed, and thereby generates a three-dimensional image of the surface shape of the object to be measured.
10. The image measuring system according to claim 8 or claim 9, wherein:
the image measuring system includes a three-dimensional image correction unit for correcting a portion of the three-dimensional coordinate abnormality in the three-dimensional image generated by the three-dimensional image generation unit.
11. The image measurement system of claim 10, wherein:
the three-dimensional image correction unit may eliminate a pixel having a three-dimensional coordinate variation, and correct the three-dimensional coordinate of the pixel having the three-dimensional coordinate variation by using the three-dimensional coordinates of other pixels around the pixel.
12. An image measurement method comprising:
using camera equipment comprising an image sensor and a zoom lens capable of changing focal length by using an electrical signal, adjusting the focal length of the lens for a measured object, respectively focusing the focal points of the lens at different places of the measured object, and taking one or more pictures of the measured object with narrow depth of field at each focal length;
generating, by a computer, images for measurement of the respective focal lengths using one or more photographs taken by a photographing unit at the respective focal lengths;
a computer generates a full-field image having a wide depth of field and in which all the parts of the object to be measured are focused sharply, using the measurement images of the respective focal lengths generated by the measurement image generating means.
13. A storage medium accessible by a computer, the storage medium storing an image measurement program that realizes the functions of the image measurement system according to the preceding claims.
CN202110510139.0A 2020-05-11 2021-05-11 Image measurement system, image measurement method, image measurement program, and image measurement storage medium Pending CN113640980A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020083377A JP6991600B2 (en) 2020-05-11 2020-05-11 Image measurement system, image measurement method, image measurement program and recording medium
JP2020-083377 2020-05-11

Publications (1)

Publication Number Publication Date
CN113640980A true CN113640980A (en) 2021-11-12

Family

ID=78415872

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110510139.0A Pending CN113640980A (en) 2020-05-11 2021-05-11 Image measurement system, image measurement method, image measurement program, and image measurement storage medium

Country Status (2)

Country Link
JP (1) JP6991600B2 (en)
CN (1) CN113640980A (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3961729B2 (en) 1999-03-03 2007-08-22 株式会社デンソー All-focus imaging device
JP4261743B2 (en) 1999-07-09 2009-04-30 株式会社日立製作所 Charged particle beam equipment
EP1624672A1 (en) 2004-08-07 2006-02-08 STMicroelectronics Limited A method of determining a measure of edge strength and focus
JP6487156B2 (en) 2014-06-23 2019-03-20 株式会社キーエンス Magnification observation apparatus, magnification image observation method, magnification image observation program, and computer-readable recording medium
JP6742783B2 (en) 2016-04-01 2020-08-19 株式会社ミツトヨ Imaging system and imaging method
JP6635893B2 (en) * 2016-07-22 2020-01-29 株式会社キーエンス Magnifying observation device

Also Published As

Publication number Publication date
JP6991600B2 (en) 2022-01-12
JP2021180364A (en) 2021-11-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination