CN114119555A - Large-diameter element edge detection method based on object distance focusing method

Info

Publication number
CN114119555A
Authority
CN
China
Prior art keywords
edge
image
value
camera
images
Prior art date
Legal status
Pending
Application number
CN202111428157.0A
Other languages
Chinese (zh)
Inventor
赵林杰
陈明君
尹朝阳
程健
袁晓东
郑万国
廖威
王海军
张传超
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date
2021-11-29
Filing date
2021-11-29
Publication date
2022-03-01
Application filed by Harbin Institute of Technology
Priority to CN202111428157.0A
Publication of CN114119555A

Classifications

    • G06T7/0004 - Image analysis; industrial image inspection
    • G06T7/13 - Segmentation; edge detection
    • G06T7/136 - Segmentation; edge detection involving thresholding
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10056 - Image acquisition modality: microscopic image
    • G06T2207/30108 - Subject of image: industrial image inspection
    • G06T2207/30164 - Subject of image: workpiece; machine component

Abstract

An edge detection method for a large-diameter element based on an object distance focusing method relates to the technical field of engineering optics and is used for solving the problem that, in the prior art, a globally sharp focusing position cannot be obtained before an image is acquired. The technical points of the invention comprise: moving a plurality of edges of the element in turn into the field of view of the camera, changing the object distance, and acquiring a plurality of images of each edge at different focal planes; automatically bringing each edge into sharp focus according to the variance variation curves of the plurality of images corresponding to that edge; after focusing is completed, acquiring a plurality of images containing each edge and processing them to obtain the positions of the plurality of edges. An edge auto-focusing strategy is designed to perform automatic focusing according to the variance variation curve of the image, so that the acquired edge images are sharper and the edge coordinate positions of the element can be obtained more accurately. The method is easy to automate and can be used for edge detection of large-caliber elements.

Description

Large-diameter element edge detection method based on object distance focusing method
Technical Field
The invention relates to the technical field of engineering optics, in particular to a large-caliber element edge detection method based on an object distance focusing method.
Background
The construction of high-power solid-state laser devices requires a large number of optical elements, which are prone to surface damage in a high-power laser environment; such damage weakens the material properties and further accelerates the damage process. Research has shown that if damage is not repaired or suppressed in time, the damage sizes on the front and rear surfaces of an element under laser irradiation will grow linearly and exponentially, respectively. Besides aggravating the damage of the element itself, the damage also degrades the quality of the light beam transmitted through the element and thus the quality of the focal spot, and the intensified regions generated by light-field modulation can damage downstream elements - a vicious circle. In order to prolong the service life of elements and reduce the maintenance cost of the device, a CO2 laser repair method is often adopted in engineering to repair surface damage of fused-quartz elements: the damage cracks are healed by the thermal effect of the laser so as to raise the damage threshold of the element.
Laser repair of a large-caliber optical element is carried out on a detection and repair machine tool; for repair, the element must be taken down from its frame and mounted on the machine tool. Because the clamping accuracy is limited, the position of the mounted element surface is uncertain, and the pose of the element needs to be determined anew. Calculating the coordinates of the geometric center of the element from its edge positions is an important link in the automatic determination of the element surface position. In order to obtain the accurate edge position automatically, a sharp image of the element edge must be acquired and the edge extracted by image processing. To improve the detection accuracy of the edge position, a microscopic camera is used for image acquisition. Because the alignment accuracy during installation is limited, the focusing position differs from one mounting to the next, so automatic focusing is required.
There are two types of auto-focusing methods: active and passive. Active focusing measures the distance between the lens and the object, and when the measured distance exceeds the depth-of-field range, a motion mechanism is controlled to adjust the focusing position and obtain a sharp image. The ranging accuracy of this method is limited, while the depth of field of a microscopic imaging system is very small, so ranging cannot meet the requirements of microscopic detection. Therefore, in engineering a passive method is generally adopted for automatic focusing of microscopic images: images at different focusing positions are evaluated for sharpness, and the position with the highest sharpness is taken as the focusing position. However, the depth of field of the microscopic camera is far smaller than the height of the element chamfer along the depth-of-field direction, and the element edge can only be imaged sharply within the depth-of-field range near the focal plane. As a result, the microscopic images at different focusing positions are only locally sharp, and a sharp focusing position for the edge cannot be obtained through global sharpness evaluation.
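For context, the conventional passive approach computes a single global sharpness score per focal position and selects the maximum. The short sketch below (Python with NumPy; the function names and the image-stack interface are illustrative assumptions, not part of the invention) shows this baseline; because the chamfered edge is sharp only within a narrow band of the frame, a score computed over the whole image is dominated by the rest of the scene, which is exactly the limitation described above.

import numpy as np

def global_variance_score(image):
    # Global sharpness measure: gray variance over the entire frame.
    img = image.astype(np.float64)
    return float(np.mean((img - img.mean()) ** 2))

def best_focus_index(image_stack):
    # image_stack: sequence of 2-D gray images taken at successive Z positions.
    scores = [global_variance_score(img) for img in image_stack]
    return int(np.argmax(scores))  # index of the globally sharpest frame

The object distance focusing method of the invention instead evaluates the variance locally, strip by strip across the image, as described below.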
Disclosure of Invention
In view of the above problems, the present invention provides a method for detecting an edge of a large-aperture element based on an object distance focusing method, so as to solve the problem that a globally clear focusing position cannot be obtained before an image is acquired in the prior art.
An object distance focusing method-based edge detection method for a large-caliber element comprises the following steps:
step 1: move each of a plurality of edges of the element in turn into the field of view of the camera, change the object distance, and acquire a plurality of images of each edge at different focal planes; automatically bring each edge into sharp focus according to the variance variation curves of the plurality of images corresponding to that edge;
step 2: after focusing is completed, acquire a plurality of images containing each edge, and process the plurality of images to obtain the positions of the plurality of edges.
Further, the process of acquiring a plurality of images corresponding to each edge under different focal planes in the first step includes: and setting two searching steps, enabling the camera to move close to the edge of the element along the positive direction or the negative direction of the Z axis of the machine tool coordinate system according to the searching steps, and acquiring and obtaining a plurality of images under different focal planes.
Further, the element has a regular rectangular cross section, and the plurality of edges of the element include a left edge, a right edge, an upper edge, and a lower edge.
Further, the process of performing automatic sharp focusing on each edge according to the variance variation curves of the plurality of images corresponding to each edge in step 1 includes:
step 1.1, dividing each acquired image into a plurality of sub-region images, the sub-region images having consistent focusing states;
step 1.2, calculating the gray variance value of each sub-region image;
step 1.3, plotting the gray variance values of the sub-region images to obtain the horizontal-direction variance variation curve corresponding to each image;
step 1.4, performing automatic focusing according to an edge auto-focusing strategy based on the horizontal-direction variance variation curve.
Further, in step 1.2, the gray variance value of each sub-region image x is calculated according to the following formula:

T(x) = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} [I(i,j) - μ]²

where M × N is the size of the sub-region image in pixels; I(i,j) represents the gray value of pixel (i,j); and μ represents the mean gray value of the sub-region image.
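As a minimal illustration of steps 1.1 to 1.3, the sketch below (Python with NumPy; the function name and the default strip width are assumptions for illustration - the embodiment described later uses 10-pixel-wide sub-regions) divides an image into vertical strips, computes the gray variance of each strip, and returns the horizontal-direction variance variation curve:

import numpy as np

def horizontal_variance_curve(image, strip_width=10):
    # Step 1.1: divide the image into vertical sub-region strips of equal width.
    img = image.astype(np.float64)
    height, width = img.shape
    n_strips = width // strip_width
    curve = np.empty(n_strips)
    for x in range(n_strips):
        strip = img[:, x * strip_width:(x + 1) * strip_width]
        mu = strip.mean()                      # mean gray value of the sub-region
        curve[x] = np.mean((strip - mu) ** 2)  # step 1.2: gray variance T(x)
    return curve                               # step 1.3: horizontal-direction variance curve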
Further, the specific steps of step 1.4 include:
step 1.4.1, first the camera is moved with search step s1 along the positive or negative Z-axis direction of the machine tool coordinate system to approach the element edge, images are acquired, and the horizontal-direction variance variation curve is obtained according to steps 1.1 to 1.3; for an image acquired at a given focal plane, the corresponding horizontal-direction variance variation curve contains a fluctuation starting point O and one or more peaks, where the gray-variance peak corresponding to the camera focusing on the element chamfer area is denoted peak I, and the gray-variance peak corresponding to the camera focusing on the boundary between the element chamfer area and the element surface area is denoted peak II:
a1) when the value of peak II is greater than a preset threshold t1, judge whether the difference between the position of peak I in the variance variation curve and the position of the fluctuation starting point O is smaller than a preset threshold t2;
a11) if it is smaller, calculate the slope between the fluctuation starting point O and peak I: if the slope is greater than a preset threshold t3, save the slope and record it as k_OI; if the slope is smaller than the preset threshold t3, move the camera with search step s1 along the negative Z-axis direction of the machine tool coordinate system toward the element edge, acquire an image, and recalculate the slope between the fluctuation starting point O and peak I until the slope is greater than the preset threshold t3; then execute step 1.4.2;
a12) if it is not smaller, move the camera with search step s1 along the positive Z-axis direction of the machine tool coordinate system toward the element edge, acquire an image, and again judge whether the difference between the position of peak I in the variance variation curve and the position of the fluctuation starting point O is smaller than the preset threshold t2; if smaller, execute a11); if not, repeat a12);
a2) when the value of peak II is not greater than the preset threshold t1, compare the value of the current peak II with the value of peak II obtained from the image acquired after the previous camera movement: if the value of the current peak II is larger, move the camera with search step s1 toward the element edge in the same direction as the previous movement; otherwise, move it toward the element edge in the direction opposite to the previous movement; acquire images, obtain the horizontal-direction variance variation curve according to steps 1.1 to 1.3, and repeat a1);
step 1.4.2, then the camera is moved with search step s2 along the positive or negative Z-axis direction of the machine tool coordinate system to approach the element edge, images are acquired, and the horizontal-direction variance variation curve is obtained according to steps 1.1 to 1.3: the slope between the fluctuation starting point O and peak I is calculated, and it is judged whether the current slope is greater than the slope k_OI saved in a11); if greater, move the camera with search step s2 along the positive Z-axis direction of the machine tool coordinate system toward the element edge until the slope k_OI no longer increases; if not, move the camera with search step s2 along the negative Z-axis direction of the machine tool coordinate system toward the element edge until the slope k_OI no longer increases;
automatic focusing according to the edge auto-focusing strategy is completed through steps 1.4.1 to 1.4.2, where search step s1 is greater than search step s2.
Further, in step 1.4, the slope between the fluctuation starting point O and peak I is calculated as:

k_OI = (T1 - T0) / (n1 - n0)

where (n0, T0) and (n1, T1) respectively represent the coordinates of the fluctuation starting point O and of peak I on the horizontal-direction variance curve.
Further, in step 2 a plurality of images containing each edge are acquired, and the specific steps of processing the plurality of images include: for each image, first convolve the image with the Sobel operator to obtain the gradient image of the image in the horizontal direction; then preset a first fixed threshold and binarize the gradient image to obtain a binarized image; then preset a second fixed threshold, count the number of pixels with pixel value 255 in each row of the binarized image, and when the count exceeds the preset second fixed threshold, determine that row as the pixel row where the edge is located, i.e., the edge line; finally, calculate the distance between the edge line and the image center line, and calculate the element edge position from this distance.
Further, the plurality of edge positions obtained in step 2 include:
the X-axis coordinate X'_L of the midpoint of the left edge in the machine tool coordinate system:

X'_L = X_L + k_pixel · ΔX_L

where X_L is the X-axis coordinate, calibrated in advance, when the midpoint of the left edge is moved to the center of the camera field of view; k_pixel is the actual size represented by a single pixel in the calibrated image; and ΔX_L is the pixel distance between the midpoint of the left edge and the image center line;
the X-axis coordinate X'_R of the midpoint of the right edge in the machine tool coordinate system:

X'_R = X_R + k_pixel · ΔX_R

where X_R is the X-axis coordinate, calibrated in advance, when the midpoint of the right edge is moved to the center of the camera field of view; and ΔX_R is the pixel distance between the midpoint of the right edge and the image center line;
the Y-axis coordinate Y'_T of the midpoint of the upper edge in the machine tool coordinate system:

Y'_T = Y_T + k_pixel · ΔY_T

where Y_T is the Y-axis coordinate, calibrated in advance, when the midpoint of the upper edge is moved to the center of the camera field of view; and ΔY_T is the pixel distance between the midpoint of the upper edge and the image center line;
the Y-axis coordinate Y'_D of the midpoint of the lower edge in the machine tool coordinate system:

Y'_D = Y_D + k_pixel · ΔY_D

where Y_D is the Y-axis coordinate, calibrated in advance, when the midpoint of the lower edge is moved to the center of the camera field of view; and ΔY_D is the pixel distance between the midpoint of the lower edge and the image center line.
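The conversion from pixel offset to machine tool coordinate is the same for all four edges, as in the following sketch (Python; the calibration numbers in the usage line are placeholders, not values from the invention):

def edge_machine_coordinate(calibrated_axis_coord, k_pixel, pixel_offset):
    # calibrated_axis_coord: axis reading when the edge midpoint sat at the field-of-view center
    # k_pixel: actual size represented by one pixel (e.g. mm/pixel)
    # pixel_offset: signed pixel distance between the detected edge line and the image center line
    return calibrated_axis_coord + k_pixel * pixel_offset

# Usage with placeholder values (mm and pixels):
x_left = edge_machine_coordinate(calibrated_axis_coord=-55.0, k_pixel=0.00063, pixel_offset=600)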
The beneficial technical effects of the invention are as follows:
the invention adopts an object distance focusing method based on an image variance change curve to realize the automatic focusing of the element edge and obtain a clear image of the edge; the accurate position of the element edge is obtained by processing the edge clear image; the method realizes high-precision detection of the edge of the large-diameter element, is easy to realize automation, and can be used for the automatic determination process of the position and posture of the element.
Drawings
The present invention may be better understood by reference to the following description taken in conjunction with the accompanying drawings, which are incorporated in and form a part of this specification, and which are used to further illustrate preferred embodiments of the present invention and to explain the principles and advantages of the present invention.
FIG. 1 is a schematic structural diagram of an edge detection apparatus for a large-diameter device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating dividing of edge image sub-regions according to an embodiment of the present invention;
FIG. 3 is a graph of variance of images at different focal plane positions in an embodiment of the invention;
FIG. 4 is a flow chart of edge autofocus in an embodiment of the present invention;
FIG. 5 is a diagram illustrating the result of automatic focusing at the edge of an element according to an embodiment of the present invention; wherein, panel (a) is the left edge; figure (b) is the right edge; FIG. (c) is the upper edge; FIG. (d) is the lower edge;
FIG. 6 is a diagram illustrating the edge detection result of the device according to the embodiment of the present invention; wherein, the image (a) is the image collected after the edge focusing is clear; the image (b) is a gradient binary image after image processing; fig. c is a graph showing the result of edge detection.
Detailed Description
In order that those skilled in the art will better understand the disclosure, exemplary embodiments or examples of the disclosure are described below with reference to the accompanying drawings. It is obvious that the described embodiments or examples are only some, but not all embodiments or examples of the invention. All other embodiments or examples obtained by a person of ordinary skill in the art based on the embodiments or examples of the present invention without any creative effort shall fall within the protection scope of the present invention.
The invention provides an object distance focusing method based on an image gray variance variation curve, which realizes automatic focusing of the edge position of an element, and obtains the edge contour position by processing an edge clear image, thereby realizing high-precision automatic detection of the edge position of the element.
The embodiment of the invention provides a method for detecting the edge of a large-caliber element based on an object distance focusing method, which comprises the following steps. Step 1: move each of a plurality of edges of the element in turn into the field of view of the camera, change the object distance, and acquire a plurality of images of each edge at different focal planes; automatically bring each edge into sharp focus according to the variance variation curves of the plurality of images corresponding to that edge. Step 2: after focusing is completed, acquire a plurality of images containing each edge, and process the plurality of images to obtain the positions of the plurality of edges.
In this embodiment, optionally, the process of acquiring and obtaining a plurality of images corresponding to each edge under different focal planes in the first step includes: and setting two searching steps, enabling the camera to move close to the edge of the element along the positive direction or the negative direction of the Z axis of the machine tool coordinate system according to the searching steps, and acquiring and obtaining a plurality of images under different focal planes.
In this embodiment, optionally, the cross section of the element is a regular rectangle, and the plurality of edges of the element include a left edge, a right edge, an upper edge, and a lower edge.
In this embodiment, optionally, the process of performing automatic sharp focusing on each edge according to the variance variation curves of the plurality of images corresponding to each edge in step 1 includes:
step 1.1, dividing each acquired image into a plurality of sub-region images, the sub-region images having consistent focusing states;
step 1.2, calculating the gray variance value of each sub-region image;
step 1.3, plotting the gray variance values of the sub-region images to obtain the horizontal-direction variance variation curve corresponding to each image;
step 1.4, performing automatic focusing according to an edge auto-focusing strategy based on the horizontal-direction variance variation curve.
In this embodiment, optionally, in step 1.2 the gray variance value of each sub-region image x is calculated according to the following formula:

T(x) = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} [I(i,j) - μ]²

where M × N is the size of the sub-region image in pixels; I(i,j) represents the gray value of pixel (i,j); and μ represents the mean gray value of the sub-region image.
In this embodiment, optionally, the specific steps of step 1.4 include:
step 1.4.1, first the camera is moved with search step s1 along the positive or negative Z-axis direction of the machine tool coordinate system to approach the element edge, images are acquired, and the horizontal-direction variance variation curve is obtained according to steps 1.1 to 1.3; for an image acquired at a given focal plane, the corresponding horizontal-direction variance variation curve contains a fluctuation starting point O and one or more peaks, where the gray-variance peak corresponding to the camera focusing on the element chamfer area is denoted peak I, and the gray-variance peak corresponding to the camera focusing on the boundary between the element chamfer area and the element surface area is denoted peak II:
a1) when the value of peak II is greater than a preset threshold t1, judge whether the difference between the position of peak I in the variance variation curve and the position of the fluctuation starting point O is smaller than a preset threshold t2;
a11) if it is smaller, calculate the slope between the fluctuation starting point O and peak I: if the slope is greater than a preset threshold t3, save the slope and record it as k_OI; if the slope is smaller than the preset threshold t3, move the camera with search step s1 along the negative Z-axis direction of the machine tool coordinate system toward the element edge, acquire an image, and recalculate the slope between the fluctuation starting point O and peak I until the slope is greater than the preset threshold t3; then execute step 1.4.2;
a12) if it is not smaller, move the camera with search step s1 along the positive Z-axis direction of the machine tool coordinate system toward the element edge, acquire an image, and again judge whether the difference between the position of peak I in the variance variation curve and the position of the fluctuation starting point O is smaller than the preset threshold t2; if smaller, execute a11); if not, repeat a12);
a2) when the value of peak II is not greater than the preset threshold t1, compare the value of the current peak II with the value of peak II obtained from the image acquired after the previous camera movement: if the value of the current peak II is larger, move the camera with search step s1 toward the element edge in the same direction as the previous movement; otherwise, move it toward the element edge in the direction opposite to the previous movement; acquire images, obtain the horizontal-direction variance variation curve according to steps 1.1 to 1.3, and repeat a1);
step 1.4.2, then the camera is moved with search step s2 along the positive or negative Z-axis direction of the machine tool coordinate system to approach the element edge, images are acquired, and the horizontal-direction variance variation curve is obtained according to steps 1.1 to 1.3: the slope between the fluctuation starting point O and peak I is calculated, and it is judged whether the current slope is greater than the slope k_OI saved in a11); if greater, move the camera with search step s2 along the positive Z-axis direction of the machine tool coordinate system toward the element edge until the slope k_OI no longer increases; if not, move the camera with search step s2 along the negative Z-axis direction of the machine tool coordinate system toward the element edge until the slope k_OI no longer increases;
automatic focusing according to the edge auto-focusing strategy is completed through steps 1.4.1 to 1.4.2, where search step s1 is greater than search step s2.
In this embodiment, optionally, in step 1.4 the slope between the fluctuation starting point O and peak I is calculated as:

k_OI = (T1 - T0) / (n1 - n0)

where (n0, T0) and (n1, T1) respectively represent the coordinates of the fluctuation starting point O and of peak I on the horizontal-direction variance curve.
In this embodiment, optionally, a plurality of images containing each edge are acquired in step 2, and the specific steps of processing the plurality of images include: for each image, first convolve the image with the Sobel operator to obtain the gradient image of the image in the horizontal direction; then preset a first fixed threshold and binarize the gradient image to obtain a binarized image; then preset a second fixed threshold, count the number of pixels with pixel value 255 in each row of the binarized image, and when the count exceeds the preset second fixed threshold, determine that row as the pixel row where the edge is located, i.e., the edge line; finally, calculate the distance between the edge line and the image center line, and calculate the element edge position from this distance.
In this embodiment, optionally, the plurality of edge positions obtained in step 2 include:
the X-axis coordinate X'_L of the midpoint of the left edge in the machine tool coordinate system:

X'_L = X_L + k_pixel · ΔX_L

where X_L is the X-axis coordinate, calibrated in advance, when the midpoint of the left edge is moved to the center of the camera field of view; k_pixel is the actual size represented by a single pixel in the calibrated image; and ΔX_L is the pixel distance between the midpoint of the left edge and the image center line;
the X-axis coordinate X'_R of the midpoint of the right edge in the machine tool coordinate system:

X'_R = X_R + k_pixel · ΔX_R

where X_R is the X-axis coordinate, calibrated in advance, when the midpoint of the right edge is moved to the center of the camera field of view; and ΔX_R is the pixel distance between the midpoint of the right edge and the image center line;
the Y-axis coordinate Y'_T of the midpoint of the upper edge in the machine tool coordinate system:

Y'_T = Y_T + k_pixel · ΔY_T

where Y_T is the Y-axis coordinate, calibrated in advance, when the midpoint of the upper edge is moved to the center of the camera field of view; and ΔY_T is the pixel distance between the midpoint of the upper edge and the image center line;
the Y-axis coordinate Y'_D of the midpoint of the lower edge in the machine tool coordinate system:

Y'_D = Y_D + k_pixel · ΔY_D

where Y_D is the Y-axis coordinate, calibrated in advance, when the midpoint of the lower edge is moved to the center of the camera field of view; and ΔY_D is the pixel distance between the midpoint of the lower edge and the image center line.
Another embodiment of the present invention provides a method for detecting the edge of a large-caliber element. As shown in FIG. 1, the detection device consists of a motion platform and a microscopic detection system. The motion platform comprises X, Y and Z motion axes whose motion directions coincide with the X, Y and Z coordinate axes of the machine tool coordinate system; the platform carries the large-caliber optical element to realize motion along the X and Y axes, and carries the microscopic detection system to realize motion along the Z axis. The microscopic detection system consists of an area-array CCD camera, a zoom microscope lens, a coaxial light source and a ring light source; the resolution of the area-array CCD camera is 2456 × 2056, the detection range is 1.5 mm × 1.3 mm, and the detection accuracy is 0.63 μm/pixel. During detection, the four edges of the element are moved one by one into the microscopic detection field of view according to pre-calibrated coordinates, the object distance is changed by moving the microscopic detection system along the Z axis for automatic focusing, and finally the acquired edge images are processed to obtain the edge positions. The specific steps are as follows:
step 1, carrying out initialization operation on a repair platform to complete optical element installation;
according to the embodiment of the invention, the positioning precision of the motion platform is +/-10 μm, the motion platform comprises X, Y, Z three motion axes, the platform can realize two-dimensional high-precision movement of the optical element by controlling X, Y axes, and the object distance adjustment of the microscopic detection system is realized by controlling the Z axis.
Step 2, controlling the motion platform to move along an X, Y, Z axis according to a pre-calibrated coordinate, moving the edge of the optical element to a microscopic view range, and automatically focusing a plurality of edges of the element according to a variance change curve of an image; wherein the plurality of edges comprises a left edge, a right edge, an upper edge, and a lower edge;
according to the embodiment of the invention, because the elements deflect in the installation process, the micro camera is not focused at the edge position, and automatic focusing is required to improve the edge detection precision. The camera detection range of the microscopic detection system is far larger than the component installation error, so that the edge to be detected can be ensured to enter a microscopic field of view through the position coordinates calibrated in advance. The illumination mode that the microscopic detection system adopted is coaxial light source and annular light source stack illumination, and coaxial light source can promote component surface brightness and be convenient for the differentiation of component and background, and the annular light source can reflect the slight characteristic in chamfer region and be convenient for the focus of camera to the edge.
The adopted method is an object distance focusing method based on the image gray variance variation curve. The gray variance variation curve of an image in the horizontal or vertical direction shows different characteristics at different focusing positions. Using these characteristics, the focal plane is adjusted by controlling the Z-axis motion of the platform to change the object distance, thereby realizing automatic focusing on the edge. Taking the left edge as an example, the image containing the left edge acquired by the microscopic camera is divided into 245 sub-regions as shown in FIG. 2; each sub-region is 10 pixels × 2056 pixels, much smaller than the depth of field of the microscopic camera, so the image within a sub-region can be considered to have a consistent focusing state. The gray variance value of each sub-region is calculated by formulas (1) and (2), and the variance variation curve of the image in the horizontal direction is plotted:

T(x) = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} [I(i,j) - μ]²          (1)

μ = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} I(i,j)          (2)

where x = 1, 2, …, 245 indexes the sub-regions; M × N is the sub-region size in pixels (here 10 × 2056); I(i,j) represents the gray value of pixel (i,j); and μ represents the mean gray value of the small rectangular area (sub-region).
FIG. 3 shows the horizontal-direction variance curves of images acquired by the camera at different focal planes, in which the red line marks the focal plane position, the abscissa is the horizontal position in the image, and the ordinate is the gray variance. The gray values of the chamfer and of the element surface differ considerably, so a large peak (peak II) appears on the variance curve at the boundary between the two regions; this peak can be used to judge whether the focal plane has entered the vicinity of the imaged edge region. As shown in FIG. 3(a) and (g), when the focal plane is far from the edge, the variance is small everywhere except at peak II; as shown in FIG. 3(a)-(b) and (g)-(f), as the focal plane gradually approaches the edge, peak II gradually increases and the variance variation in the chamfer region becomes more pronounced. Comparing FIG. 3(f) to (c), as the focal plane gradually moves along the Z axis toward the best focusing position, the position of peak I gradually moves toward the element edge; comparing FIG. 3(b) to (c), as the focal plane gradually approaches the best focusing position along the negative Z-axis direction, peak I in the curve gradually increases, reaching its maximum when the focal plane reaches the best focusing position.
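One plausible way to read these features off a measured curve is sketched below (Python with NumPy). The heuristics are assumptions made for illustration - the fluctuation starting point O is taken as the first point rising above a noise floor, peak II as the global maximum, and peak I as the largest value between O and peak II - and are not the exact definitions used by the invention:

import numpy as np

def curve_features(curve, noise_floor):
    # Returns (n0, n1, n2, T0, T1, T2, k_OI) for a horizontal-direction variance curve.
    curve = np.asarray(curve, dtype=float)
    n2 = int(np.argmax(curve))                       # peak II: chamfer/surface boundary
    above = np.nonzero(curve > noise_floor)[0]
    n0 = int(above[0]) if above.size else 0          # fluctuation starting point O
    if n2 > n0 + 1:
        n1 = n0 + 1 + int(np.argmax(curve[n0 + 1:n2]))   # peak I: chamfer region
    else:
        n1 = n2
    T0, T1, T2 = float(curve[n0]), float(curve[n1]), float(curve[n2])
    k_OI = (T1 - T0) / (n1 - n0) if n1 != n0 else 0.0    # slope between O and peak I
    return n0, n1, n2, T0, T1, T2, k_OI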
Based on the above features, the present invention designs the edge auto-focusing strategy shown in FIG. 4, where (n0, T0), (n1, T1) and (n2, T2) respectively denote the coordinates of the fluctuation starting point O, of curve peak I and of curve peak II on the variance variation curve, and t1, t2, t3 are preset thresholds. t1 relates to the value of peak II when the focal plane of the microscopic camera enters the camera field of view; it is set so that the focusing position comes close to the edge and provides sufficient information for subsequent focusing. t2 relates to the width of the chamfer region, and t3 relates to the peak slope value k_OI when the camera is focused on the edge position; both values serve to bring the focal plane still closer to the edge. The preset thresholds are set according to empirical values from repeated experiments. s1 and s2 are the search steps for movement along the Z axis: s1 takes a larger step to increase the search speed and avoid getting trapped at a local extreme point, while s2 takes a smaller step to improve the search accuracy. The strategy is divided into four parts:
moving a focal plane to the vicinity of an edge to ensure that a gray variance variation curve can provide enough information for automatic focusing, and the method specifically comprises the following steps:
A) the camera collects edge images and calculates the change curve of the gray variance of the images, and the value T of the peak value II is obtained according to the curve2-nowIf the value is greater than the threshold t1Completing the process I, otherwise entering the step B);
B) if T2-now>T2-preIf s is equal to s1Otherwise, let s be-s1. Controlling Z-axis moving step length s to make T2-pre=T2-nowThe camera collects the edge image and calculates the image gray variance curve, and the value T of the peak value II is obtained according to the curve2-now. If T2-now>t1If yes, finishing the process I, otherwise, repeating the step B);
Part II: when the focal plane gradually approaches the edge along the positive Z-axis direction, the focal plane can be moved further toward the edge by judging the position of peak I. The specific steps are: the camera acquires an edge image and the image gray variance variation curve is calculated, and the position n1 of peak I is obtained from the curve. If n1 ≤ n0 + t2, this part is completed; otherwise the platform is controlled to move by s1 in the positive direction until n1 ≤ n0 + t2 is satisfied.
Part III: when the focal plane gradually approaches the edge along the negative Z-axis direction, the focal plane can be moved further toward the edge by judging the slope between point O and peak I. The specific steps are: the camera acquires an edge image and the image gray variance variation curve is calculated, and the slope between point O and peak I is obtained from the curve:

k_OI = (T1 - T0) / (n1 - n0)

If k_OI ≥ t3, this part is completed; otherwise the platform is controlled to move along the negative Z-axis direction by s1 until k_OI ≥ t3 is satisfied.
Part IV: find the best focusing position by judging the change of the slope between point O and peak I; when this slope reaches its maximum value, the camera is considered to be focused on the edge of the optical element. The specific steps are:
A) record the slope between point O and peak I at the end of Part III as k_OI-pre and go to step B);
B) control the Z axis to move by step s2, acquire an edge image, calculate the image gray variance variation curve, and obtain the slope k_OI-now between point O and peak I from the curve. If k_OI-now > k_OI-pre, let s = s2; otherwise let s = -s2. Let k_OI-pre = k_OI-now and go to step C);
C) control the Z axis to move by step s, control the camera to acquire an edge image, calculate the image gray variance variation curve, and obtain the slope k_OI-now between point O and peak I from the curve. If T2-now > t1 or k_OI-now ≤ k_OI-pre, auto-focusing is completed; otherwise let k_OI-pre = k_OI-now and repeat step C).
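The following sketch (Python) condenses the four parts into one routine. The stage and camera interfaces (move_z, grab_image) and all numeric parameters are assumptions standing in for the machine tool interface and the empirically tuned values s1, s2, t1, t2, t3; the sketch reuses horizontal_variance_curve() and curve_features() from the earlier sketches and omits the safeguards shown in FIG. 4:

def autofocus_edge(grab_image, move_z, s1, s2, t1, t2, t3, noise_floor):
    # grab_image() returns the current gray edge image; move_z(step) moves the Z axis by 'step'.
    def measure():
        return curve_features(horizontal_variance_curve(grab_image()), noise_floor)

    # Part I: bring peak II above t1 (focal plane near the edge region).
    n0, n1, n2, T0, T1, T2, k = measure()
    T2_pre = float("-inf")
    while T2 <= t1:
        s = s1 if T2 > T2_pre else -s1
        T2_pre = T2
        move_z(s)
        n0, n1, n2, T0, T1, T2, k = measure()

    # Part II: move in +Z until peak I lies within t2 of the fluctuation start O.
    while n1 > n0 + t2:
        move_z(s1)
        n0, n1, n2, T0, T1, T2, k = measure()

    # Part III: move in -Z until the slope k_OI reaches t3.
    while k < t3:
        move_z(-s1)
        n0, n1, n2, T0, T1, T2, k = measure()

    # Part IV: fine search with step s2 for the maximum slope k_OI.
    k_pre = k
    move_z(s2)
    n0, n1, n2, T0, T1, T2, k = measure()
    s = s2 if k > k_pre else -s2
    k_pre = k
    while True:
        move_z(s)
        n0, n1, n2, T0, T1, T2, k = measure()
        if k <= k_pre:
            break  # slope no longer increases: best focusing position reached
        k_pre = k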
Step 3: after focusing is completed, the camera is controlled to acquire microscopic images with a sharp edge, and the images are processed to obtain the edge positions.
According to the embodiment of the invention, the position of the edge in the image is obtained with an edge extraction algorithm based on the image gradient: the gradient describes the abrupt gray-level change at the edge position, the edge contour is obtained by binarizing the gradient image, and the machine tool coordinates of the edge are then calculated from the contour position to complete the edge detection of the large-caliber element.
After a sharp image of the edge is obtained, the distance between the edge and the center of the microscopic field of view needs to be calculated. Still taking the left edge as an example, since the edge is an approximately vertical straight line in the image and the gray value changes abruptly in the direction perpendicular to the edge, the left edge is detected with the Sobel operator in the horizontal direction according to the following formulas:
G_x = Sobel_x * I          (3)

            [ -1  0  +1 ]
Sobel_x =   [ -2  0  +2 ]          (4)
            [ -1  0  +1 ]

where Sobel_x denotes the horizontal Sobel operator; I denotes the original image; and G_x denotes the gradient image in the horizontal direction, obtained by the planar convolution of Sobel_x with the original image I.
For subsequent processing, the gradient image G_x is binarized with the fixed threshold T shown in formula (5) to obtain image I_B; a morphological opening operation is then applied to I_B to eliminate noise points caused by gray-value fluctuation, giving the final binarization result image.

I_B(i,j) = 255 if G_x(i,j) ≥ T, and I_B(i,j) = 0 otherwise          (5)
The number of pixels with value 255 is counted for each column of the binarized image from left to right; the first column whose count exceeds a set threshold is taken as the pixel column where the edge is located. In practice the threshold is half the number of pixels in a column, i.e. 1028. The distance ΔX_L between the edge and the center of the microscopic field of view is then calculated, and the left edge position X'_L of the element is obtained with formula (6):

X'_L = X_L + k_pixel · ΔX_L          (6)

where X_L is the pre-calibrated machine tool coordinate of the left edge when moved to the center of the field of view, calibrated by manually moving the left edge to the center of the field of view of the microscopic camera and reading the grating scale of the machine tool; and k_pixel is the actual size represented by a single pixel in the image, which can be calibrated with a standard calibration plate.
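A condensed sketch of this left-edge extraction (Python with OpenCV and NumPy; the gradient threshold and the helper name are illustrative assumptions, and the column threshold follows the half-column-height rule above):

import cv2
import numpy as np

def left_edge_pixel_offset(gray_image, grad_threshold=50.0):
    # Formulas (3)-(4): horizontal Sobel gradient of the original image.
    gx = np.abs(cv2.Sobel(gray_image, cv2.CV_64F, 1, 0, ksize=3))

    # Formula (5): fixed-threshold binarization, then morphological opening to remove noise.
    binary = np.where(gx >= grad_threshold, 255, 0).astype(np.uint8)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    # Column counting: the first column (left to right) whose count of 255-pixels exceeds
    # half the column height (1028 for a 2056-row image) is taken as the edge column.
    counts = (binary == 255).sum(axis=0)
    edge_cols = np.nonzero(counts > gray_image.shape[0] // 2)[0]
    if edge_cols.size == 0:
        return None                                   # no edge line found in this image
    # Signed pixel distance between the edge column and the image center line;
    # multiplying by k_pixel and adding X_L gives X'_L as in formula (6).
    return float(edge_cols[0]) - gray_image.shape[1] / 2.0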
The position coordinates of the upper edge, the lower edge, the left edge and the right edge of the element can be obtained by adopting the process, so that the edge detection of the large-caliber element is completed.
Another embodiment of the present invention provides an example analysis of the method for detecting the edge of a large-diameter element based on the object distance focusing method. A batch of elements with an aperture of 430 mm × 430 mm was detected with the method. The automatic detection of the element edges was realized with the independently developed "large-caliber element surface defect automatic detection and repair control software"; the specific process is as follows:
(1) and carrying out zero returning and error compensation on the moving platform and moving the moving platform to a mounting station to complete the mounting of the component.
(2) And sequentially moving the four edges of the element to a microscopic view range, and automatically focusing the edges. Fig. 5 is an image before and after focusing of four edges of the element saved in the automatic edge seeking process, and an image of an area near the edges is cut out for comparison. As can be seen from the figure, the object distance focusing method based on the image variance variation curve can achieve the acquisition of clear-edge images.
(3) And processing the acquired edge-sharp image. Fig. 6(a) shows a clear image of the left edge captured by the camera, and the horizontal gradient map is calculated and subjected to binarization processing, and the processing result is shown in fig. 6 (b). The number of pixels with a pixel value of 255 per column is counted, the column which reaches the threshold value first is regarded as the column where the edge is located, and the red lines in fig. 6(b) and (c) are the detected edge lines. The same method is adopted to obtain the edge lines of other edges, the distance deviating from the center of the microscopic field of view is calculated, and the result is as follows:
ΔX_L = 0.38 mm, ΔX_R = 0.36 mm, ΔY_T = 0.27 mm, ΔY_D = 0.27 mm
(4) acquiring the machine tool coordinate when the camera collects the edge, and superposing the deviation value on the corresponding machine tool coordinate to obtain the accurate position of the element edge as follows:
X'_L = -55.595 mm, X'_R = 373.474 mm, Y'_T = 215.426 mm, Y'_D = -213.623 mm
according to the invention, the high-precision detection of the edge of the large-caliber element is realized through the process, and accurate position reference is provided for the determination of the subsequent element pose.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (9)

1. An object distance focusing method-based edge detection method for a large-caliber element is characterized by comprising the following steps:
step 1: moving each of a plurality of edges of the element in turn into the field of view of the camera, changing the object distance, and acquiring a plurality of images of each edge at different focal planes; automatically bringing each edge into sharp focus according to the variance variation curves of the plurality of images corresponding to that edge;
step 2: after focusing is completed, acquiring a plurality of images containing each edge, and processing the plurality of images to obtain the positions of the plurality of edges.
2. The method for detecting the edge of the large-aperture element based on the object distance focusing method according to claim 1, wherein the step one of acquiring a plurality of images corresponding to each edge under different focal planes comprises: and setting two searching steps, enabling the camera to move close to the edge of the element along the positive direction or the negative direction of the Z axis of the machine tool coordinate system according to the searching steps, and acquiring and obtaining a plurality of images under different focal planes.
3. The method for detecting the edge of the large-caliber element based on the object distance focusing method as claimed in claim 2, wherein the cross section of the element is a regular rectangle, and the plurality of edges of the element comprise a left edge, a right edge, an upper edge and a lower edge.
4. The method for detecting the edge of the large-caliber element based on the object distance focusing method according to claim 3, wherein the process of automatically focusing each edge according to the variance variation curves of the plurality of images corresponding to each edge in step 1 comprises:
step 1.1, dividing each acquired image into a plurality of sub-region images, the sub-region images having consistent focusing states;
step 1.2, calculating the gray variance value of each sub-region image;
step 1.3, plotting the gray variance values of the sub-region images to obtain the horizontal-direction variance variation curve corresponding to each image;
step 1.4, performing automatic focusing according to an edge auto-focusing strategy based on the horizontal-direction variance variation curve.
5. The method for detecting the edge of the large-aperture element based on the object distance focusing method according to claim 4, wherein the gray variance value of each sub-region image x is calculated in step 1.2 according to the following formula:

T(x) = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} [I(i,j) - μ]²

where M × N is the size of the sub-region image in pixels; I(i,j) represents the gray value of pixel (i,j); and μ represents the mean gray value of the sub-region image.
6. The method for detecting the edge of the large-caliber element based on the object distance focusing method according to claim 5, wherein the specific steps of step 1.4 comprise:
step 1.4.1, first moving the camera with search step s1 along the positive or negative Z-axis direction of the machine tool coordinate system to approach the element edge, acquiring images, and obtaining the horizontal-direction variance variation curve according to steps 1.1 to 1.3; for an image acquired at a given focal plane, the corresponding horizontal-direction variance variation curve contains a fluctuation starting point O and one or more peaks, where the gray-variance peak corresponding to the camera focusing on the element chamfer area is denoted peak I, and the gray-variance peak corresponding to the camera focusing on the boundary between the element chamfer area and the element surface area is denoted peak II:
a1) when the value of peak II is greater than a preset threshold t1, judging whether the difference between the position of peak I in the variance variation curve and the position of the fluctuation starting point O is smaller than a preset threshold t2;
a11) if it is smaller, calculating the slope between the fluctuation starting point O and peak I: if the slope is greater than a preset threshold t3, saving the slope and recording it as k_OI; if the slope is smaller than the preset threshold t3, moving the camera with search step s1 along the negative Z-axis direction of the machine tool coordinate system toward the element edge, acquiring an image, and recalculating the slope between the fluctuation starting point O and peak I until the slope is greater than the preset threshold t3; then executing step 1.4.2;
a12) if it is not smaller, moving the camera with search step s1 along the positive Z-axis direction of the machine tool coordinate system toward the element edge, acquiring an image, and again judging whether the difference between the position of peak I in the variance variation curve and the position of the fluctuation starting point O is smaller than the preset threshold t2; if smaller, executing a11); if not, repeating a12);
a2) when the value of peak II is not greater than the preset threshold t1, comparing the value of the current peak II with the value of peak II obtained from the image acquired after the previous camera movement: if the value of the current peak II is larger, moving the camera with search step s1 toward the element edge in the same direction as the previous movement; otherwise, moving it toward the element edge in the direction opposite to the previous movement; acquiring images, obtaining the horizontal-direction variance variation curve according to steps 1.1 to 1.3, and repeating a1);
step 1.4.2, then moving the camera with search step s2 along the positive or negative Z-axis direction of the machine tool coordinate system to approach the element edge, acquiring images, and obtaining the horizontal-direction variance variation curve according to steps 1.1 to 1.3: calculating the slope between the fluctuation starting point O and peak I, and judging whether the current slope is greater than the slope k_OI saved in a11); if greater, moving the camera with search step s2 along the positive Z-axis direction of the machine tool coordinate system toward the element edge until the slope k_OI no longer increases; if not, moving the camera with search step s2 along the negative Z-axis direction of the machine tool coordinate system toward the element edge until the slope k_OI no longer increases;
automatic focusing according to the edge auto-focusing strategy being completed through steps 1.4.1 to 1.4.2, wherein search step s1 is greater than search step s2.
7. The method for detecting the edge of the large-diameter element based on the object distance focusing method according to claim 6, wherein the slope between the fluctuation starting point O and peak I in step 1.4 is calculated as:

k_OI = (T1 - T0) / (n1 - n0)

where (n0, T0) and (n1, T1) respectively represent the coordinates of the fluctuation starting point O and of peak I on the horizontal-direction variance curve.
8. The method for detecting the edge of the large-caliber element based on the object distance focusing method according to claim 7, wherein a plurality of images containing each edge are acquired in step 2, and the specific steps of processing the plurality of images comprise: for each image, first convolving the image with the Sobel operator to obtain the gradient image of the image in the horizontal direction; then presetting a first fixed threshold and binarizing the gradient image to obtain a binarized image; then presetting a second fixed threshold, counting the number of pixels with pixel value 255 in each row of the binarized image, and when the count exceeds the preset second fixed threshold, determining that row as the pixel row where the edge is located, i.e., the edge line; and finally calculating the distance between the edge line and the image center line, and calculating the element edge position from this distance.
9. The method for detecting the edge of the large-aperture element based on the object distance focusing method according to claim 8, wherein the plurality of edge positions obtained in step 2 include:
the X-axis coordinate X'_L of the midpoint of the left edge in the machine tool coordinate system:

X'_L = X_L + k_pixel · ΔX_L

where X_L is the X-axis coordinate, calibrated in advance, when the midpoint of the left edge is moved to the center of the camera field of view; k_pixel is the actual size represented by a single pixel in the calibrated image; and ΔX_L is the pixel distance between the midpoint of the left edge and the image center line;
the X-axis coordinate X'_R of the midpoint of the right edge in the machine tool coordinate system:

X'_R = X_R + k_pixel · ΔX_R

where X_R is the X-axis coordinate, calibrated in advance, when the midpoint of the right edge is moved to the center of the camera field of view; and ΔX_R is the pixel distance between the midpoint of the right edge and the image center line;
the Y-axis coordinate Y'_T of the midpoint of the upper edge in the machine tool coordinate system:

Y'_T = Y_T + k_pixel · ΔY_T

where Y_T is the Y-axis coordinate, calibrated in advance, when the midpoint of the upper edge is moved to the center of the camera field of view; and ΔY_T is the pixel distance between the midpoint of the upper edge and the image center line;
the Y-axis coordinate Y'_D of the midpoint of the lower edge in the machine tool coordinate system:

Y'_D = Y_D + k_pixel · ΔY_D

where Y_D is the Y-axis coordinate, calibrated in advance, when the midpoint of the lower edge is moved to the center of the camera field of view; and ΔY_D is the pixel distance between the midpoint of the lower edge and the image center line.
Priority Applications

CN202111428157.0A, filed 2021-11-29, priority date 2021-11-29 - Large-diameter element edge detection method based on object distance focusing method (Pending)

Publications

CN114119555A, published 2022-03-01

Family ID: 80370699


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination