CN113516709B - Flange positioning method based on binocular vision - Google Patents
Flange positioning method based on binocular vision
- Publication number: CN113516709B (application CN202110776691.4A)
- Authority: CN (China)
- Prior art keywords: flange, target flange, image, target, center
- Legal status: Active
Classifications
- G06T7/70 — Image analysis: determining position or orientation of objects or cameras
- G06T7/60 — Image analysis: analysis of geometric attributes
- G06T7/80 — Image analysis: analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration)
- G06T2207/10004 — Image acquisition modality: still image; photographic image
Abstract
The invention relates to the technical field of visual positioning and discloses a flange positioning method based on binocular vision, comprising the following steps: a flange to be positioned is photographed by a binocular camera to obtain a left image and a right image; a target flange is identified in the left image, the coordinates of its center and the pixel length of its inner diameter in the left image are calculated, and the approximate depth of field of the target flange is derived; from this approximate depth of field, the approximate coordinates of the flange center in the right image are determined, the target flange is identified in the right image, and its accurate center coordinates in the right image are found; finally, the position of the target flange in the binocular camera coordinate system is calculated from the center coordinates in the left and right images, the distance between the left and right cameras, and the focal length of the binocular camera, yielding the flange positioning result. The invention solves the flange positioning problem in the coarse positioning of an intelligent oil delivery arm with high accuracy.
Description
Technical Field
The invention relates to the technical field of visual positioning, in particular to a flange positioning method based on binocular vision.
Background
Docking of a traditional marine oil delivery arm is performed entirely by manual remote control, which not only incurs substantial labor costs but is also inefficient. There is therefore a need for an intelligent oil delivery arm capable of automatic docking. The design of the intelligent arm is divided into coarse positioning and fine positioning; in the coarse positioning stage, a binocular vision sensor (binocular camera) measures and calculates the position of the target flange. The coarse positioning accuracy achievable with binocular vision guidance meets the design requirements.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a flange positioning method based on binocular vision, which solves the problem of flange positioning in coarse positioning of an intelligent oil delivery arm.
As a first aspect of the present invention, there is provided a flange positioning method based on binocular vision, comprising:
respectively shooting a flange to be positioned through a left camera and a right camera in the binocular camera to obtain a left image and a right image;
identifying a target flange from the left image, and respectively calculating the coordinates of the center of the target flange in the left image and the pixel length of the inner diameter of the target flange in the left image;
calculating the approximate depth of field of the target flange according to the pixel length of the inner diameter of the target flange in the left image;
determining the approximate coordinates of the circle center of the target flange in the right image according to the approximate depth of field of the target flange and the distance between the left camera and the right camera;
identifying the target flange in the right image according to the approximate coordinates of the center of the target flange in the right image, and determining the accurate coordinates of the center of the target flange in the right image;
and calculating the position of the target flange under the binocular camera coordinate system according to the coordinate of the center of the target flange in the left image, the accurate coordinate of the center of the target flange in the right image, the distance between the left camera and the right camera and the focal length of the binocular camera, so as to obtain a flange positioning result.
Further, before shooting the flange to be positioned by the binocular camera, calibrating the binocular camera includes:
determining the lens parameters of the binocular camera according to the distance range from the flange to be positioned to the base of the intelligent arm on site;
selecting a binocular camera with suitable lens parameters and a suitable distance between its two cameras according to the precision requirement of coarse positioning;
designing a corresponding fixture to fix the selected binocular camera;
and calibrating the fixed binocular camera.
Further, in the calculating the approximate depth of field of the target flange according to the pixel length of the inner diameter of the target flange in the left image, the method further comprises:
the approximate depth of field depth of the target flange is calculated by using the monocular ranging principle; with the quantities listed below, the formula is:

depth = (target flange inner diameter × left camera focal length) ÷ (pixel length of the inner diameter in the left image × pixel size of the left camera)

the inner diameter size of the target flange, the pixel size of the left camera and the focal length of the left camera in this depth calculation formula are obtained in advance.
Further, determining the approximate coordinates of the center of the circle of the target flange in the right image according to the approximate depth of field of the target flange and the distance between the left camera and the right camera, and further comprising:
The coordinates of the center of the target flange in the left image are denoted (xl, yl), and its approximate coordinates in the right image (xr, yr); for a rectified binocular pair, the ordinate of the same flange center is identical in the left and right images, i.e. yl = yr, and only the abscissa differs. The approximate abscissa of the flange center in the right image is estimated from the depth of field depth measured via the left image and the distance between the cameras:

xr = xl − (binocular baseline × left camera focal length) ÷ depth

where xl is the abscissa of the center of the target flange in the left image, the binocular baseline is the distance between the left and right cameras, and depth is the approximate depth of field of the target flange.
Further, in the step of identifying the target flange in the right image according to the approximate coordinates of the center of the target flange in the right image, and determining the precise coordinates of the center of the target flange in the right image, the method further comprises:
taking the approximate coordinates (xr, yr) of the center of the target flange in the right image as a midpoint, intercepting a region in a certain range in the right image, identifying the target flange in the region, and updating the approximate coordinates (xr, yr) of the center of the target flange in the identification process to obtain the accurate coordinates of the center of the target flange in the right image.
Further, in the calculating the position of the target flange under the binocular camera coordinate system according to the coordinate of the center of the target flange in the left image, the accurate coordinate of the center of the target flange in the right image, the distance between the left camera and the right camera, and the focal length of the binocular camera, to obtain a flange positioning result, the method further includes:
taking the camera coordinate system of the left camera as the world coordinate system, assume the center of the target flange is a point P in space with world coordinates (x, y, z); the projection of the flange center P on the left image is P_L(xl, yl) and on the right image P_R(xr, yr); O_W O_L and O_RC O_R are the optical axes of the left and right cameras, and the distance O_W O_RC between them is the baseline b. By the principle of similar triangles,

x ÷ xl = y ÷ yl = z ÷ f (left camera) and (x − b) ÷ xr = z ÷ f (right camera)

The calculation formula of the center (x, y, z) of the target flange obtained after simplification is:

z = b × f ÷ d, x = xl × z ÷ f, y = yl × z ÷ f

wherein d = xl − xr is the binocular disparity of the point P, b is the distance between the left and right cameras, and f is the focal length of the binocular camera; therefore, substituting the coordinates of the flange center in the left image and its accurate coordinates in the right image into these formulas yields the spatial three-dimensional coordinates of the flange center, i.e. the position of the target flange in the binocular camera coordinate system.
Further, the step of identifying the target flange from the left image or the right image includes:
step S1: acquiring a left image and a right image;
step S2: processing the left image or the right image to obtain a processed flange image;
step S3: performing image binarization processing on the processed flange image to obtain a binarized image;
step S4: performing morphological operation on the binarized image to obtain a flange image after morphological operation;
step S5: screening all contours in the flange image after morphological operation to obtain a plurality of target flange contours;
step S6: respectively carrying out ellipse fitting on a plurality of target flange profiles to obtain a plurality of fitted target flange ellipse profiles;
step S7: circularly calculating the total roundness value s of the plurality of fitted target flange elliptical profiles, a total roundness minimum s_min being set in advance and updated according to the result of each cyclic calculation;
step S8: judging whether the number of calculations i of the total roundness value s is greater than or equal to the total number of calculations M; while i is smaller than M, incrementing i by 1 and returning to step S3 to continue the cyclic calculation of s, continually updating the minimum s_min, until i is greater than or equal to M; once i ≥ M, selecting the minimum total roundness value s corresponding to the last-updated s_min, obtaining the corresponding fitted target flange elliptical profile, and executing step S9;
step S9: displaying the fitted target flange elliptical profile corresponding to the minimum total roundness value s on the left image and the right image to obtain a target flange identification result.
Further, in the step S6, the method further includes:
and respectively carrying out ellipse fitting on a plurality of target flange profiles by adopting a least square method to obtain a plurality of fitted target flange ellipse profiles.
Further, the step S7 further includes:
respectively calculating the roundness value of each fitted target flange elliptical contour, and calculating the total roundness value s of the plurality of fitted target flange elliptical contours according to the roundness value of each fitted target flange elliptical contour;
calculating the roundness value of each fitted target flange elliptical contour by the formula C = b/a, where a represents the semi-major axis of the ellipse and b the semi-minor axis; the calculation formula of the total roundness value s of all fitted target flange elliptical contours is:

s = ((1 − C_1) + (1 − C_2) + … + (1 − C_n)) ÷ n

wherein C_1, C_2, …, C_n respectively represent the roundness values of the fitted target flange elliptical contours, and n represents the number of determined target flange elliptical contours; under normal conditions the total roundness value s is not larger than 1, so the initial value of the total roundness minimum s_min is set to 1; the currently calculated total roundness value s is compared with s_min, and if s_min > s, then s is assigned to s_min to update the total roundness minimum s_min.
Further, in the step S8, the method further includes:
when the total circle isWhen the calculated times i of the degree value S is smaller than the calculated times M, the calculated times i is increased by 1, the step S3 is returned to by the calculated times i and 1, the total roundness value S of all the fitted target flange elliptical profiles is calculated again, and if the calculated total roundness value S is smaller than the updated total roundness minimum value S in the last calculation min The calculated total roundness value s is continuously assigned to the minimum value s of the total roundness min To continue to update the total roundness minimum s min The updated total roundness minimum s will be continued min And in the next calculation of the total roundness value s, the calculated number i is equal to or greater than the total calculated number M.
The flange positioning method based on binocular vision provided by the invention has the following advantages: it solves the flange positioning problem in the coarse positioning of the intelligent oil delivery arm, has good universality and achieves high accuracy.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification, illustrate the invention and together with the description serve to explain, without limitation, the invention.
Fig. 1 is a flow chart of a flange positioning method based on binocular vision.
Fig. 2 is a schematic diagram of the binocular positioning principle provided by the present invention.
FIG. 3 is a flowchart of a target flange identification method provided by the invention.
Fig. 4 is a schematic diagram of a left image or a right image provided by the present invention.
Fig. 5 is a schematic diagram of a target flange recognition result provided by the present invention.
Detailed Description
In order to further describe the technical means adopted by the invention and the effects achieved, the specific implementation, structure, characteristics and effects of the flange positioning method based on binocular vision according to the invention are described in detail below with reference to the accompanying drawings and preferred embodiments. It is apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
In this embodiment, a flange positioning method based on binocular vision is provided, as shown in fig. 1, where the flange positioning method based on binocular vision includes:
respectively shooting a flange to be positioned through a left camera and a right camera in the binocular camera to obtain a left image and a right image;
identifying a target flange from the left image, and respectively calculating the coordinates of the center of the target flange in the left image and the pixel length of the inner diameter of the target flange in the left image;
calculating the approximate depth of field of the target flange according to the pixel length of the inner diameter of the target flange in the left image;
determining the approximate coordinates of the circle center of the target flange in the right image according to the approximate depth of field of the target flange and the distance between the left camera and the right camera;
identifying the target flange in the right image according to the approximate coordinates of the center of the target flange in the right image, and determining the accurate coordinates of the center of the target flange in the right image;
and calculating the position of the target flange under the binocular camera coordinate system according to the coordinate of the center of the target flange in the left image, the accurate coordinate of the center of the target flange in the right image, the distance between the left camera and the right camera and the focal length of the binocular camera, so as to obtain a flange positioning result.
Preferably, before shooting the flange to be positioned by the binocular camera, calibrating the binocular camera includes:
determining the lens parameters of the binocular camera according to the distance range from the flange to be positioned to the base of the intelligent arm on site;
selecting a binocular camera with suitable lens parameters and a suitable distance between its two cameras according to the precision requirement of coarse positioning;
designing a corresponding fixture to fix the selected binocular camera;
calibrating the fixed binocular camera, and positioning the flange by using the calibrated binocular camera.
Preferably, in the calculating the approximate depth of field of the target flange according to the pixel length of the inner diameter of the target flange in the left image, the method further comprises:
the approximate depth of field depth of the target flange is calculated by using the monocular ranging principle; with the quantities listed below, the formula is:

depth = (target flange inner diameter × left camera focal length) ÷ (pixel length of the inner diameter in the left image × pixel size of the left camera)

the inner diameter size of the target flange, the pixel size of the left camera and the focal length of the left camera in this depth calculation formula are obtained in advance.
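This monocular ranging step can be sketched as follows; the function name and the numeric values are illustrative assumptions (flange inner diameter and focal length in millimetres), not figures from the patent:

```python
def approx_depth(inner_diameter, focal_length, pixel_length, pixel_size):
    """Monocular ranging: depth = (D * f) / (w * p).

    inner_diameter -- known flange inner diameter D (e.g. mm)
    focal_length   -- left camera focal length f (same unit as D)
    pixel_length   -- measured inner-diameter length w in the left image (pixels)
    pixel_size     -- physical size p of one sensor pixel (same unit as f)
    """
    return inner_diameter * focal_length / (pixel_length * pixel_size)

# e.g. a 400 mm flange seen as 200 px with an 8 mm lens and 5 µm pixels
depth = approx_depth(400.0, 8.0, 200.0, 0.005)  # -> 3200.0 mm
```

Note that `pixel_length * pixel_size` converts the measured image length back to a physical length on the sensor, so the units cancel correctly.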
Preferably, determining the approximate coordinates of the center of the circle of the target flange in the right image according to the approximate depth of field of the target flange and the distance between the left camera and the right camera further comprises:
The coordinates of the center of the target flange in the left image are denoted (xl, yl), and its approximate coordinates in the right image (xr, yr); for a rectified binocular pair, the ordinate of the same flange center is identical in the left and right images, i.e. yl = yr, and only the abscissa differs. The approximate abscissa of the flange center in the right image is estimated from the depth of field depth measured via the left image and the distance between the cameras:

xr = xl − (binocular baseline × left camera focal length) ÷ depth

where xl is the abscissa of the center of the target flange in the left image, the binocular baseline is the distance between the left and right cameras, and depth is the approximate depth of field of the target flange.

Since the three-dimensional coordinate accuracy of a monocular measurement is insufficient, only the approximate position of the target flange in the right image is calculated from its position in the left image; this position information is the center (xr, yr), and since the ordinates are almost identical, yr = yl.
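This estimate can be sketched as follows, assuming the focal length is expressed in pixels so that b × f ÷ depth is a disparity in pixels; all names and numbers are illustrative:

```python
def estimate_right_abscissa(xl, baseline, focal_px, depth):
    """Estimate xr in the right image via xr = xl - b * f / depth.

    Assumes focal_px is the focal length in pixels and that baseline and
    depth share one length unit, so b * f / depth is a pixel disparity.
    """
    return xl - baseline * focal_px / depth

# illustrative numbers: 0.3 m baseline, 1000 px focal length, 3 m depth
xr = estimate_right_abscissa(800.0, 0.3, 1000.0, 3.0)  # -> 700.0
```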
Preferably, the method further includes, in the approximate coordinates of the center of the circle of the target flange in the right image, identifying the target flange in the right image, and determining the precise coordinates of the center of the circle of the target flange in the right image:
taking the approximate coordinates (xr, yr) of the center of the target flange in the right image as a midpoint, intercepting a region in a certain range in the right image, identifying the target flange in the region, and updating the approximate coordinates (xr, yr) of the center of the target flange in the identification process to obtain the accurate coordinates of the center of the target flange in the right image.
In the flange identification process, the ellipse fitting algorithm yields the flange center coordinates and the pixel length of the flange inner diameter, so the coordinates (xr, yr) of the center of the target flange can be updated.
Preferably, in the calculating the position of the target flange under the binocular camera coordinate system according to the coordinate of the center of the target flange in the left image, the accurate coordinate of the center of the target flange in the right image, the distance between the left camera and the right camera, and the focal length of the binocular camera, to obtain a flange positioning result, the method further includes:
According to the binocular positioning principle, the position of the target flange in the camera coordinate system can be calculated, as shown in fig. 2: taking the camera coordinate system of the left camera as the world coordinate system, assume the center of the target flange is a point P in space with world coordinates (x, y, z); the projection of the flange center P on the left image is P_L(xl, yl) and on the right image P_R(xr, yr); O_W O_L and O_RC O_R are the optical axes of the left and right cameras, and the distance O_W O_RC between them is the baseline b. By the principle of similar triangles,

x ÷ xl = y ÷ yl = z ÷ f (left camera) and (x − b) ÷ xr = z ÷ f (right camera)

The calculation formula of the center (x, y, z) of the target flange obtained after simplification is:

z = b × f ÷ d, x = xl × z ÷ f, y = yl × z ÷ f

wherein d = xl − xr is the binocular disparity of the point P, b is the distance between the left and right cameras, and f is the focal length of the binocular camera; therefore, substituting the coordinates of the flange center in the left image and its accurate coordinates in the right image into these formulas yields the spatial three-dimensional coordinates of the flange center, i.e. the position of the target flange in the binocular camera coordinate system.
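The triangulation step can be sketched as follows; it assumes rectified images, pixel coordinates measured from the principal point, and a focal length expressed in pixels (the function name and numbers are illustrative):

```python
def triangulate_center(xl, yl, xr, baseline, focal_px):
    """Recover (x, y, z) of the flange center from its left/right projections."""
    d = xl - xr                     # binocular disparity of point P
    z = baseline * focal_px / d     # z = b * f / d
    return (xl * z / focal_px, yl * z / focal_px, z)

# with a 0.3 m baseline and 1000 px focal length, a 100 px disparity gives z = 3 m
p = triangulate_center(800.0, 600.0, 700.0, 0.3, 1000.0)  # -> (2.4, 1.8, 3.0)
```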
Preferably, as shown in fig. 3, the step of identifying the target flange from the left image or the right image includes:
step S1: acquiring a left image and a right image as shown in fig. 4;
step S2: processing the left image or the right image to obtain a processed flange image;
step S3: performing image binarization processing on the processed flange image to obtain a binarized image;
step S4: performing morphological operation on the binarized image to obtain a flange image after morphological operation;
step S5: screening all contours in the flange image after morphological operation to obtain a plurality of target flange contours;
step S6: respectively carrying out ellipse fitting on a plurality of target flange profiles to obtain a plurality of fitted target flange ellipse profiles;
step S7: circularly calculating the total roundness value s of the plurality of fitted target flange elliptical profiles, a total roundness minimum s_min being set in advance and updated according to the result of each cyclic calculation;
step S8: judging whether the number of calculations i of the total roundness value s is greater than or equal to the total number of calculations M; while i is smaller than M, incrementing i by 1 and returning to step S3 to continue the cyclic calculation of s, continually updating the minimum s_min, until i is greater than or equal to M; once i ≥ M, selecting the minimum total roundness value s corresponding to the last-updated s_min, obtaining the corresponding fitted target flange elliptical profile, and executing step S9;
step S9: displaying the fitted target flange elliptical profile corresponding to the minimum total roundness value s on the left image and the right image to obtain a target flange identification result, such as the black-filled part shown in fig. 5.
Preferably, in the step S2, further includes:
performing median filtering treatment on the left image or the right image to obtain a filtered flange image;
and converting the filtered flange image into a gray image.
It should be noted that unavoidable interference information is introduced in the process of acquiring the left or right image; it degrades the quality of the image considerably and may strongly affect the result of subsequent image processing, which is why the image is filtered first.
Preferably, in said steps S3 and S4, further comprises:
performing image binarization processing on the gray level image to obtain a binarized image;
and sequentially performing a closing operation and an opening operation on the binarized image to obtain the flange image after morphological operation.
It should be noted that some tiny holes and some burrs may exist in the binarized image, the binarized image is subjected to a closing operation to fill the tiny holes, and then is subjected to an opening operation to remove burrs on the edge, so that a flange image after morphological operation is obtained, and the flange can be better identified.
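The close-then-open cleanup can be sketched with SciPy's binary morphology; the 3×3 square structuring element is an assumption, as the patent does not specify one:

```python
import numpy as np
from scipy.ndimage import binary_closing, binary_opening

def clean_mask(mask, size=3):
    """Closing fills tiny holes, then opening removes edge burrs."""
    k = np.ones((size, size), dtype=bool)   # assumed square structuring element
    closed = binary_closing(mask, structure=k)
    return binary_opening(closed, structure=k)
```

For example, a one-pixel hole inside a solid blob is filled by the closing, while isolated background pixels stay background.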
Preferably, the method further comprises:
judging whether the filtered flange image is a gray image or not;
if the filtered flange image is a color image, it is first grayed using the formula Gray(i, j) = a·R(i, j) + b·G(i, j) + c·B(i, j), where Gray represents the grayed value of the pixel, R, G, B represent the red, green and blue color components of the corresponding pixel, and a, b, c are the weights of red, green and blue respectively; here a, b and c take the standard values based on the sensitivity of the human eye, namely 0.299, 0.587 and 0.114 respectively;
and if the filtered flange image is a gray image, directly performing image binarization processing.
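The weighted graying step can be sketched as follows (NumPy; the channel order is assumed to be R, G, B, and the rounding mode is an implementation choice):

```python
import numpy as np

def to_gray(img):
    """Weighted graying with the eye-sensitivity weights a=0.299, b=0.587, c=0.114."""
    w = np.array([0.299, 0.587, 0.114])
    return np.rint(img[..., :3].astype(float) @ w).astype(np.uint8)

px = to_gray(np.array([[[255, 0, 0]]], dtype=np.uint8))  # pure red -> 76
```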
Preferably, in the step S3, further includes:
setting the initial threshold of image binarization to 0, the image binarization formula being:

g(i, j) = 255 when f(i, j) > T, and g(i, j) = 0 otherwise

where T represents the threshold, f(i, j) the gray value of a pixel and g(i, j) its binarized value; that is, pixels above the threshold are set to be brightest and pixels below the threshold are set to be darkest. The gray image is binarized starting from a threshold of 0.
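A minimal sketch of this binarization rule, assuming 8-bit gray input:

```python
import numpy as np

def binarize(gray, t):
    """Pixels above threshold t become brightest (255); the rest darkest (0)."""
    return np.where(gray > t, 255, 0).astype(np.uint8)

binarize(np.array([[10, 200]], dtype=np.uint8), 128)  # -> [[  0, 255]]
```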
It should be noted that all contours in the flange image after morphological operation are screened: contours that are too large or too small are removed by a size limit to obtain the target flange contours. Specifically, contour detection is performed on the morphologically processed binarized image, i.e. the contour of each connected domain is found. The range of plausible flange areas in the image can be determined from the size of the flange on the ship and the distance range from the flange to the oil delivery arm, and contours outside this range cannot be the flange and are eliminated, leaving the candidate flange contours.
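The size-based contour screening can be sketched as follows; the function name and area bounds are illustrative assumptions, and the contour areas would in practice come from a contour-detection routine:

```python
def screen_by_area(contour_areas, area_min, area_max):
    """Keep indices of contours whose area lies in the plausible flange range,
    derived from the flange size and its distance range to the delivery arm."""
    return [i for i, a in enumerate(contour_areas)
            if area_min <= a <= area_max]

# a speck, a plausible flange, and a huge region: only the middle one survives
screen_by_area([12.0, 5400.0, 250000.0], 1000.0, 100000.0)  # -> [1]
```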
Preferably, in the step S6, further includes:
and respectively carrying out ellipse fitting on a plurality of target flange profiles by adopting a least square method to obtain a plurality of fitted target flange ellipse profiles.
It should be noted that, because the flange plane is at an angle to the camera plane, the photographed flange is not a perfect circle but is very close to one; therefore ellipse fitting is performed on each screened target flange contour using the least square method, yielding the fitted target flange elliptical contours.
It should be noted that, when the ellipse is fitted, the center of the flange and the inner diameter of the flange can be obtained from an ellipse fitting algorithm, and the result can be drawn, and the ellipse fitting algorithm is an existing algorithm and is a conventional technology known to those skilled in the art, and is not described herein.
Preferably, in the step S7, further includes:
respectively calculating the roundness value of each fitted target flange elliptical contour, and calculating the total roundness value s of the plurality of fitted target flange elliptical contours according to the roundness value of each fitted target flange elliptical contour;
calculating the roundness value of each fitted target flange elliptical contour by the formula C = b/a, where a represents the semi-major axis of the ellipse and b the semi-minor axis; the calculation formula of the total roundness value s of all fitted target flange elliptical contours is:

s = ((1 − C_1) + (1 − C_2) + … + (1 − C_n)) ÷ n

wherein C_1, C_2, …, C_n respectively represent the roundness values of the fitted target flange elliptical contours, and n represents the number of determined target flange elliptical contours; under normal conditions the total roundness value s is not larger than 1, so the initial value of the total roundness minimum s_min is set to 1; the currently calculated total roundness value s is compared with s_min, and if s_min > s, then s is assigned to s_min to update the total roundness minimum s_min.
It should be noted that limiting the roundness value of each fitted target flange ellipse contour excludes contours that cannot be flanges; because a flange appears very close to a circle in the image, the non-flange portions in the image after the morphological operation processing are almost entirely removed. Since a plurality of flanges may exist in one image, the above formula is used to determine the total roundness value s of all fitted target flange ellipse contours in one image.
It should be understood that, in one cycle, the semi-minor and semi-major axes of each fitted target flange elliptical profile in the image are obtained through the preceding steps, and from them the roundness value C is obtained; a plurality of flanges, n in number, may exist in one image, so s is a total roundness value. Taking the minimum total roundness value s over the cycles ensures that the extracted flange edge is closest to the real edge.
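The per-contour roundness C = b/a comes directly from the text above. The exact aggregation formula for the total roundness s appears in the patent only as an image, so the mean deviation from a perfect circle used below is a labeled assumption that merely matches the stated properties (s normally at most 1, and smaller s for rounder contours):

```python
def roundness(a, b):
    """Roundness C = b / a of one fitted ellipse (semi-minor over semi-major); C <= 1."""
    return b / a

def total_roundness(axes_list):
    """Hypothetical total roundness s over all fitted contours in one image:
    the mean deviation 1 - C_i from a perfect circle. This aggregation is an
    assumption, not the patent's image-only formula."""
    return sum(1 - roundness(a, b) for a, b in axes_list) / len(axes_list)
```

A contour with semi-axes (50, 30) has C = 0.6; adding a perfectly circular contour (10, 10) gives s = 0.2 under this assumed aggregation.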
Preferably, the step S8 further includes:
when the calculated number i of the total roundness value s is smaller than the total calculated number M, the calculated number i is increased by 1 and the method returns to the step S3 to calculate the total roundness value s of all fitted target flange elliptical profiles again; if the newly calculated total roundness value s is smaller than the total roundness minimum s_min updated in the last calculation, the newly calculated total roundness value s is assigned to the total roundness minimum s_min to continue updating it, and the updated total roundness minimum s_min is used in the next calculation of the total roundness value s, until the calculated number i is equal to or greater than the total calculated number M.
The total calculation number M is a threshold on the number of cyclic calculations; the present invention is not limited in this respect, and M may be set as needed, for example, the total calculation number M may be 255.
The total roundness value s of all fitted target flange elliptical profiles is calculated cyclically, and the total roundness minimum s_min is updated according to each cyclic calculation result to obtain the finally updated total roundness minimum s_min; the minimum total roundness value s corresponding to the finally updated s_min is selected, and the corresponding fitted target flange elliptical profile is obtained according to this minimum total roundness value s, as exemplified below:
In the first calculation: the total roundness value s = 0.9; since the initial value of the total roundness minimum s_min is 1 and 1 > 0.9, 0.9 is assigned to s_min, and at this time s_min = 0.9;
In the second calculation: the total roundness value s = 0.8; since the total roundness minimum updated in the first calculation is s_min = 0.9 and 0.9 > 0.8, 0.8 is assigned to s_min, and at this time s_min = 0.8;
In the third calculation: the total roundness value s = 0.7; since the total roundness minimum updated in the second calculation is s_min = 0.8 and 0.8 > 0.7, 0.7 is assigned to s_min, and at this time s_min = 0.7;
In the fourth calculation: the total roundness value s = 0.8; since the total roundness minimum updated in the third calculation is s_min = 0.7 and 0.8 > 0.7, s_min is not updated and remains 0.7;
......
And so on, until in the i-th calculation: the total roundness value s = 0.5; since the total roundness minimum updated in the (i-1)-th calculation is s_min = 0.6 and 0.6 > 0.5, 0.5 is assigned to s_min, and at this time s_min = 0.5. The minimum total roundness value s corresponding to the last updated total roundness minimum 0.5 is selected, and the corresponding fitted target flange elliptical profile is obtained according to the minimum total roundness value 0.5, namely the target flange elliptical profile fitted in the i-th calculation.
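The running-minimum update illustrated by the example above can be sketched directly; s_min starts at 1 and is replaced whenever a smaller total roundness value s appears:

```python
def track_minimum(s_values):
    """Apply the update rule 'if s_min > s then s_min = s' to a sequence of
    total roundness values, returning s_min after each calculation."""
    s_min, history = 1.0, []
    for s in s_values:
        if s_min > s:
            s_min = s
        history.append(s_min)
    return history
```

Feeding in the sequence from the example, 0.9, 0.8, 0.7, 0.8, 0.6, 0.5, reproduces the minima 0.9, 0.8, 0.7, 0.7, 0.6, 0.5; the fourth value does not update s_min.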
Preferably, the step S1 further includes: acquiring the left image or the right image by a high-definition camera.
Specifically, a device combining a Daheng CCD camera with a Hikvision lens is selected to collect the left image or the right image, wherein the collected left or right image has a height of 3672 pixels and a width of 5496 pixels.
The flange positioning method based on binocular vision solves the problem of flange positioning in coarse positioning of the intelligent oil conveying arm, and has good universality and higher accuracy.
The present invention is not limited to the above embodiments; any modifications, equivalent substitutions and improvements made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.
Claims (6)
1. The flange positioning method based on binocular vision is characterized by comprising the following steps of:
respectively shooting a flange to be positioned through a left camera and a right camera in the binocular camera to obtain a left image and a right image;
identifying a target flange from the left image, and respectively calculating the coordinates of the center of the target flange in the left image and the pixel length of the inner diameter of the target flange in the left image;
calculating the depth of field of the target flange according to the pixel length of the inner diameter of the target flange in the left image;
determining the approximate coordinates of the circle center of the target flange in the right image according to the depth of field of the target flange and the distance between the left camera and the right camera;
identifying the target flange in the right image according to the approximate coordinates of the center of the target flange in the right image, and determining the accurate coordinates of the center of the target flange in the right image;
calculating the position of the target flange under a binocular camera coordinate system according to the coordinate of the center of the target flange in the left image, the accurate coordinate of the center of the target flange in the right image, the distance between the left camera and the right camera and the focal length of the binocular camera, so as to obtain a flange positioning result;
wherein, in the calculating the depth of field of the target flange according to the pixel length of the inner diameter of the target flange in the left image, the method further comprises:
the depth of field depth of the target flange is calculated by using a monocular ranging principle, and the formula is as follows:

depth = inner diameter of the target flange x focal length of the left camera / (pixel length of the inner diameter of the target flange in the left image x pixel size of the left camera);
the inner diameter size of the target flange, the pixel size of the left camera and the focal length of the left camera in the depth of field depth calculation formula are obtained in advance;
wherein, in determining the approximate coordinates of the center of the circle of the target flange in the right image according to the depth of field of the target flange and the distance between the left camera and the right camera, the method further comprises:
setting the coordinates of the center of the target flange in the left image as (xl, yl) and the approximate coordinates of the center of the target flange in the right image as (xr, yr), wherein the ordinate y of the center of the same target flange is identical in the left image and the right image, namely yl = yr, while the abscissa x differs; the approximate coordinates (xr, yr) of the center of the target flange in the right image are estimated according to the depth of field depth of the target flange measured by the left camera and the distance between the left camera and the right camera, and the calculation formula of the abscissa of the center of the target flange in the right image is as follows:
xr = xl - binocular baseline x left camera focal length / depth

wherein xl is the abscissa of the center of the target flange in the left image, the binocular baseline is the distance between the left camera and the right camera, and depth is the depth of field of the target flange;
wherein, in the identifying the target flange in the right image according to the approximate coordinates of the center of the target flange in the right image and determining the accurate coordinates of the center of the target flange in the right image, the method further comprises:
taking the approximate coordinates (xr, yr) of the center of the target flange in the right image as a midpoint, intercepting a region in a certain range in the right image, identifying the target flange in the region, and updating the approximate coordinates (xr, yr) of the center of the target flange in the identification process to obtain the accurate coordinates of the center of the target flange in the right image;
the method further includes, in the calculating, according to the coordinates of the center of the target flange in the left image, the accurate coordinates of the center of the target flange in the right image, the distance between the left camera and the right camera, and the focal length of the binocular camera, the position of the target flange in the binocular camera coordinate system to obtain a flange positioning result:
taking the camera coordinate system of the left camera as the world coordinate system, assuming that the center of the target flange is a point P in space with world coordinates (x, y, z), the coordinates of the target flange center P on the left image being P_L(xl, yl) and on the right image being P_R(xr, yr); O_W O_L and O_RC O_R are the optical axes of the left and right cameras, and the distance between O_W and O_RC is the baseline b; according to the principle of triangle similarity, xl = f*x/z, yl = f*y/z and xr = f*(x - b)/z are obtained;

the calculation formula of the center (x, y, z) of the target flange obtained after the above simplification is as follows:

x = b*xl/d, y = b*yl/d, z = b*f/d
wherein d = xl - xr is the binocular parallax of the point P, b is the distance between the left camera and the right camera, and f is the focal length of the binocular camera; therefore, substituting the coordinates of the center of the target flange in the left image and the accurate coordinates of the center of the target flange in the right image into the calculation formula of the center (x, y, z) of the target flange yields the spatial three-dimensional coordinates of the center of the target flange, namely the position of the target flange in the binocular camera coordinate system.
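The three geometric relations of claim 1 (monocular depth from apparent size, the estimated right-image abscissa, and the final triangulation) can be sketched as plain functions. The depth formula is reconstructed from the inputs the claim lists under the standard pinhole model, and the focal length is assumed to be expressed in pixels wherever it enters a disparity term, so treat the exact unit conventions as assumptions.

```python
def flange_depth(inner_diameter, focal_length, pixel_length, pixel_size):
    """Monocular ranging: the flange's physical inner diameter maps to
    pixel_length * pixel_size on the sensor, so depth = D * f / (p * ps).
    (Assumed form of the patent's image-only formula, built from its inputs.)"""
    return inner_diameter * focal_length / (pixel_length * pixel_size)

def estimate_xr(xl, baseline, focal_px, depth):
    """xr = xl - baseline * f / depth, with f in pixels so that the disparity
    term baseline * f / depth is already in pixel units."""
    return xl - baseline * focal_px / depth

def triangulate(xl, yl, xr, baseline, focal_px):
    """Similar-triangle reconstruction in the left-camera frame:
    d = xl - xr (disparity), x = b*xl/d, y = b*yl/d, z = b*f/d.
    Image coordinates are taken relative to the principal point."""
    d = xl - xr
    return baseline * xl / d, baseline * yl / d, baseline * focal_px / d
```

With baseline b = 100 mm and f = 2000 px, matched centers xl = 120 and xr = 100 give disparity d = 20 and depth z = 10000 mm; estimate_xr with that depth returns exactly xr = 100, showing the two relations are mutually consistent.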
2. The flange positioning method based on binocular vision according to claim 1, wherein calibrating the binocular camera before shooting the flange to be positioned by the binocular camera comprises:
determining lens parameters of the binocular camera according to the on-site distance range from the flange to be positioned to the intelligent arm base;
selecting a binocular camera with proper lens parameters and a distance between the binocular cameras according to the precision requirement of coarse positioning;
designing a corresponding fixture to fix the selected binocular camera;
and calibrating the fixed binocular camera.
3. The binocular vision based flange positioning method of claim 1, wherein the step of identifying the target flange from the left or right image comprises:
step S1: acquiring a left image and a right image;
step S2: processing the left image or the right image to obtain a processed flange image;
step S3: performing image binarization processing on the processed flange image to obtain a binarized image;
step S4: performing morphological operation on the binarized image to obtain a flange image after morphological operation;
step S5: screening all contours in the flange image after morphological operation to obtain a plurality of target flange contours;
step S6: respectively carrying out ellipse fitting on a plurality of target flange profiles to obtain a plurality of fitted target flange ellipse profiles;
step S7: circularly calculating the total roundness value s of the plurality of fitted target flange elliptical profiles, setting a total roundness minimum s_min in advance, and updating the total roundness minimum s_min according to the cyclic calculation result;

step S8: judging whether the calculated number i of the total roundness value s is greater than or equal to the total calculated number M; when the calculated number i is smaller than the total calculated number M, adding 1 to the calculated number i and returning to the step S3 to cyclically calculate the total roundness value s, continuously updating the total roundness minimum s_min until the calculated number i is greater than or equal to the total calculated number M; when the calculated number i is greater than or equal to the total calculated number M, selecting the minimum total roundness value s corresponding to the last updated total roundness minimum s_min, obtaining the corresponding fitted target flange elliptical profile according to the minimum total roundness value s, and executing step S9;
step S9: and displaying the fitted target flange elliptical profile corresponding to the minimum total roundness value s on the left image and the right image to obtain a target flange identification result.
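Steps S3 and S4 of the identification pipeline in claim 3 can be sketched with minimal NumPy stand-ins for thresholding and morphology; a production pipeline would more likely use library routines (e.g. OpenCV's threshold, morphologyEx and findContours), so these helpers are illustrative assumptions only.

```python
import numpy as np

def binarize(img, thresh):
    """Step S3: threshold the flange image to a 0/1 mask."""
    return (img > thresh).astype(np.uint8)

def dilate(mask):
    """3x3 dilation: maximum over the 8-neighbourhood plus centre."""
    h, w = mask.shape
    p = np.pad(mask, 1)
    return np.max([p[i:i + h, j:j + w] for i in range(3) for j in range(3)], axis=0)

def erode(mask):
    """3x3 erosion: minimum over the 8-neighbourhood plus centre."""
    h, w = mask.shape
    p = np.pad(mask, 1, constant_values=1)
    return np.min([p[i:i + h, j:j + w] for i in range(3) for j in range(3)], axis=0)

def morphological_open(mask):
    """Step S4: opening (erosion then dilation) removes speckle noise while
    keeping large blobs such as the flange ring."""
    return dilate(erode(mask))
```

Opening a mask that contains a 3x3 flange blob plus an isolated speckle removes the speckle and preserves the blob, which is what step S4 relies on before the contour screening of step S5.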
4. A flange positioning method based on binocular vision according to claim 3, further comprising, in the step S6:
and respectively carrying out ellipse fitting on a plurality of target flange profiles by adopting a least square method to obtain a plurality of fitted target flange ellipse profiles.
5. A flange positioning method based on binocular vision according to claim 3, further comprising, in the step S7:
respectively calculating the roundness value of each fitted target flange elliptical contour, and calculating the total roundness value s of the plurality of fitted target flange elliptical contours according to the roundness value of each fitted target flange elliptical contour;
calculating the roundness value of each fitted target flange elliptical profile through the formula C = b/a, wherein a represents the semi-major axis of the ellipse and b represents the semi-minor axis of the ellipse, and the calculation formula of the total roundness value s of all fitted target flange elliptical profiles is:
wherein C_1, C_2, ..., C_n respectively represent the roundness value of each fitted target flange elliptical contour, n represents the number of the determined target flange elliptical contours, and the total roundness value s is not larger than 1 under normal conditions, so the initial value of the total roundness minimum s_min is set to 1; the currently calculated total roundness value s is compared with the total roundness minimum s_min, and if s_min > s, s is assigned to s_min to update the total roundness minimum s_min.
6. The flange positioning method based on binocular vision according to claim 5, further comprising, in the step S8:
when the calculated number i of the total roundness value s is smaller than the total calculated number M, the calculated number i is increased by 1 and the method returns to the step S3 to calculate the total roundness value s of all fitted target flange elliptical profiles again; if the newly calculated total roundness value s is smaller than the total roundness minimum s_min updated in the last calculation, the newly calculated total roundness value s is assigned to the total roundness minimum s_min to continue updating it, and the updated total roundness minimum s_min is used in the next calculation of the total roundness value s, until the calculated number i is equal to or greater than the total calculated number M.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110776691.4A CN113516709B (en) | 2021-07-09 | 2021-07-09 | Flange positioning method based on binocular vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113516709A CN113516709A (en) | 2021-10-19 |
CN113516709B true CN113516709B (en) | 2023-12-29 |
Family
ID=78066750
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110776691.4A Active CN113516709B (en) | 2021-07-09 | 2021-07-09 | Flange positioning method based on binocular vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113516709B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010095770A1 (en) * | 2009-02-19 | 2010-08-26 | Inha University Industry-Academic Cooperation Foundation | Method for automatically adjusting depth of field to visualize stereoscopic image |
CN107144241A (en) * | 2017-06-09 | 2017-09-08 | 大连理工大学 | A kind of binocular vision high-precision measuring method compensated based on the depth of field |
CN107798694A (en) * | 2017-11-23 | 2018-03-13 | 海信集团有限公司 | A kind of pixel parallax value calculating method, device and terminal |
CN109035307A (en) * | 2018-07-16 | 2018-12-18 | 湖北大学 | Setting regions target tracking method and system based on natural light binocular vision |
CN109211207A (en) * | 2018-06-29 | 2019-01-15 | 南京邮电大学 | A kind of screw identification and positioning device based on machine vision |
CN111179330A (en) * | 2019-12-27 | 2020-05-19 | 福建(泉州)哈工大工程技术研究院 | Binocular vision scene depth estimation method based on convolutional neural network |
CN112082493A (en) * | 2020-09-03 | 2020-12-15 | 武汉工程大学 | Pipeline flange inner radius visual measurement method based on binocular imaging |
Non-Patent Citations (2)
Title |
---|
"Recognizing and locating of objects using binocular vision system";Guo-Shing Huang;《IEEE》;全文 * |
"基于双目视觉的裂缝信息测距与定位技术研究";郭胜杰;《中国优秀硕士论文全文数据库》;全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN113516709A (en) | 2021-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109632809B (en) | Product quality detection method and device | |
CN109712192B (en) | Camera module calibration method and device, electronic equipment and computer readable storage medium | |
CN110490936B (en) | Calibration method, device and equipment of vehicle camera and readable storage medium | |
CN110717942B (en) | Image processing method and device, electronic equipment and computer readable storage medium | |
CN105812790B (en) | Method for evaluating verticality between photosensitive surface and optical axis of image sensor and optical test card | |
CN112669394B (en) | Automatic calibration method for vision detection system | |
CN109559353B (en) | Camera module calibration method and device, electronic equipment and computer readable storage medium | |
KR102118173B1 (en) | System and method for image correction based estimation of distortion parameters | |
CN110261069B (en) | Detection method for optical lens | |
US20200342583A1 (en) | Method, apparatus and measurement device for measuring distortion parameters of a display device, and computer-readable medium | |
CN117036641A (en) | Road scene three-dimensional reconstruction and defect detection method based on binocular vision | |
CN110889874B (en) | Error evaluation method for binocular camera calibration result | |
CN116740061A (en) | Visual detection method for production quality of explosive beads | |
CN109299696B (en) | Face detection method and device based on double cameras | |
CN113516709B (en) | Flange positioning method based on binocular vision | |
CN116912329A (en) | Binocular vision optimal precision measurement method | |
JPH05215547A (en) | Method for determining corresponding points between stereo images | |
CN116596987A (en) | Workpiece three-dimensional size high-precision measurement method based on binocular vision | |
US20220076428A1 (en) | Product positioning method | |
CN114964032A (en) | Blind hole depth measuring method and device based on machine vision | |
CN114693626A (en) | Method and device for detecting chip surface defects and computer readable storage medium | |
CN111062887B (en) | Image definition judging method based on improved Retinex algorithm | |
CN113469980B (en) | Flange identification method based on image processing | |
CN114049304A (en) | 3D grating detection method and device, computer equipment and readable storage medium | |
US10736504B2 (en) | Method for determining the pupil diameter of an eye with high accuracy, and corresponding apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||