CN113409395A - High-precision detection and positioning method for pipe end of catheter - Google Patents

High-precision detection and positioning method for pipe end of catheter

Info

Publication number
CN113409395A
CN113409395A
Authority
CN
China
Prior art keywords
camera
point cloud
points
catheter
circle
Prior art date
Legal status
Granted
Application number
CN202010185998.2A
Other languages
Chinese (zh)
Other versions
CN113409395B (en)
Inventor
赵吉宾
夏仁波
陈月玲
赵亮
付生鹏
张天宇
Current Assignee
Shenyang Intelligent Robot National Research Institute Co ltd
Shenyang Intelligent Robot Innovation Center Co Ltd
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Intelligent Robot National Research Institute Co ltd
Shenyang Intelligent Robot Innovation Center Co Ltd
Shenyang Institute of Automation of CAS
Priority date
Filing date
Publication date
Application filed by Shenyang Intelligent Robot National Research Institute Co ltd, Shenyang Intelligent Robot Innovation Center Co Ltd, Shenyang Institute of Automation of CAS filed Critical Shenyang Intelligent Robot National Research Institute Co ltd
Priority to CN202010185998.2A priority Critical patent/CN113409395B/en
Publication of CN113409395A publication Critical patent/CN113409395A/en
Application granted granted Critical
Publication of CN113409395B publication Critical patent/CN113409395B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a high-precision detection and positioning method for the pipe end of a catheter. The internal and external parameters of a binocular vision system are first calibrated, and conduit images are acquired synchronously. The acquired images are processed to obtain the sub-pixel edge of the pipe-end projection, and the three-dimensional coordinates of the edge points are solved by means of the epipolar constraint and the sequence-consistency constraint. A mathematical model of the perspective projection of a spatial circle is then analyzed, an objective function is established and optimized, and the accurate three-dimensional position of the pipe end is finally solved. The method takes full account of the mathematical model of the pipe-end projection, reduces the measurement error caused by the distortion of the perspective projection of the pipe-end circle, offers high measurement precision and efficiency, and lays a foundation for stress-free assembly in the conduit manufacturing process.

Description

High-precision detection and positioning method for pipe end of catheter
Technical Field
The invention relates to the technical field of computer vision, in particular to a high-precision detection and positioning method for a pipe end of a catheter.
Background
The conduit end refers to the center point of the end face of the pipe. Before the pipe is assembled, the relative position accuracy of the end-face center points is the key to ensuring correct installation. However, the pipe undergoes elastic deformation during bending on the pipe bender, so the positions of, and distances between, the end points at the two ends of the pipe change, which affects assembly. The pipe ends therefore need to be measured with high precision to ensure stress-free assembly.
At present, formed bent pipes are measured mainly by contact methods, such as the template (copying) method, three-coordinate measuring machines, and articulated-arm coordinate measuring machines. These methods suffer from drawbacks such as difficult manufacture of the comparison gauge, susceptibility to damage or deformation, low efficiency, excessive reliance on the operator's subjective judgment, inability to reconstruct a three-dimensional model, and lack of a positioning reference. How to effectively shorten development time, reduce development cost, and improve product quality is therefore the main goal of rapid processing and manufacturing of bent pipes.
The traditional method of fitting the end profile is to fit an ellipse to the projection of the conduit end. When the pipe diameter is small, however, the projection of the pipe end does not fit an ellipse well, which introduces large positioning errors.
In "Research on a multi-vision-based method for measuring pipeline endpoint coordinates", Sun Peng et al. use projection gradients, combining a fitted contour with a centerline-based endpoint positioning method, to calculate the conduit endpoints at different angles, but the accuracy of the calculated endpoints is poor.
Disclosure of Invention
Aiming at the defects of the prior art, the technical problem to be solved by the invention is to provide a high-precision detection method for the conduit pipe end.
The technical solution adopted by the invention to achieve this purpose is as follows: a high-precision detection and positioning method for the conduit end face comprises the following steps:
step 1: performing multi-view visual calibration on an industrial camera to obtain internal parameters and external parameters of the camera;
step 2: synchronously triggering a camera 1 and a camera 2 to acquire 2 original conduit end face images respectively;
and step 3: respectively carrying out sub-pixel precision processing on the two images, and respectively obtaining edge coordinates of a projection ellipse of the end face of the conduit;
and 4, step 4: according to the internal and external parameters of the camera, epipolar constraint and sequence consistency rules, finding out a corresponding matching point of the edge coordinate of the end face projection ellipse of one of the conduits in the other image;
and 5: reconstructing three-dimensional space coordinate point clouds of all matching points by using the calibrated space position relation between binocular vision and a stereoscopic vision three-dimensional measurement model;
step 6: removing outliers from the three-dimensional space coordinate point cloud of the matching points by using a Randac algorithm, and using the remaining points as initial values of a point cloud fitting circle;
and 7: establishing an objective function of the point cloud fitting circle, wherein the objective function comprises: the distance between all points on the point cloud and a plane formed by the point cloud fitting circle and the distance between all points on the point cloud and the point cloud fitting circle center;
and 8: carrying out linearization processing on a target function of the point cloud fitting circle;
and step 9: and calculating a fitting space circle with the objective function being zero by using a least square method to serve as the positioning of the end surface of the conduit.
The internal parameters are the transformation matrices from the industrial camera coordinate systems to the image coordinate systems; the external parameters are the transformation matrices from the camera coordinate systems to the world coordinate system; the world coordinate system is used for outputting the coordinates of the actual control points later.
The internal parameters of each industrial camera are calibrated with a two-dimensional flexible target.
The external parameters are expressed by a rotation matrix R and a translation matrix T.
The conduit is placed on a backlight plate, and the images captured by camera 1 and camera 2 are images of the same conduit end face from different viewing angles.
The stereoscopic vision three-dimensional measurement model is expressed as follows:

    x_i = z_i X_1i / f_m
    y_i = z_i Y_1i / f_m
    z_i = f_m (f_n t_1 - X_2i t_3) / [ X_2i (r_7 X_1i + r_8 Y_1i + r_9 f_m) - f_n (r_1 X_1i + r_2 Y_1i + r_3 f_m) ]

where (x_i, y_i, z_i) is the reconstructed three-dimensional coordinate point p_i, (X_1i, Y_1i) are the coordinates of image edge point l_i under camera 1, (X_2i, Y_2i) are the coordinates of image edge point q_i under camera 2, f_m and f_n are the focal lengths of camera 1 and camera 2 respectively, and the transformation matrix R_12 between camera 1 and camera 2 is

    R_12 = [ r_1  r_2  r_3  t_1
             r_4  r_5  r_6  t_2
             r_7  r_8  r_9  t_3 ]

so that (r_1, r_2, r_3, r_7, r_8, r_9, t_1, t_2, t_3) are the parameter values of the transformation matrix R_12 that enter the model.
The initial circle center is the centroid of all points in the point cloud, the initial circle radius is the average distance from all points in the point cloud to the centroid, and the normal vector of the plane containing the fitted circle is obtained by fitting a plane to all points of the point cloud.
The objective function f is as follows:

    f = Σ_{i=1..n} [ (a_0 x_i + b_0 y_i + c_0 z_i + d_0)^2 + ( sqrt( (x_i - x_0)^2 + (y_i - y_0)^2 + (z_i - z_0)^2 ) - r_0 )^2 ]

where n is the number of points in the point cloud, (a_0, b_0, c_0, d_0) are the parameters of the plane containing the fitted circle (normal vector and offset), (x_0, y_0, z_0) is the centroid of all points in the point cloud (the circle center), (x_i, y_i, z_i) is any point of the point cloud, and r_0 is the radius of the fitted circle.
The invention has the following beneficial effects and advantages:
1. The high-precision conduit pipe-end detection method of the invention can effectively reconstruct the three-dimensional coordinates of the pipe-end center point.
2. The method has high measurement precision and provides an effective guarantee for stress-free assembly of the conduit.
3. The method has high measurement efficiency and lays a solid foundation for the digital manufacturing of conduits.
Drawings
FIG. 1 is a schematic diagram of a hardware apparatus for the method of the present invention;
FIG. 2 is a flow chart of the method of the present invention;
FIG. 3(a) shows the left-circle edge coordinate points in the camera 1 image according to the method of the present invention;
FIG. 3(b) shows the right-circle edge coordinate points in the camera 2 image according to the method of the present invention;
FIG. 4 is a three-dimensional coordinate point cloud of edge matching points reconstructed by the method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Referring to fig. 1, the hardware setup of the method of the present invention comprises a backlight plate, the conduit to be measured, four industrial cameras, and a computer serving as the controller. The industrial cameras work in pairs, each pair performing positioning detection of the projection ellipse of one end face of the conduit. The computer runs the program for each step of the method, matches the positioning detection results of the two end-face projection ellipses, and outputs them to an upper-level computer; the upper-level computer can then control an actuator to assemble the conduit onto the pre-connection device according to these results, thereby completing conduit assembly.
Referring to the attached drawings 2, 3(a), 3(b) and 4, the method for detecting the end of the catheter with high precision comprises the following specific steps:
step 1: and performing multi-view visual calibration on the four industrial cameras to obtain internal parameters of all the cameras and a conversion matrix from a coordinate system of all the cameras to a world coordinate system, namely external parameters.
The intrinsic parameters represent a transformation matrix from the industrial camera coordinate system to the image coordinate system; the distortion degree of each lens is different, including radial distortion and eccentric distortion; and calibrating the internal parameters of each industrial camera by using the two-dimensional flexible target.
The extrinsic parameters represent a transformation matrix from the industrial camera coordinate system to the world coordinate system, which is achieved by a rotation matrix R and a translation matrix T. The world coordinate system is used for the actual control point coordinate output later.
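As an illustration only (not part of the patent), the calibration of step 1 could be carried out with OpenCV in Python roughly as follows; a planar chessboard is assumed in place of the two-dimensional flexible target, and the pattern size and square length are example values.

```python
import cv2
import numpy as np

# Illustrative intrinsic + extrinsic calibration of one camera pair with OpenCV.
# pattern: inner-corner grid of the assumed chessboard; square: edge length in mm.
def calibrate_pair(imgs_cam1, imgs_cam2, pattern=(9, 6), square=10.0):
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
    obj_pts, pts1, pts2 = [], [], []
    for im1, im2 in zip(imgs_cam1, imgs_cam2):
        ok1, c1 = cv2.findChessboardCorners(im1, pattern)
        ok2, c2 = cv2.findChessboardCorners(im2, pattern)
        if ok1 and ok2:
            obj_pts.append(objp)
            pts1.append(c1)
            pts2.append(c2)
    size = imgs_cam1[0].shape[1::-1]          # (width, height)
    # intrinsics (K, distortion) per camera, then the rotation R and translation T
    # from camera 1 to camera 2 -- the "external parameters" of the text
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, pts1, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, pts2, size, None, None)
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, pts1, pts2, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T
```

Here cv2.stereoCalibrate returns the rotation R and translation T that map camera-1 coordinates into camera-2 coordinates, which is the external relationship used in the later reconstruction.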
Step 2: synchronously triggering two industrial cameras, namely a camera 1 and a camera 2, to acquire two original catheter images placed on a backlight plate;
and step 3: respectively solving the edge image coordinates of the two original conduit images after the sub-pixel precision processing of the conduit end face projection ellipse;
and 4, step 4: establishing a corresponding matching point relation of a sub-pixel precision edge coordinate of a projection ellipse of one conduit end face in another image by means of internal and external parameters, epipolar constraint and sequence consistency of a camera;
define the two-dimensional edge coordinate point L (L) of the image under the camera 11,l2,...ln) Finding out the corresponding two-dimensional edge coordinate point Q (Q) of the image under the camera 2 according to the internal and external parameters, epipolar constraint and sequence consistency principles of the camera1,q2,...qn)。
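For concreteness, a minimal NumPy sketch of this matching step is given below. It is an assumption-laden simplification: the fundamental matrix is built from the calibrated parameters, each camera-1 edge point is matched to a camera-2 edge point lying close to its epipolar line, and sequence consistency is only approximated by preferring the candidate with the most similar angular position around the contour centroid, since the patent does not spell out its exact rule.

```python
import numpy as np

def skew(t):
    """3x3 skew-symmetric matrix such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental_matrix(K1, K2, R, T):
    """F from the calibrated parameters: F = K2^-T [T]x R K1^-1."""
    E = skew(np.ravel(T)) @ R
    return np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)

def match_edges(edges1, edges2, F, tol=1.0):
    """edges1, edges2: (n, 2) sub-pixel edge points of images 1 and 2."""
    c1, c2 = edges1.mean(axis=0), edges2.mean(axis=0)
    ang1 = np.arctan2(edges1[:, 1] - c1[1], edges1[:, 0] - c1[0])
    ang2 = np.arctan2(edges2[:, 1] - c2[1], edges2[:, 0] - c2[0])
    pairs = []
    for p, a in zip(edges1, ang1):
        line = F @ np.array([p[0], p[1], 1.0])              # epipolar line in image 2
        d = np.abs(edges2 @ line[:2] + line[2]) / np.hypot(line[0], line[1])
        cand = np.flatnonzero(d < tol)                       # points near the epipolar line
        if cand.size:
            # crude sequence-consistency proxy: prefer the candidate whose angular
            # position around the contour centroid is closest to that of p
            dang = np.angle(np.exp(1j * (ang2[cand] - a)))
            pairs.append((p, edges2[cand[np.argmin(np.abs(dang))]]))
    return np.array(pairs)                                   # shape (m, 2, 2)
```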
And 5: reconstructing a three-dimensional space coordinate point cloud P (P) of all matching points by means of a stereoscopic vision three-dimensional measurement model by utilizing the calibrated spatial position relationship between binocular vision (namely, a transformation matrix among an industrial camera coordinate system, an image coordinate system and a world coordinate system)1,p2,…pn);
The stereoscopic vision three-dimensional model is represented as follows:
Figure BDA0002414204970000051
wherein (x)i,yi,zi) For the reconstructed three-dimensional spatial coordinate point pi(X) of (C)1i,Y1i) For an image edge point l under the camera 1iCoordinate (x)2i,y2i) For an image edge point q under camera 2iCoordinate, fm、fnFocal ratio of camera 1 and camera 2, transformation matrix R between camera 1 and camera 2, respectively12Is composed of
Figure BDA0002414204970000052
Thus (r)1,r2,r3,r7,r8,r9,t1,t2,t3) For transforming the matrix R12The parameter value of (2).
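A direct NumPy transcription of this triangulation model might look as follows (a sketch; the image coordinates are assumed to be expressed relative to the principal point, corrected for lens distortion, and the focal lengths f_m, f_n given in pixels).

```python
import numpy as np

def reconstruct_points(pairs, fm, fn, R12):
    """pairs: (m, 2, 2) array of matched edge points [(X1, Y1), (X2, Y2)].
    R12: 3x4 matrix [R | T] from camera 1 to camera 2.  Returns an (m, 3) cloud."""
    r1, r2, r3, t1 = R12[0]
    r7, r8, r9, t3 = R12[2]
    cloud = []
    for (X1, Y1), (X2, Y2) in pairs:
        num = fm * (fn * t1 - X2 * t3)
        den = X2 * (r7 * X1 + r8 * Y1 + r9 * fm) - fn * (r1 * X1 + r2 * Y1 + r3 * fm)
        z = num / den
        cloud.append((z * X1 / fm, z * Y1 / fm, z))   # x_i, y_i, z_i of the model
    return np.array(cloud)
```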
Step 6: removing outliers from the three-dimensional space coordinate point cloud of the matching points by using a Ranpac algorithm, and establishing an initial value of a point cloud fitting circle by using the remaining points, wherein the center of the circle is the centroid of all the points on the point cloud, the radius of the circle is the average distance from all the points on the point cloud to the centroid, and the normal vector of the plane where the fitting circle is located is obtained by the normal vector of the fitting plane of all the points on the point cloud;
and 7: establishing a target function of a point cloud fitting circle, wherein the target function is composed of two parts, the first part is the distance between all points on the point cloud and a plane formed by the point cloud fitting circle, the second part is the distance between all points on the point cloud and the center of the point cloud fitting circle, and the target function f is as follows:
Figure BDA0002414204970000061
wherein n is the number of points contained in the point cloud, (a)0,b0,c0,d0) Is the normal vector of the plane of the fitted circle, (x)0,y0,z0) Is the centroid of all points in the point cloud, (x)i,yi,zi) Is any point on the point cloud. r is0Is the radius of the fitted circle.
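In code, the objective just defined can be evaluated as below (a sketch; the plane normal (a_0, b_0, c_0) is assumed to have unit length so that the first term is a true point-to-plane distance).

```python
import numpy as np

def circle_objective(params, points):
    """Value of f for parameters (x0, y0, z0, a0, b0, c0, d0, r0) and an (n, 3) cloud."""
    x0, y0, z0, a0, b0, c0, d0, r0 = params
    plane = points @ np.array([a0, b0, c0]) + d0                       # plane distances
    radial = np.linalg.norm(points - np.array([x0, y0, z0]), axis=1) - r0
    return np.sum(plane**2 + radial**2)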
And 8: linearize f to obtain
Figure BDA0002414204970000062
Wherein
Figure BDA0002414204970000063
(ii) an objective function f vs. x0Make a derivation
Figure BDA0002414204970000064
② objective function f to y0Make a derivation
Figure BDA0002414204970000065
③ target function f to z0Make a derivation
Figure BDA0002414204970000066
Fourthly, the objective function f is to r0Make a derivation
Figure BDA0002414204970000067
Fifthly, the objective function f is to a0Make a derivation
Figure BDA0002414204970000068
Sixthly, the target function f is to b0Make a derivation
Figure BDA0002414204970000071
Seventhly, the target functions f to c0Make a derivation
Figure BDA0002414204970000072
R objective function f to d0Make a derivation
Figure BDA0002414204970000073
And step 9: to make the objective function f equal to 0, let
-f0=Α1x02y03z04a05b06c07d08r0
By least square method, it can be converted into MX ═ B
Wherein
Figure BDA0002414204970000074
X=[x0,y0,z0,a0,b0,c0,r0]T
B=[f01,f02,…,f0m]T
Wherein, each row of the matrix M is a partial derivative value obtained by each point of the point cloud to each variable; t is a transposed symbol; the anisotropic quantity of matrix B is respectively obtained by the variables of each point of the point cloud0. The final spatial circular parameter value X can be obtained.
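A small self-contained sketch of steps 8 and 9 follows. Instead of transcribing the closed-form partial derivatives (which are not reproduced here), the Jacobian is formed by finite differences and the update is obtained from a least-squares solve, in the spirit of the MX = B system above; the function and variable names are illustrative.

```python
import numpy as np

def circle_residuals(params, points):
    """Per-point residuals behind f: signed plane distance and radial deviation."""
    x0, y0, z0, a0, b0, c0, d0, r0 = params
    plane = points @ np.array([a0, b0, c0]) + d0
    radial = np.linalg.norm(points - np.array([x0, y0, z0]), axis=1) - r0
    return np.concatenate([plane, radial])

def fit_space_circle(points, init, iters=20, eps=1e-6):
    """Gauss-Newton refinement of (x0, y0, z0, a0, b0, c0, d0, r0) from `init`."""
    p = np.asarray(init, dtype=float)
    for _ in range(iters):
        r = circle_residuals(p, points)
        J = np.empty((r.size, p.size))
        for k in range(p.size):                    # finite-difference Jacobian column k
            dp = np.zeros_like(p)
            dp[k] = eps
            J[:, k] = (circle_residuals(p + dp, points) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)   # least-squares solve (MX = B)
        p = p + step
        p[3:7] /= np.linalg.norm(p[3:6])           # keep the plane normal unit length
        if np.linalg.norm(step) < 1e-10:
            break
    return p
```

With the initial values of step 6, `init` would be assembled as `[cx, cy, cz, nx, ny, nz, d, r]` (centroid, unit plane normal, plane offset, mean radius); these helper names are hypothetical.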
Step 10: Repeat steps 2 to 9 with industrial camera 3 and industrial camera 4 to obtain the pipe-end circle parameter values at the other end of the conduit.
Step 11: Use the obtained pipe-end circle parameters at both ends of the conduit for conduit assembly and other operations.
The program implementing these steps is loaded on an on-site computer; the four industrial cameras installed on site acquire images and send them to the computer, which processes them according to the above steps to obtain the actual space-circle parameter values of the pipe ends at the two ends of the conduit. The coordinate values of the final space circles are transmitted to a robot or other equipment for further operations.
The method was implemented in VS2010 and tested on a Windows 7 system; the space-circle measurement accuracy reaches 0.05 mm and the repeatability reaches 0.01 mm. The method has high measurement accuracy, high efficiency and strong stability.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A high-precision detection and positioning method for end faces of a conduit is characterized by comprising the following steps:
step 1: performing multi-view visual calibration on an industrial camera to obtain internal parameters and external parameters of the camera;
step 2: synchronously triggering a camera 1 and a camera 2 to acquire 2 original conduit end face images respectively;
and step 3: respectively carrying out sub-pixel precision processing on the two images, and respectively obtaining edge coordinates of a projection ellipse of the end face of the conduit;
and 4, step 4: according to the internal and external parameters of the camera, epipolar constraint and sequence consistency rules, finding out a corresponding matching point of the edge coordinate of the end face projection ellipse of one of the conduits in the other image;
and 5: reconstructing three-dimensional space coordinate point clouds of all matching points by using the calibrated space position relation between binocular vision and a stereoscopic vision three-dimensional measurement model;
step 6: removing outliers from the three-dimensional space coordinate point cloud of the matching points by using a Randac algorithm, and using the remaining points as initial values of a point cloud fitting circle;
and 7: establishing an objective function of the point cloud fitting circle, wherein the objective function comprises: the distance between all points on the point cloud and a plane formed by the point cloud fitting circle and the distance between all points on the point cloud and the point cloud fitting circle center;
and 8: carrying out linearization processing on a target function of the point cloud fitting circle;
and step 9: and calculating a fitting space circle with the objective function being zero by using a least square method to serve as the positioning of the end surface of the conduit.
2. The method for detecting and positioning the pipe end of the catheter with high precision according to claim 1, wherein the internal parameters are a transformation matrix from an industrial camera coordinate system to an image coordinate system, the external parameters are a transformation matrix from a camera coordinate system to a world coordinate system, and the world coordinate system is used for outputting the coordinates of the actual control points later.
3. The method for detecting and positioning the end of a catheter tube with high precision as claimed in claim 2, wherein the internal parameters of each industrial camera are calibrated by using a two-dimensional flexible target.
4. The method for detecting and positioning the end of a catheter tube with high precision as claimed in claim 2, wherein the external parameters are realized by a rotation matrix R and a translation matrix T.
5. The method for detecting and positioning the end of a catheter with high precision as claimed in claim 2, wherein the catheter is placed on a backlight plate, and the images taken by camera 1 and camera 2 are images of the same catheter end face from different viewing angles.
6. The method for detecting and positioning the end of a conduit pipe with high precision according to claim 1, wherein the stereoscopic vision three-dimensional model is expressed as follows:

    x_i = z_i X_1i / f_m
    y_i = z_i Y_1i / f_m
    z_i = f_m (f_n t_1 - X_2i t_3) / [ X_2i (r_7 X_1i + r_8 Y_1i + r_9 f_m) - f_n (r_1 X_1i + r_2 Y_1i + r_3 f_m) ]

wherein (x_i, y_i, z_i) is the reconstructed three-dimensional coordinate point p_i, (X_1i, Y_1i) are the coordinates of image edge point l_i under camera 1, (X_2i, Y_2i) are the coordinates of image edge point q_i under camera 2, f_m and f_n are the focal lengths of camera 1 and camera 2 respectively, and the transformation matrix R_12 between camera 1 and camera 2 is

    R_12 = [ r_1  r_2  r_3  t_1
             r_4  r_5  r_6  t_2
             r_7  r_8  r_9  t_3 ]

so that (r_1, r_2, r_3, r_7, r_8, r_9, t_1, t_2, t_3) are the parameter values of the transformation matrix R_12.
7. The method for detecting and positioning the end of a catheter tube with high precision as claimed in claim 1, wherein the initial circle center is the centroid of all points of the point cloud, the initial circle radius is the average distance from all points of the point cloud to the centroid, and the normal vector of the plane containing the fitted circle is obtained by fitting a plane to all points of the point cloud.
8. The method for detecting and positioning the end of a catheter tube with high precision according to claim 1, wherein the objective function f is as follows:

    f = Σ_{i=1..n} [ (a_0 x_i + b_0 y_i + c_0 z_i + d_0)^2 + ( sqrt( (x_i - x_0)^2 + (y_i - y_0)^2 + (z_i - z_0)^2 ) - r_0 )^2 ]

wherein n is the number of points in the point cloud, (a_0, b_0, c_0, d_0) are the parameters of the plane containing the fitted circle (normal vector and offset), (x_0, y_0, z_0) is the centroid of all points in the point cloud (the circle center), (x_i, y_i, z_i) is any point of the point cloud, and r_0 is the radius of the fitted circle.
CN202010185998.2A 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end Active CN113409395B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010185998.2A CN113409395B (en) 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010185998.2A CN113409395B (en) 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end

Publications (2)

Publication Number Publication Date
CN113409395A true CN113409395A (en) 2021-09-17
CN113409395B CN113409395B (en) 2024-05-07

Family

ID=77677127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010185998.2A Active CN113409395B (en) 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end

Country Status (1)

Country Link
CN (1) CN113409395B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130058581A1 (en) * 2010-06-23 2013-03-07 Beihang University Microscopic Vision Measurement Method Based On Adaptive Positioning Of Camera Coordinate Frame
CN103196370A (en) * 2013-04-01 2013-07-10 北京理工大学 Measuring method and measuring device of conduit connector space pose parameters
CN109470170A (en) * 2018-12-25 2019-03-15 山东大学 Stereoscopic vision space circle pose high-precision measuring method and system based on optimal projection plane
CN110160442A (en) * 2019-04-22 2019-08-23 南京航空航天大学 A kind of flexible measuring tooling and its scaling method for conduit end face of flange vision-based detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张广生; 马志霞; 王齐峰; 史腾飞; 张煜: "Optimized design of digital camera positioning", Journal of Southwest Minzu University (Natural Science Edition), No. 06

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112381847A (en) * 2020-10-27 2021-02-19 新拓三维技术(深圳)有限公司 Pipeline end head space pose measuring method and system
CN112381847B (en) * 2020-10-27 2024-02-13 新拓三维技术(深圳)有限公司 Pipeline end space pose measurement method and system
CN114001654A (en) * 2021-11-01 2022-02-01 北京卫星制造厂有限公司 Workpiece end face pose evaluation method
CN114001654B (en) * 2021-11-01 2024-03-26 北京卫星制造厂有限公司 Workpiece end face pose evaluation method

Also Published As

Publication number Publication date
CN113409395B (en) 2024-05-07

Similar Documents

Publication Publication Date Title
CN110370286B (en) Method for identifying rigid body space position of dead axle motion based on industrial robot and monocular camera
CN110378969B (en) Convergent binocular camera calibration method based on 3D geometric constraint
CN109242908B (en) Calibration method for underwater binocular vision measurement system
CN107367229B (en) Free binocular stereo vision rotating shaft parameter calibration method
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN111415391B (en) External azimuth parameter calibration method for multi-camera by adopting mutual shooting method
CN112648934B (en) Automatic elbow geometric form detection method
CN112985293B (en) Binocular vision measurement system and measurement method for single-camera double-spherical mirror image
CN108801175B (en) A kind of high-precision spatial pipeline measuring system and method
CN109974618B (en) Global calibration method of multi-sensor vision measurement system
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN109544642B (en) N-type target-based TDI-CCD camera parameter calibration method
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN113409395B (en) High-precision detection and positioning method for catheter end
CN113724337B (en) Camera dynamic external parameter calibration method and device without depending on tripod head angle
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN114359405A (en) Calibration method of off-axis Samm 3D line laser camera
CN114001651B (en) Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data
CN112362034A (en) Solid engine multi-cylinder section butt joint guiding measurement algorithm based on binocular vision
CN109285210B (en) Pipeline three-dimensional reconstruction method combining topological relation and epipolar constraint
Zhang et al. Improved camera calibration method and accuracy analysis for binocular vision
CN114066976A (en) Catheter center line matching method based on multi-view constraint
CN110672033B (en) Pipeline error measurement method using 3D rendering
CN112489141B (en) Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera
CN114092500A (en) Method for processing point cloud of central line of catheter under multi-view vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant