CN113409395B - High-precision detection and positioning method for catheter end - Google Patents

High-precision detection and positioning method for catheter end

Info

Publication number
CN113409395B
CN113409395B (application CN202010185998.2A)
Authority
CN
China
Prior art keywords
camera
point cloud
points
catheter
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010185998.2A
Other languages
Chinese (zh)
Other versions
CN113409395A (en)
Inventor
赵吉宾
夏仁波
陈月玲
赵亮
付生鹏
张天宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Intelligent Robot National Research Institute Co ltd
Shenyang Intelligent Robot Innovation Center Co ltd
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Intelligent Robot National Research Institute Co ltd
Shenyang Intelligent Robot Innovation Center Co ltd
Shenyang Institute of Automation of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Intelligent Robot National Research Institute Co ltd, Shenyang Intelligent Robot Innovation Center Co ltd, Shenyang Institute of Automation of CAS filed Critical Shenyang Intelligent Robot National Research Institute Co ltd
Priority to CN202010185998.2A
Publication of CN113409395A
Application granted
Publication of CN113409395B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a high-precision detection and positioning method for a catheter end. The internal and external parameters of a binocular vision system are first calibrated, and catheter images are acquired synchronously. The acquired images are processed to obtain the sub-pixel edges of the catheter end projection, and the three-dimensional coordinates of the edge points are solved by means of the epipolar constraint and the sequential-consistency constraint. A mathematical model of the perspective projection of the spatial circle is analyzed, an objective function is established and optimized, and the accurate three-dimensional position of the catheter end is finally solved. The method explicitly models the projection of the catheter pipe end, reducing the measurement error caused by the shape distortion of the pipe-end circle under perspective projection. It offers high measurement accuracy and efficiency, and lays a foundation for stress-free assembly in the catheter manufacturing process.

Description

High-precision detection and positioning method for catheter end
Technical Field
The invention relates to the technical field of computer vision, in particular to a high-precision detection and positioning method for the end of a catheter.
Background
The end of a conduit refers to the center point of the end face of the pipe. Before a pipe is assembled, the accuracy of the relative positions of the end-face center points is key to ensuring correct installation. However, because the pipe deforms elastically during bending on the pipe bender, the positions of, and the distance between, its two end points change, which affects assembly. High-precision measurement of the pipe ends is therefore required to guarantee stress-free assembly.
At present, formed bent pipes are measured mainly by contact methods, such as the profiling (template) method, coordinate measuring machines, and articulated-arm coordinate measuring machines. These methods have drawbacks: the comparison templates are difficult to manufacture and easily damaged or deformed, the efficiency is low, the results depend heavily on the operator's subjective judgment, no three-dimensional model can be reconstructed, and there is no positioning reference. Effectively shortening development time, reducing development cost, and improving product quality are therefore the main goals of rapid bent-pipe manufacturing.
The traditional method of fitting the end profile is to fit an ellipse to the projection of the catheter end. When the pipe diameter is small, however, the end projection cannot be fitted well by an ellipse, which leads to large positioning errors.
Sun Peng et al., in "Study of a multi-vision based line endpoint coordinate measurement method", combined contour fitting and centerline fitting using the projected inclination to locate the endpoints and calculated the catheter endpoints at different angles; however, the accuracy of the calculated endpoints was poor.
Disclosure of Invention
In view of the defects in the prior art, the technical problem to be solved by the invention is to provide a high-precision detection method for the catheter end that establishes an accurate projection model of the catheter end and reconstructs the catheter tip in three dimensions. The method is simple to operate, offers high measurement efficiency, and effectively improves the measurement accuracy of the pipe end point.
The technical scheme adopted by the invention to achieve this purpose is as follows. The high-precision detection and positioning method for the catheter end comprises the following steps:
Step 1: perform multi-camera calibration of the industrial cameras to obtain the internal parameters and external parameters of the cameras;
Step 2: synchronously trigger camera 1 and camera 2 to acquire two original catheter end-face images;
Step 3: perform sub-pixel processing on each of the two images to obtain the edge coordinates of the projected ellipse of the catheter end face in each image;
Step 4: according to the internal and external camera parameters, the epipolar constraint, and the sequential-consistency rule, find in the other image the matching point corresponding to each edge coordinate of the projected ellipse of one catheter end face;
Step 5: reconstruct the three-dimensional point cloud of all matching points using the calibrated spatial relation of the binocular pair and a stereoscopic-vision three-dimensional measurement model;
Step 6: remove outliers from the three-dimensional point cloud of the matching points using the RANSAC algorithm, and use the remaining points to set the initial values of the point-cloud fitted circle;
Step 7: establish the objective function of the point-cloud fitted circle, comprising the distances from all points of the point cloud to the plane of the fitted circle and the distances from all points of the point cloud to the fitted circle center;
Step 8: linearize the objective function of the point-cloud fitted circle;
Step 9: using the least-squares method, calculate the fitted spatial circle for which the objective function is zero, as the position of the catheter end face.
The internal parameters are the transformation matrix from the industrial camera coordinate system to the image coordinate system; the external parameters are the transformation matrix from the camera coordinate system to the world coordinate system, which is used for the subsequent output of the actual control point coordinates.
The internal parameters of each industrial camera are calibrated using a two-dimensional flexible target.
The external parameters are realized through a rotation matrix R and a translation matrix T.
The catheter is placed on a backlight plate, and the images captured by camera 1 and camera 2 are images of the same catheter end face from different viewing angles.
The stereoscopic-vision three-dimensional measurement model is expressed as:
where (x_i, y_i, z_i) are the coordinates of the reconstructed three-dimensional point p_i, (X_1i, Y_1i) are the coordinates of the image edge point l_i in camera 1, (X_2i, Y_2i) are the coordinates of the image edge point q_i in camera 2, f_m and f_n are the focal lengths of camera 1 and camera 2, and R_12 is the transformation matrix between camera 1 and camera 2, whose parameter values are (r_1, r_2, r_3, r_7, r_8, r_9, t_1, t_2, t_3).
The center of the fitted circle is initialized as the centroid of all points of the point cloud, the radius as the average distance from all points of the point cloud to the centroid, and the normal vector of the plane containing the fitted circle is obtained from the normal vector of the plane fitted to all points of the point cloud.
The objective function f is as follows:
where n is the number of points in the point cloud, (a_0, b_0, c_0, d_0) is the normal vector of the plane in which the fitted circle lies, (x_0, y_0, z_0) is the centroid of all points of the point cloud, (x_i, y_i, z_i) is any point of the point cloud, and r_0 is the radius of the fitted circle.
The invention has the following beneficial effects and advantages:
1. The high-precision detection method for the catheter pipe end can effectively reconstruct the three-dimensional coordinates of the pipe-end center point.
2. The high-precision detection method for the catheter pipe end has high measurement accuracy and provides an effective guarantee for the stress-free assembly of the catheter.
3. The high-precision detection method for the catheter pipe end has high measurement efficiency and lays a solid foundation for the digital manufacturing of the catheter.
Drawings
FIG. 1 is a schematic diagram of a hardware apparatus of the method of the present invention;
FIG. 2 is a flow chart of the method of the present invention;
FIG. 3(a) shows the left circular edge coordinate points in the image from camera 1 in the method of the present invention;
FIG. 3(b) shows the right circular edge coordinate points in the image from camera 2 in the method of the present invention;
Fig. 4 shows the three-dimensional point cloud of the reconstructed edge matching points in the method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Referring to fig. 1, the hardware of the method of the present invention includes a backlight plate, the catheter to be measured, four industrial cameras, and a computer serving as the controller. The industrial cameras are combined in pairs to detect and match the projected ellipse of one end face of the catheter. The computer runs the program implementing each step of the method and finally outputs the detection and matching results for the projected ellipses of both catheter end faces to an upper computer. Based on these results, the upper computer can control an actuator to assemble the catheter onto the pre-connection device, thus realizing catheter assembly.
Referring to fig. 2, fig. 3 (a), fig. 3 (b) and fig. 4, the method for detecting the high precision of the end of the catheter according to the present invention comprises the following specific steps:
Step 1: and performing multi-vision calibration on the four industrial cameras to obtain internal parameters of all cameras and a conversion matrix from all camera coordinate systems to a world coordinate system, namely external parameters.
The internal parameters represent a transformation matrix from the industrial camera coordinate system to the image coordinate system; the distortion degree of each lens is different and comprises radial distortion and eccentric distortion; and calibrating the internal parameters of each industrial camera by using the two-dimensional flexible targets.
The extrinsic parameters represent a transformation matrix from the industrial camera coordinate system to the world coordinate system, implemented by a rotation matrix R and a translation matrix T. The world coordinate system is used for the subsequent actual control point coordinate output.
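The patent does not name a specific calibration toolchain; as a minimal sketch, the intrinsic and extrinsic parameters of one camera pair could be obtained with OpenCV's planar-target routines (Zhang-style calibration). The board geometry (9x6 inner corners, 20 mm squares) and the paired image lists are assumptions for illustration only.

```python
import cv2
import numpy as np

# Assumed planar target: a chessboard with 9x6 inner corners and 20 mm squares.
PATTERN = (9, 6)
SQUARE = 20.0  # mm
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

def calibrate_pair(pairs):
    """pairs: list of (gray1, gray2) images of the same target pose.
    Returns the intrinsics of each camera and the rotation R / translation T
    of camera 2 relative to camera 1 (the extrinsic relation used in Step 5)."""
    crit = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    obj_pts, pts1, pts2, size = [], [], [], None
    for g1, g2 in pairs:
        ok1, c1 = cv2.findChessboardCorners(g1, PATTERN)
        ok2, c2 = cv2.findChessboardCorners(g2, PATTERN)
        if not (ok1 and ok2):
            continue  # keep only target poses seen by both cameras
        c1 = cv2.cornerSubPix(g1, c1, (11, 11), (-1, -1), crit)
        c2 = cv2.cornerSubPix(g2, c2, (11, 11), (-1, -1), crit)
        obj_pts.append(objp)
        pts1.append(c1)
        pts2.append(c2)
        size = g1.shape[::-1]
    # Per-camera intrinsics (radial and tangential distortion included).
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, pts1, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, pts2, size, None, None)
    # Relative pose of camera 2 with respect to camera 1.
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, pts1, pts2, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T
```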
Step 2: two industrial cameras, namely a synchronous triggering camera 1 and a camera 2, collect two original catheter images placed on a backlight plate;
Step 3: respectively solving edge image coordinates after subpixel accuracy processing of the catheter end face projection ellipse in the two original catheter images;
Step 4: establishing a corresponding matching point relation of sub-pixel precision edge coordinates of an ellipse projected by one catheter end surface in another image by means of internal and external parameters of a camera, epipolar constraint and sequential consistency;
defining an image two-dimensional edge coordinate point L (L 1,l2,...ln) under the camera 1, and finding a corresponding two-dimensional edge coordinate point Q (Q 1,q2,...qn) of the image under the camera 2 according to the internal and external parameters of the camera, the epipolar constraint and the sequential consistency principle.
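A minimal sketch of this matching step, assuming the edge contours L and Q are ordered consistently (e.g., both traced in the same direction from roughly corresponding start points) and that the fundamental matrix is built from the calibrated intrinsics K1, K2 and the extrinsics (R, T):

```python
import numpy as np

def skew(t):
    """3x3 cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental(K1, K2, R, T):
    """Fundamental matrix F with x2^T F x1 = 0, from calibrated intrinsics
    and the pose (R, T) of camera 2 relative to camera 1."""
    E = skew(np.ravel(T)) @ R
    return np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)

def match_edges(L, Q, F, max_dist=1.0):
    """For each ordered edge point l_i of image 1, pick the edge point q_j of
    image 2 closest to its epipolar line F @ l_i; matches are kept only if the
    index j never decreases (a simple sequential-consistency heuristic)."""
    Lh = np.hstack([L, np.ones((len(L), 1))])     # homogeneous pixel coords
    Qh = np.hstack([Q, np.ones((len(Q), 1))])
    matches, last_j = [], -1
    for i, l in enumerate(Lh):
        a, b, c = F @ l                           # epipolar line in image 2
        dist = np.abs(Qh @ np.array([a, b, c])) / np.hypot(a, b)
        j = int(np.argmin(dist))
        if dist[j] <= max_dist and j >= last_j:
            matches.append((i, j))
            last_j = j
    return matches                  # list of (index in L, index in Q) pairs
```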
Step 5: reconstructing a three-dimensional space coordinate point cloud P (P 1,p2,…pn) of all matching points by using the calibrated space position relation between binocular vision (namely, transformation matrixes among an industrial camera coordinate system, an image coordinate system and a world coordinate system) and using a three-dimensional vision three-dimensional measurement model;
the stereoscopic three-dimensional model is expressed as:
wherein (X i,yi,zi) is the coordinate of the reconstructed three-dimensional space coordinate point p i, (X 1i,Y1i) is the coordinate of an image edge point l i under the camera 1, (X 2i,y2i) is the coordinate of an image edge point q i under the camera 2, f m、fn is the focal ratio of the camera 1 and the camera 2, and the transformation matrix R 12 between the camera 1 and the camera 2 is
(r1,r2,r3,r7,r8,r9,t1,t2,t3) Is thus the parameter value of the transformation matrix R 12.
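The closed-form expression of the model is not reproduced in this text. As a hedged stand-in that plays the same role, the matched edge points can be reconstructed by standard linear (DLT) triangulation from the calibrated projection matrices:

```python
import numpy as np

def reconstruct_points(matches, L, Q, K1, K2, R, T):
    """Linear (DLT) triangulation of the matched sub-pixel edge points.
    P1 = K1 [I | 0] and P2 = K2 [R | T]; the result is an (n, 3) point cloud
    expressed in the camera-1 frame."""
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, np.reshape(T, (3, 1))])
    pts = []
    for i, j in matches:
        u1, v1 = L[i]
        u2, v2 = Q[j]
        A = np.vstack([u1 * P1[2] - P1[0],
                       v1 * P1[2] - P1[1],
                       u2 * P2[2] - P2[0],
                       v2 * P2[2] - P2[1]])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        pts.append(X[:3] / X[3])     # dehomogenize
    return np.asarray(pts)
```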
Step 6: removing outlier points from a three-dimensional space coordinate point cloud of the matched points by utilizing Ransac algorithm, and establishing an initial value of a point cloud fitting circle by utilizing the rest points, wherein the circle center is the mass center of all points on the point cloud, the circle radius is the average distance between all points on the point cloud and the mass center, and the normal vector of the plane of the fitting circle is obtained by the normal vector of the fitting plane of all points on the point cloud;
Step 7: establish the objective function of the point-cloud fitted circle. The objective function consists of two parts: the first is the distance from every point of the point cloud to the plane of the fitted circle, and the second is the distance from every point of the point cloud to the fitted circle center. The objective function f is as follows:
where n is the number of points in the point cloud, (a_0, b_0, c_0, d_0) is the normal vector of the plane in which the fitted circle lies, (x_0, y_0, z_0) is the centroid of all points of the point cloud, (x_i, y_i, z_i) is any point of the point cloud, and r_0 is the radius of the fitted circle.
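The symbolic form of f is not reproduced in this text; the sketch below evaluates the two terms as described, squared point-to-plane distances plus squared deviations of the point-to-center distances from r_0, which is one consistent reading of the description.

```python
import numpy as np

def circle_residuals(P, x0, y0, z0, a0, b0, c0, d0, r0):
    """Two residuals per point, matching the two parts of the objective:
    the distance to the plane (a0, b0, c0, d0) and the deviation of the
    distance to the centre (x0, y0, z0) from the radius r0."""
    centre = np.array([x0, y0, z0])
    normal = np.array([a0, b0, c0])
    plane = (P @ normal + d0) / np.linalg.norm(normal)   # point-to-plane
    radial = np.linalg.norm(P - centre, axis=1) - r0     # point-to-centre - r0
    return plane, radial

def objective(P, params):
    """f = sum of squared plane distances + sum of squared radial deviations."""
    plane, radial = circle_residuals(P, *params)
    return np.sum(plane ** 2) + np.sum(radial ** 2)
```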
Step 8: linearizing f to obtain
Wherein the method comprises the steps of
① Derivation of x 0 by the objective function f
② Derivation of y 0 by objective function f
③ The objective function f derives z 0
④ Derivation of r 0 by the objective function f
⑤ Derivation of a 0 by the objective function f
⑥ Derivation of b 0 by the objective function f
⑦ Derivation of c 0 by the objective function f
⑧ Derivation of d 0 by objective function f
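The analytic derivatives ① to ⑧ are likewise not reproduced here; as a stand-in, the Jacobian of the per-point residuals with respect to the eight variables can be approximated numerically by central differences.

```python
import numpy as np

def stacked_residuals(P, p):
    """Per-point residuals stacked into one vector (plane distances first,
    then radial deviations), for p = (x0, y0, z0, a0, b0, c0, d0, r0)."""
    centre, normal, d0, r0 = p[:3], p[3:6], p[6], p[7]
    plane = (P @ normal + d0) / np.linalg.norm(normal)
    radial = np.linalg.norm(P - centre, axis=1) - r0
    return np.concatenate([plane, radial])

def jacobian(P, p, eps=1e-6):
    """Central-difference approximation of the partial derivatives of the
    residuals with respect to each of the eight variables."""
    p = np.asarray(p, dtype=float)
    r = stacked_residuals(P, p)
    J = np.zeros((r.size, p.size))
    for k in range(p.size):
        dp = np.zeros_like(p)
        dp[k] = eps
        J[:, k] = (stacked_residuals(P, p + dp) -
                   stacked_residuals(P, p - dp)) / (2.0 * eps)
    return J, r
```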
Step 9: in order to make the objective function f=0, let
-f0=Α1x02y03z04a05b06c07d08r0
Using least squares, it can be transformed to mx=b
Wherein the method comprises the steps of
X=[x0,y0,z0,a0,b0,c0,r0]T
B=[f01,f02,…,f0m]T
Each row of the matrix M is a partial derivative value obtained by each point of the point cloud for each variable; t is a transposed symbol; each vector of the matrix B is f 0 obtained by each variable of each point of the point cloud. The final space circle parameter value X can be obtained.
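Putting the linearization and the least-squares solve together, a Gauss-Newton style iteration (building on the stacked_residuals and jacobian sketches above) could look like this; the specific update rule and stopping criterion are illustrative assumptions.

```python
import numpy as np

def fit_space_circle(P, p0, iters=20, tol=1e-10):
    """Gauss-Newton style iteration for Step 9: at every step solve the linear
    least-squares system (the role of M X = B above) for the correction of the
    eight circle parameters, then update until the correction is negligible.
    Uses stacked_residuals / jacobian from the previous sketch."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        J, r = jacobian(P, p)
        dp, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + dp
        p[3:7] /= np.linalg.norm(p[3:6])  # rescale so the plane normal stays unit
        if np.linalg.norm(dp) < tol:
            break
    return p   # (x0, y0, z0, a0, b0, c0, d0, r0) of the end-face circle
```

With the initial values c, r, n, d from initial_circle above, p0 = np.concatenate([c, n, [d, r]]) and fit_space_circle(P_inliers, p0) return the eight parameters of the end-face circle.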
Step 10: repeat steps 2 to 9 with industrial camera 3 and industrial camera 4 to obtain the pipe-end circle parameters at the other end of the catheter.
Step 11: with the pipe-end circle parameters obtained for both ends of the catheter, catheter assembly and other operations can be performed.
The program implementing these steps is loaded on an on-site computer. Four industrial cameras are arranged on site to acquire images and send them to that computer, where they are processed according to the steps above to obtain the actual spatial circle parameters of the pipe ends on both sides of the pipe section. The final spatial circle coordinates are then transmitted to a robot or other device for further operations.
In experiments implemented with VS2010 under a Windows 7 system, the spatial circle measurement accuracy reaches 0.05 mm and the repeatability reaches 0.01 mm. The method offers high measurement accuracy, high efficiency, and high stability.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that various modifications and adaptations can be made without departing from the principles of the present invention, and such modifications and adaptations are intended to be comprehended within the scope of the present invention.

Claims (3)

1. A high-precision detection and positioning method for a catheter end, characterized by comprising the following steps:
Step 1: performing multi-camera calibration of the industrial cameras to obtain the internal parameters and external parameters of the cameras; the internal parameters are the transformation matrix from the industrial camera coordinate system to the image coordinate system and are calibrated for each industrial camera using a two-dimensional flexible target; the external parameters are the transformation matrix from the camera coordinate system to the world coordinate system, realized through a rotation matrix R and a translation matrix T; the world coordinate system is used for outputting the coordinates of the actual control points;
Step 2: synchronously triggering camera 1 and camera 2 to respectively acquire two original catheter end-face images;
Step 3: performing sub-pixel processing on each of the two images to obtain the edge coordinates of the projected ellipse of the catheter end face in each image;
Step 4: according to the internal and external camera parameters, the epipolar constraint, and the sequential-consistency rule, finding in the other image the matching point corresponding to each edge coordinate of the projected ellipse of one catheter end face;
Step 5: reconstructing the three-dimensional point cloud of all matching points by using the calibrated spatial relation of the binocular pair and a stereoscopic-vision three-dimensional measurement model; the reconstructed three-dimensional point cloud is expressed as:
where (x_i, y_i, z_i) are the coordinates of the reconstructed three-dimensional point p_i, (X_1i, Y_1i) are the coordinates of the image edge point l_i in camera 1, (X_2i, Y_2i) are the coordinates of the image edge point q_i in camera 2, f_m and f_n are the focal lengths of camera 1 and camera 2, and the transformation matrix R_12 between camera 1 and camera 2 has the parameter values (r_1, r_2, r_3, r_7, r_8, r_9, t_1, t_2, t_3);
Step 6: removing outliers from the three-dimensional point cloud of the matching points by using the RANSAC algorithm, and using the remaining points as the initial values of the point-cloud fitted circle;
Step 7: establishing an objective function of the point-cloud fitted circle, the objective function comprising the distances from all points of the point cloud to the plane of the fitted circle and the distances from all points of the point cloud to the fitted circle center; the objective function f is as follows:
where n is the number of points in the point cloud, (a_0, b_0, c_0, d_0) is the normal vector of the plane in which the fitted circle lies, (x_0, y_0, z_0) is the centroid of all points of the point cloud, (x_i, y_i, z_i) is any point of the point cloud, and r_0 is the radius of the fitted circle;
Step 8: linearizing the objective function of the point-cloud fitted circle;
Step 9: using the least-squares method, calculating the fitted spatial circle for which the objective function is zero, as the position of the catheter end face.
2. The high-precision detection and positioning method for the catheter end according to claim 1, wherein the catheter is placed on a backlight plate, and the images captured by camera 1 and camera 2 are images of the same catheter end face from different viewing angles.
3. The high-precision detection and positioning method for the catheter end according to claim 1, wherein the center of the fitted circle is the centroid of all points of the point cloud, the radius is the average distance from all points of the point cloud to the centroid, and the normal vector of the plane of the fitted circle is obtained from the normal vector of the plane fitted to all points of the point cloud.
CN202010185998.2A 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end Active CN113409395B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010185998.2A CN113409395B (en) 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010185998.2A CN113409395B (en) 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end

Publications (2)

Publication Number Publication Date
CN113409395A CN113409395A (en) 2021-09-17
CN113409395B (en) 2024-05-07

Family

ID=77677127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010185998.2A Active CN113409395B (en) 2020-03-17 2020-03-17 High-precision detection and positioning method for catheter end

Country Status (1)

Country Link
CN (1) CN113409395B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112381847B (en) * 2020-10-27 2024-02-13 新拓三维技术(深圳)有限公司 Pipeline end space pose measurement method and system
CN114001654B (en) * 2021-11-01 2024-03-26 北京卫星制造厂有限公司 Workpiece end face pose evaluation method


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101876533B (en) * 2010-06-23 2011-11-30 北京航空航天大学 Microscopic stereovision calibrating method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103196370A (en) * 2013-04-01 2013-07-10 北京理工大学 Measuring method and measuring device of conduit connector space pose parameters
CN109470170A (en) * 2018-12-25 2019-03-15 山东大学 Stereoscopic vision space circle pose high-precision measuring method and system based on optimal projection plane
CN110160442A (en) * 2019-04-22 2019-08-23 南京航空航天大学 A kind of flexible measuring tooling and its scaling method for conduit end face of flange vision-based detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Optimization design of digital camera positioning; 张广生; 马志霞; 王齐峰; 史腾飞; 张煜; Journal of Southwest Minzu University (Natural Science Edition), No. 06; full text *

Also Published As

Publication number Publication date
CN113409395A (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN110378969B (en) Convergent binocular camera calibration method based on 3D geometric constraint
CN107367229B (en) Free binocular stereo vision rotating shaft parameter calibration method
CN111415391B (en) External azimuth parameter calibration method for multi-camera by adopting mutual shooting method
CN114399554B (en) Calibration method and system of multi-camera system
CN112465912B (en) Stereo camera calibration method and device
CN113409395B (en) High-precision detection and positioning method for catheter end
CN112648934B (en) Automatic elbow geometric form detection method
CN107941153B (en) Visual system for optimizing calibration of laser ranging
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN109544642B (en) N-type target-based TDI-CCD camera parameter calibration method
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN109974618B (en) Global calibration method of multi-sensor vision measurement system
CN113724337B (en) Camera dynamic external parameter calibration method and device without depending on tripod head angle
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
CN113804128A (en) Double-bearing-hole coaxiality error visual measurement device and measurement method
CN114001651B (en) Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data
CN113310433A (en) Virtual binocular stereo vision measuring method based on line structured light
CN114219866A (en) Binocular structured light three-dimensional reconstruction method, reconstruction system and reconstruction equipment
Zhang et al. Improved camera calibration method and accuracy analysis for binocular vision
CN109285210B (en) Pipeline three-dimensional reconstruction method combining topological relation and epipolar constraint
CN114066976A (en) Catheter center line matching method based on multi-view constraint
CN112489141B (en) Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera
CN112971984B (en) Coordinate registration method based on integrated surgical robot
CN115046497A (en) Improved calibration method based on grating projection measurement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant