CN106952347B - Ultrasonic surgery auxiliary navigation system based on binocular vision - Google Patents
- Publication number
- Publication number: CN106952347B (application CN201710190310.8A / CN201710190310A)
- Authority
- CN
- China
- Prior art keywords
- image
- coordinate system
- surgical instrument
- ultrasonic
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/1439—Methods for optical code recognition including a method step for retrieval of the optical code
- G06K7/1443—Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Health & Medical Sciences (AREA)
- Electromagnetism (AREA)
- General Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a binocular-vision-based ultrasonic surgery auxiliary navigation system, belonging to the technical field of visual positioning. Based on binocular vision positioning technology, the invention uses a coordinate system calibration device to fix the binocular vision positioning device and the ultrasonic probe together; points in a world coordinate system are imaged simultaneously, their coordinates in each device's coordinate system are computed, and the conversion relations between the coordinate systems are derived. Through these conversion relations, the surgical instrument and the lesion are fused into the same three-dimensional coordinate system and displayed on an image workstation.
Description
Technical Field
The invention belongs to the field of machine vision, and in particular relates to a binocular-vision-based ultrasonic surgery auxiliary navigation system.
Background
Ultrasound-guided interventional puncture surgery has very important applications in clinical operations for treating various diseases: it can effectively improve surgical precision, shorten operation time, and reduce surgical wounds and complications. The current mainstream puncture method relies on ultrasound image guidance, but it has many limitations. First, it is only applicable when the surgical instrument can be imaged under ultrasound, and many traditional surgical instruments cannot. Second, the ultrasonic imaging area is a limited two-dimensional plane, so the instrument can be observed in the ultrasound image only after entering the imaging area along the imaging plane; in actual surgery, doctors usually operate with a special puncture frame, which greatly reduces the flexibility and operability of the operation.
Percutaneous nephrolithotomy (PNL) is one of the main techniques for treating renal calculi: a puncture needle is inserted through the skin into the kidney's collecting system, a skin-to-kidney surgical channel is then established through fascia dilatation, and the calculi are removed through that channel. The main complications of PNL arise during fascia dilatation, when the surgical channel is established; at present there is no effective imaging-based guidance method for this step.
Radiofrequency ablation generally uses local anesthesia and CT scan positioning: according to the measured distance and angle, the tumor is punctured percutaneously under color B-mode ultrasound guidance, an electrode needle is inserted into the center of the tumor, the electrodes are deployed, and ablation begins. The greatest risk of such surgery is damage to adjacent organs, because the three-dimensional positional relationship between the electrode needle and the tumor cannot be seen in real time.
With the development of medical instrument technology, optical positioning based on machine vision has become the most widely used, most accurate and most reliable positioning method; it has developed into passive and active modes according to whether the observed target emits light. A passive optical positioning system installs reflective markers on the surgical instrument as index points, and the markers reflect light onto a sensor; by accurately locating these index points during the operation, the position and orientation of the instrument can be computed. An active optical positioning system usually installs a set of light-emitting diodes on the instrument as index points; during operation the diodes emit infrared light in a specific sequence, and by imaging the instrument, the binocular camera can determine the diodes' spatial positions from their emission order and their positions in the image. Whether active or passive, the difficulty with existing systems is that the binocular camera is often mounted high up, e.g. on the ceiling, so the index points are frequently occluded during the operation. Although other non-contact positioning methods, such as ultrasonic and electromagnetic positioning, are also being developed and applied to some extent, optical positioning based on machine vision has become the development trend and research hotspot of surgical guidance thanks to its high precision, flexible design and low price.
Disclosure of Invention
In view of the defects of, or need to improve upon, the prior art, the invention provides a binocular-vision-based ultrasonic surgery auxiliary navigation system. Its aim is, based on binocular vision positioning technology, to fix the binocular vision positioning device and the ultrasonic probe together using a coordinate system calibration device; to image points in a world coordinate system simultaneously and compute their coordinates in the respective coordinate systems; to derive the conversion relations between those coordinate systems; and, through those relations, to fuse the surgical instrument and the lesion into the same three-dimensional coordinate system and display them on an image workstation.
In order to achieve the above object, the present invention provides a binocular-vision-based ultrasound surgery auxiliary navigation system, the system comprising:
A binocular vision positioning device, fixed on the ultrasonic probe and comprising two cameras, for acquiring images of the surgical instrument in real time and transmitting them to the image workstation;
A marker, consisting of one or more different bar codes, to be pasted or printed around the surgical instrument;
An ultrasonic device, comprising an ultrasonic probe and an ultrasonic diagnostic apparatus, for performing real-time ultrasonic imaging of human tissue and transmitting the lesion ultrasound images to the image workstation;
An image workstation, for determining matching points from the imaging position relation of the surgical instrument and the marker on the instrument images, computing the three-dimensional coordinates of each point of the instrument from the matching points' disparity and the camera parameters, performing three-dimensional reconstruction, and displaying the instrument and the lesion in the same coordinate system using image fusion technology.
Further, the system still includes coordinate system calibration device, and coordinate system calibration device's main part is a rectangle container, and one side is fixed with chess board check calibration board, and inside is fixed with many nylon wires, and the container top is fixed with an ultrasonic probe rack, wherein:
The ultrasonic probe placing rack is used for fixing the ultrasonic probe in the rectangular container;
The nylon wire is used for ultrasonic imaging of the nylon wire transversely placed in the purified water by the ultrasonic probe;
The rectangular container is used for filling purified water to submerge the nylon thread;
And the chessboard pattern calibration plate is used for carrying out image acquisition on the chessboard pattern calibration plate by the camera.
Furthermore, during coordinate-system calibration, the binocular vision positioning device also acquires images of the checkerboard calibration plate with its cameras and transmits these camera-calibration images to the image workstation.
Furthermore, during coordinate-system calibration, the ultrasonic device also performs ultrasonic imaging of the nylon wires placed transversely in the purified water and transmits the ultrasound calibration images to the image workstation.
Further, the image workstation comprises:
A surgical instrument three-dimensional reconstruction module, for processing the instrument images, taking identical position points in a pair of instrument images as matching points, computing the three-dimensional coordinates of each point of the instrument from the matching points' disparity and the camera parameters, and performing three-dimensional reconstruction;
A coordinate transformation relation acquisition module, for performing three-dimensional reconstruction from the camera-calibration images and the ultrasound calibration images to obtain the coordinates of the mark points in their respective coordinate systems, and computing the transformation between the two-camera three-dimensional coordinate system and the ultrasonic three-dimensional coordinate system from the mark points' coordinates in those systems and in the world coordinate system;
A three-dimensional image fusion module, for displaying the instrument images and the lesion ultrasound images in the same three-dimensional coordinate system using the transformation between the two-camera and ultrasonic-probe three-dimensional coordinate systems.
Further, the surgical instrument three-dimensional reconstruction module comprises:
An image preprocessing unit, for reducing the noise of the original image with a Gaussian blur algorithm;
A bar code coarse positioning unit, for performing edge detection on the denoised image (preferably with a Canny operator), applying a local-maximum downsampling operation to the edge image, applying a morphological closing operation to the downsampled image, and performing edge detection again to obtain a rough outline of the surgical instrument; the approximate positions of the instrument's two boundaries and of the bar code are then obtained by line detection, preferably Hough-transform line detection;
A feature point determination unit, for expanding the instrument's two boundaries outward by a set range and keeping only the edge information between them, applying a morphological closing operation to the edge image and then performing edge detection to obtain the instrument's accurate outline, obtaining the instrument's two boundaries by line detection (preferably Hough-transform line detection), determining the instrument's central axis from the two boundaries, and defining the intersection points of the central axis with the two ends of a complete bar code in the image as the feature points;
A bar code identification unit, for identifying and distinguishing the bar codes on the instrument images according to the bar code coding rule;
A feature point matching unit, for defining the same feature point of the same bar code on the instrument images acquired simultaneously by the two cameras as a pair of matching points;
A surgical instrument three-dimensional reconstruction unit, for obtaining the matching points' three-dimensional coordinates from their disparity and the camera parameters via a binocular-vision three-dimensional reconstruction algorithm, obtaining the three-dimensional coordinates of each point of the instrument in the two-camera three-dimensional coordinate system from the matching points' coordinates and the positions on the instrument of the bar codes containing them, and performing the instrument's three-dimensional reconstruction.
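The reconstruction unit above recovers depth from the disparity of a matched point pair. A minimal sketch of standard rectified-stereo triangulation follows; the focal length, baseline and principal point are illustrative values, not the patent's actual camera parameters:

```python
# Minimal sketch of rectified-stereo triangulation: turn a pair of matched
# feature points into a 3-D coordinate in the left-camera frame. f (pixels),
# B (mm), and (cx, cy) below are made-up example values.

def triangulate(u_left, v_left, u_right, f=800.0, B=60.0, cx=320.0, cy=240.0):
    """Return (X, Y, Z) from a matched point pair.

    Assumes rectified images, so a match lies on the same image row in both
    views and the disparity is purely horizontal.
    """
    d = u_left - u_right          # disparity in pixels
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    Z = f * B / d                 # depth from similar triangles
    X = (u_left - cx) * Z / f
    Y = (v_left - cy) * Z / f
    return X, Y, Z

print(triangulate(400.0, 240.0, 360.0))  # prints (120.0, 0.0, 1200.0)
```

The disparity `d` shrinks as depth grows, which is why distant points are reconstructed less accurately than near ones.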
Further, the coordinate transformation relation acquisition module comprises:
A world three-dimensional coordinate system establishment unit, for selecting one vertex of the coordinate system calibration device as the coordinate origin to establish a world three-dimensional coordinate system;
A camera coordinate transformation relation acquisition unit, for selecting at least three corner points on the checkerboard calibration plate as mark points and measuring their coordinates in the world three-dimensional coordinate system; then obtaining the mark points' coordinates in the two-camera three-dimensional coordinate system from the disparity of the same mark points on the camera-calibration images and the camera parameters, and computing the conversion relation between the two coordinate systems from the mark points' coordinates in the world and two-camera three-dimensional coordinate systems; this is defined as the first conversion relation;
An ultrasonic coordinate transformation relation acquisition unit, for selecting bright spots generated by at least three nylon wires on the ultrasound calibration images as mark points, establishing an ultrasonic probe three-dimensional coordinate system with the origin of the ultrasound calibration image coordinate system as origin and the image plane's longitudinal axis, transverse axis and normal direction as the three coordinate axes, and computing the mark points' coordinates in the probe coordinate system; meanwhile, finding each mark point's actual position on the nylon wire, measuring that position's coordinates in the world three-dimensional coordinate system, and computing the conversion relation between the two coordinate systems from the mark points' coordinates in the world and probe coordinate systems; this is defined as the second conversion relation;
A coordinate transformation relation determination unit, for deriving the transformation between the two-camera and ultrasonic-probe three-dimensional coordinate systems from the first and second conversion relations.
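The determination unit's composition step can be sketched with homogeneous matrices: if T_wc maps world to camera coordinates (first conversion relation) and T_wu maps world to probe coordinates (second conversion relation), the camera-to-probe transform is T_wu · T_wc⁻¹. The example transforms below are made up for illustration:

```python
import numpy as np

# Sketch of deriving the camera -> probe transform from the two
# world-referenced transforms, using 4x4 homogeneous matrices.

def homogeneous(R, t):
    """Pack a 3x3 rotation and 3-vector translation into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_probe(T_wc, T_wu):
    """Compose the first and second conversion relations into one transform."""
    return T_wu @ np.linalg.inv(T_wc)

# world -> camera: 90-degree rotation about Z plus a translation (example)
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
T_wc = homogeneous(Rz, np.array([10.0, 0.0, 5.0]))
# world -> probe: pure translation (example)
T_wu = homogeneous(np.eye(3), np.array([0.0, 20.0, 0.0]))

T_cu = camera_to_probe(T_wc, T_wu)
p_world = np.array([1.0, 2.0, 3.0, 1.0])
# Both routes into probe coordinates must agree.
assert np.allclose(T_cu @ (T_wc @ p_world), T_wu @ p_world)
print("camera->probe transform consistent")
```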
Further, the camera parameters, i.e. the camera's internal and external parameters, are calculated by Zhang Zhengyou's planar template (checkerboard) calibration method.
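What that calibration estimates is the pinhole model's intrinsic parameters (focal length, principal point) and the extrinsics. A minimal sketch of the forward pinhole projection those parameters define, with illustrative values; the checkerboard fitting itself is not reproduced here:

```python
# The pinhole model u = f*x/z + cx, v = f*y/z + cy, applied to a point
# already expressed in the camera frame (extrinsics applied). f, cx, cy
# are made-up example intrinsics; Zhang's method estimates them from
# multiple checkerboard views.

def project(X, f=800.0, cx=320.0, cy=240.0):
    """Project a camera-frame point X = (x, y, z) to pixel coordinates."""
    x, y, z = X
    return (f * x / z + cx, f * y / z + cy)

print(project((0.5, 0.0, 1.0)))  # prints (720.0, 240.0)
```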
In general, compared with the prior art, the technical scheme of the invention has the following technical features and beneficial effects:
The invention overcomes the problems that traditional puncture-intervention guidance cannot be continuous in real time and that the operation is inflexible and inconvenient; it uses a machine-vision-based positioning device and image processing algorithms for auxiliary surgical guidance, with the advantages of high speed, high precision and a small footprint;
Furthermore, the system locates the conversion relation between the two-camera and ultrasonic-probe three-dimensional coordinate systems with the coordinate system calibration device; once calibrated before the operation, it accurately reflects the coordinate conversion between the two cameras and the ultrasound imaging throughout the operation;
Furthermore, the system's marker uses one or more different bar codes, pasted or printed around the surgical instrument, so that when the two cameras image the instrument, the markers can be identified accurately at any relative angle between the cameras and the instrument;
Furthermore, the system's image workstation transforms the acquired instrument images and lesion ultrasound images into three dimensions and, using the transformation between the two-camera and ultrasonic-probe three-dimensional coordinate systems, fuses the instrument and the lesion into the same three-dimensional coordinate system for display; the surgical process is thus shown clearly, intuitively and accurately, effectively improving surgical precision and shortening operation time.
Drawings
FIG. 1 is a schematic diagram of the system operation of an embodiment of the present invention;
FIG. 2 is a surgical instrument having a marker according to an embodiment of the present invention;
FIG. 3a is a noise-reduced image and FIG. 3b is an edge view thereof according to an embodiment of the present invention;
FIGS. 4a, 4b, 4c and 4d are schematic views of a coarse detection process of a surgical instrument according to an embodiment of the present invention;
FIG. 5 is an edge view of a surgical instrument according to an embodiment of the present invention;
FIGS. 6a and 6b are diagrams illustrating the precise detection of a surgical instrument according to an embodiment of the present invention;
FIG. 7 is a boundary view of a surgical instrument according to an embodiment of the present invention;
FIG. 8 is a centerline view of a surgical instrument in accordance with an embodiment of the present invention;
FIG. 9 is a feature point diagram of a marker of an embodiment of the present invention;
FIG. 10 is a diagram of a coordinate system calibration apparatus according to an embodiment of the present invention;
FIG. 11 is an ultrasound image of a marker in a coordinate system calibration device in accordance with an embodiment of the present invention;
FIG. 12 is a schematic diagram of spatial coordinate system transformation according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, the system of the invention comprises:
A binocular vision positioning device, fixed on the ultrasonic probe and comprising two cameras, for acquiring images of the surgical instrument in real time and transmitting them to the image workstation;
A marker, consisting of one or more different bar codes, pasted or printed around the surgical instrument; fig. 2 illustrates a surgical instrument with a plurality of bar codes affixed around it;
An ultrasonic device, comprising an ultrasonic probe and an ultrasonic diagnostic apparatus, for performing real-time ultrasonic imaging of human tissue and transmitting the lesion ultrasound images to the image workstation;
An image workstation, for determining matching points via image processing algorithms from the imaging position relation of the surgical instrument and the marker on the instrument images, computing the three-dimensional coordinates of each point of the instrument from the matching points' disparity and the camera parameters, performing three-dimensional reconstruction, and displaying the instrument and the lesion in the same coordinate system using image fusion technology.
When put into operation, the system of the invention proceeds according to the following steps:
(1) Calculate the camera's internal and external parameters using Zhang Zhengyou's planar template method;
(2) Synchronously acquire marker images with the binocular vision positioning device, identify the information of the complete bar codes in the imaging area from the marker's image features, and compute the coordinates of the matching points in the left and right images;
The intersection points of the instrument's central axis with the upper and lower boundaries of a complete bar code in the image are taken as feature points, and, using the information carried by the bar code, the same feature point of the same bar code in the left and right images is taken as a pair of matching points. Feature-point detection and bar code identification can be realized by the following image processing algorithm:
(21) Preprocess the original image, reducing image noise with Gaussian blur, as shown in fig. 3(a);
(22) Coarsely locate the bar code. Perform edge detection on the original image with a Canny operator, as shown in fig. 3(b). Apply a local-maximum downsampling operation to the edge map, specifically: slide an h × h window over the image with step h (h roughly equal to the bar code's code width), taking the maximum pixel value in the window as the pixel value at the corresponding position of the downsampled image; the result is shown in fig. 4(a). Apply a morphological closing operation to the downsampled image so that the coding region of each code value becomes a single connected domain, as shown in fig. 4(b). Perform edge detection again to obtain the rough outline of the surgical instrument, as shown in fig. 4(c). Obtain the instrument's two boundaries by Hough-transform line detection, as shown in fig. 4(d). These operations quickly yield the approximate position of the bar code and the approximate orientation of the instrument, from which subsequent steps can quickly obtain the instrument's accurate position within a local region of the image;
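The local-maximum downsampling in step (22) can be sketched in a few lines. A pure-Python version for clarity (a real pipeline would use OpenCV/NumPy); the 4 × 4 edge map is a made-up example:

```python
# Sketch of local-maximum downsampling: slide an h x h window with step h
# over the edge image and keep the maximum pixel in each window, shrinking
# the image while preserving edge responses.

def local_max_downsample(img, h):
    """img is a list of equal-length rows of pixel values; h is window/step."""
    rows, cols = len(img), len(img[0])
    out = []
    for r in range(0, rows - h + 1, h):
        out_row = []
        for c in range(0, cols - h + 1, h):
            out_row.append(max(img[r + dr][c + dc]
                               for dr in range(h) for dc in range(h)))
        out.append(out_row)
    return out

edges = [
    [0, 255, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 255, 255],
    [0, 0, 0, 255],
]
print(local_max_downsample(edges, 2))  # prints [[255, 0], [0, 255]]
```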
(23) Determine the bar code's feature points. From the coarse bar code position, expand the instrument's two boundaries outward by a certain range and keep only the edge information between them, as shown in fig. 5. Apply a morphological closing operation to the edge image, as shown in fig. 6(a), then perform edge detection to obtain the instrument's accurate outline, as shown in fig. 6(b). Obtain the instrument's two boundaries by Hough-transform line detection, as shown in fig. 7; the midline of the two boundaries is the instrument's central axis, as shown in fig. 8. The intersection points of the central axis with the two ends of a complete bar code in the image are the feature points, shown as red dots on the bar code in fig. 9;
(24) Identify the bar code information. Different bar codes have different width combinations of black and white bars, with a certain spacing between adjacent codes; once the central axis is located, the gray values of the pixels along it are read, and the information each bar code represents can be identified according to the existing coding rule;
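The core of step (24) is collapsing the gray profile along the central axis into black/white runs whose widths encode the bar code. A sketch under the assumption of a simple fixed threshold; the actual coding rule is defined by the marker design and is not reproduced here:

```python
# Sketch of reading bar widths along the instrument's central axis:
# threshold the 1-D gray profile and run-length encode it. The threshold
# value and the profile below are illustrative.

def run_lengths(gray_line, threshold=128):
    """Collapse a 1-D gray profile into (is_black, width) runs."""
    runs = []
    for g in gray_line:
        black = g < threshold
        if runs and runs[-1][0] == black:
            runs[-1] = (black, runs[-1][1] + 1)
        else:
            runs.append((black, 1))
    return runs

# Example profile: 3 white, 2 black, 1 white, 4 black, 2 white pixels.
profile = [200, 200, 200, 30, 30, 200, 30, 30, 30, 30, 200, 200]
print(run_lengths(profile))
```

The resulting width sequence is what a coding rule would map to a bar code identity.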
(3) Reconstruct the three-dimensional coordinates of the specified bar code positions by a binocular-vision three-dimensional reconstruction algorithm, and compute the instrument's three-dimensional coordinates in the vision system's coordinate system by combining the bar codes' coded information.
Specifically: after the bar code feature points and bar code information in the left and right images are obtained in step (2), the feature points are matched first; if the bar code information of the left and right images is consistent, the feature points are taken as a pair of matching points, and the three-dimensional coordinates of the two end points on the bar code's central axis are then obtained from the matching points' disparity and the camera parameters obtained in step (1). Since the position of each bar code on the instrument is fixed when the marker is designed, the relative position of each bar code to the instrument's tip point is known, and the tip's three-dimensional coordinates can be computed from the three-dimensional coordinates of the bar code's two end points and the distance from the tip to the bar code end point;
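The tip computation at the end of step (3) is a straight-line extrapolation: the two bar code end points fix the axis direction, and the known design distance fixes how far along it the tip lies. A sketch with illustrative coordinates:

```python
import math

# Sketch of extrapolating the instrument tip from the reconstructed 3-D
# coordinates of a bar code's two axis end points plus the (known by
# marker design) distance from the far end point to the tip. All numeric
# values are made up for the example.

def tip_point(p_near, p_far, dist_to_tip):
    """p_near/p_far are the bar code's axis end points (p_far nearer the
    tip); dist_to_tip is the distance from p_far to the tip along the axis."""
    axis = [b - a for a, b in zip(p_near, p_far)]
    norm = math.sqrt(sum(v * v for v in axis))
    unit = [v / norm for v in axis]
    return [p + dist_to_tip * u for p, u in zip(p_far, unit)]

p1 = [0.0, 0.0, 100.0]
p2 = [0.0, 0.0, 110.0]          # a 10 mm bar code along +Z
print(tip_point(p1, p2, 25.0))  # prints [0.0, 0.0, 135.0]
```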
(4) Solve the transformation between the binocular vision coordinate system and the two-dimensional ultrasonic imaging plane coordinate system with a self-designed coordinate system calibration device.
Specifically: the tip coordinates obtained in step (3) are expressed in the coordinate system of the binocular vision positioning device, which is in fact a coordinate system whose origin is the optical center of the left camera, while the position of the lesion in the ultrasound image is expressed in the two-dimensional coordinate system of the ultrasound image. A transformation between the ultrasound image coordinate system and the vision coordinate system therefore needs to be established; it can be understood as placing the ultrasound image into the three-dimensional space coordinate system of the vision system.
To complete this calibration, the invention provides a calibration device whose main body is a cuboid container: a checkerboard calibration plate is fixed on one side, a number of nylon wires are fixed inside, and an ultrasonic probe rack is fixed on top, as shown in fig. 10. After the container is filled with purified water, the ultrasonic probe is fixed in place; when the ultrasonic imaging plane cuts through the nylon wires, several bright-spot feature points appear on the ultrasound image, as shown in fig. 11.
One vertex of the calibration device is chosen as the origin of a three-dimensional rectangular coordinate system (the world coordinate system). Three (or more) suitable corner points on the checkerboard calibration plate are chosen as mark points, and their world coordinates are obtained by measurement. The mark points' three-dimensional coordinates in the binocular vision system's coordinate system (origin at the left camera's optical center) are then acquired, with the camera's internal and external parameters calibrated by Zhang Zhengyou's planar template method. The conversion relation between the world coordinate system and the binocular vision system's coordinate system is computed from the mark points' coordinates in the two systems.
Similarly, with the probe held by the ultrasonic probe rack, the transversely placed nylon wires in the water are imaged; bright spots generated by three (or more) nylon wires are chosen as mark points, their three-dimensional world coordinates are obtained by measurement, and their two-dimensional coordinates in the ultrasound image coordinate system are read off manually. A three-dimensional rectangular coordinate system of the ultrasonic probe is established with the origin of the image coordinate system as origin and the image plane's longitudinal axis, transverse axis and normal direction as the three coordinate axes; the mark points' three-dimensional coordinates in the probe coordinate system then follow directly. The transformation between the probe coordinate system and the world coordinate system can be computed from the mark points' coordinates in the two systems.
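Lifting a 2-D ultrasound image point into the probe's 3-D coordinate system defined above is then trivial: the image axes become two coordinate axes and the image-plane normal the third, so every image point has a zero normal component. A sketch with illustrative pixel spacings (mm/pixel):

```python
# Sketch of the ultrasound-image -> probe-coordinate mapping: scale pixel
# coordinates to millimetres and set the out-of-plane (normal) component
# to zero. The pixel spacings sx, sy are made-up example values.

def ultrasound_to_probe3d(u, v, sx=0.25, sy=0.25):
    """(u, v) in pixels -> (x, y, z) in mm in the probe coordinate system."""
    return (u * sx, v * sy, 0.0)

print(ultrasound_to_probe3d(150, 50))  # prints (37.5, 12.5, 0.0)
```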
therefore, taking the world coordinate system as the reference, the transformation relation between the ultrasonic probe coordinate system and the binocular vision system coordinate system can be obtained, i.e. the transformation relation between the ultrasonic image coordinate system and the vision system coordinate system;
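This chaining of the two calibrated transformations through the common world coordinate system can be sketched with 4×4 homogeneous matrices as follows; the function names are illustrative, and the two input transformations are assumed to have already been estimated from the mark points:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix R and translation vector t into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def probe_to_camera(T_cam_world, T_probe_world):
    """Compose: (probe -> camera) = (world -> camera) o (probe -> world).

    T_cam_world maps world coordinates to camera coordinates;
    T_probe_world maps world coordinates to probe coordinates.
    """
    return T_cam_world @ np.linalg.inv(T_probe_world)
```

A point expressed in the ultrasonic probe frame can then be carried into the binocular frame with a single matrix product.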
the transformation relation between two coordinate systems is calculated from three or more common points by adopting a spatial transformation method based on the Rodrigues matrix; the principle of the method is as follows:
Two space rectangular coordinate systems are denoted O-XYZ and O'-X'Y'Z' respectively, see FIG. 12. The origins of the two coordinate systems do not coincide, so there are three translation parameters ΔX, ΔY and ΔZ; the coordinate axes of the two systems are not mutually parallel, so there are also three rotation parameters. The coordinates of the same point A in the two coordinate systems can be expressed as (X, Y, Z) and (X', Y', Z') respectively;
obviously, the two coordinate systems can be brought into coincidence through translation and rotation of the coordinate axes, and the transformation relationship between the coordinates is:

[X, Y, Z]T = R[X', Y', Z']T + [ΔX, ΔY, ΔZ]T, (1)

where the superscript T denotes transposition;
Conventionally, R is called the rotation matrix and [ΔX, ΔY, ΔZ]T the translation matrix. Conversion between the two coordinate systems can therefore be realized by directly solving for the rotation matrix and the translation matrix. In order to solve for the transformation parameters between the coordinate systems, a certain number (at least three) of common points with coordinates in both systems must be provided.
According to the Rodrigues matrix, the rotation matrix R can be represented as:
R = (I - S)^(-1) (I + S), (2)
wherein I is the identity matrix and S is an antisymmetric matrix. Therefore:

S = |  0  -c   b |
    |  c   0  -a |
    | -b   a   0 |, (3)

in which a, b and c are the Rodrigues parameters.
Subtracting the coordinates of pairs of common points eliminates the translation matrix; substituting equations (2) and (3) into the result and then expanding and rearranging gives, for common points i and j with coordinates Pi = (Xi, Yi, Zi)T and P'i = (X'i, Y'i, Z'i)T in the two systems:

(Pi - Pj) - (P'i - P'j) = S[(Pi - Pj) + (P'i - P'j)], (4)

which is linear in the Rodrigues parameters a, b and c.
For n common points, equation (4) can be solved directly for the Rodrigues parameters according to the least-squares principle; the Rodrigues parameters are then substituted into equation (2) to obtain the rotation matrix R, and finally R and one group of common points are substituted into equation (1) to obtain the translation matrix.
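The least-squares procedure described above can be sketched in numpy as follows, assuming noise-free common points for illustration; the names `skew` and `rodrigues_fit` are illustrative, and the linear system is built from equation (4) using point differences taken relative to the first common point:

```python
import numpy as np

def skew(x):
    """Antisymmetric matrix S of the Rodrigues parameters (a, b, c), eq. (3)."""
    a, b, c = x
    return np.array([[0.0, -c, b], [c, 0.0, -a], [-b, a, 0.0]])

def rodrigues_fit(P, Q):
    """Estimate R, T such that P[i] = R @ Q[i] + T from n >= 3 common points."""
    P = np.asarray(P, float)
    Q = np.asarray(Q, float)
    A_rows, b_rows = [], []
    for i in range(1, len(P)):
        d, dp = P[i] - P[0], Q[i] - Q[0]      # differences eliminate T
        s = d + dp
        # Eq. (4): d - d' = S (d + d'); since S v = (a, b, c) x v, each
        # difference contributes three linear equations in (a, b, c).
        A_rows.append(np.array([[0.0, s[2], -s[1]],
                                [-s[2], 0.0, s[0]],
                                [s[1], -s[0], 0.0]]))
        b_rows.append(d - dp)
    A = np.vstack(A_rows)
    b = np.concatenate(b_rows)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares Rodrigues params
    S = skew(x)
    I = np.eye(3)
    R = np.linalg.inv(I - S) @ (I + S)         # equation (2)
    T = (P - Q @ R.T).mean(axis=0)             # back-substitute via eq. (1)
    return R, T
```

With noisy measurements the same code returns the least-squares estimate; the Cayley form in equation (2) guarantees that R is a proper rotation matrix.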
(5) the surgical instrument and the ultrasonic image are unified under the binocular vision three-dimensional coordinate system and are displayed in projection on the screen of the image workstation.
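As a sketch of this final display step, points already expressed in the binocular three-dimensional coordinate system can be projected onto the workstation screen with a standard pinhole model; the intrinsic matrix K and the function name below are illustrative only:

```python
import numpy as np

def project_points(K, pts_cam):
    """Perspective-project 3-D points (binocular/camera frame) to pixels.

    K is the 3x3 intrinsic matrix; pts_cam is an (n, 3) array of points
    with positive depth (z > 0) in the camera coordinate system.
    """
    pts = np.asarray(pts_cam, float)
    uvw = pts @ K.T                      # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]      # divide by depth -> pixel (u, v)
```

Both the reconstructed instrument points and the transformed ultrasonic image points can be passed through the same projection so that instrument and lesion appear fused in one view.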
the above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.
Claims (7)
1. a binocular vision based ultrasound surgery assisted navigation system, the system comprising:
The binocular vision positioning device is fixed on the ultrasonic probe, comprises two cameras and is used for acquiring images of surgical instruments in real time and transmitting the images to the image workstation;
the marker is one or more different bar codes and is used for being pasted or printed around the circumference of the surgical instrument;
the ultrasonic diagnostic apparatus comprises an ultrasonic probe and an ultrasonic diagnostic apparatus, and is used for carrying out ultrasonic imaging on human tissues in real time and transmitting focus ultrasonic imaging to the image workstation;
the image workstation is used for determining a matching point according to the imaging position relation of the surgical instrument and the marker on the image of the surgical instrument, calculating the three-dimensional coordinates of each point of the surgical instrument by using the parallax of the matching point and the parameters of the camera, carrying out three-dimensional reconstruction, and displaying the surgical instrument and the focus under the same coordinate system by using an image fusion technology;
the image workstation comprises:
The surgical instrument three-dimensional reconstruction module is used for carrying out image processing on the surgical instrument images, acquiring the same position points in a pair of surgical instrument images as matching points, calculating the three-dimensional coordinates of each point of the surgical instrument by using the parallax of the matching points and the camera parameters and carrying out three-dimensional reconstruction;
The surgical instrument three-dimensional reconstruction module comprises:
The image preprocessing unit is used for reducing the noise of the original image by adopting a Gaussian fuzzy algorithm;
The bar code coarse positioning unit is used for carrying out edge detection on the image subjected to noise reduction, carrying out local maximum value down-sampling operation on the edge image, carrying out morphological closing operation on the down-sampled image, and then carrying out edge detection again to obtain an approximate outline image of the surgical instrument; then two boundaries of the surgical instrument and the approximate position of the bar code are obtained through linear detection;
The characteristic point determining unit is used for expanding the two boundaries of the surgical instrument outwards to a set range, only reserving edge image information between the two boundaries, performing morphological closed operation on the edge image, then performing edge detection to obtain an accurate outline of the surgical instrument, obtaining the two boundaries of the surgical instrument through linear detection, determining a central axis of the surgical instrument according to the two boundaries, and defining intersection points of the central axis and the two ends of the complete bar code in the image as characteristic points;
the bar code identification unit is used for identifying and distinguishing the bar code on the surgical instrument image according to a bar code coding rule;
the characteristic point matching unit is used for defining the same characteristic point on the same bar code on the surgical instrument image acquired by the two cameras at the same time into a pair of matching points;
And the surgical instrument three-dimensional reconstruction unit is used for obtaining the three-dimensional coordinates of the matching points by using the parallax of the matching points and the camera parameters through a binocular vision three-dimensional reconstruction algorithm, obtaining the three-dimensional coordinates of each point of the surgical instrument in the two-camera three-dimensional coordinate system according to the three-dimensional coordinates of the matching points and the positions on the surgical instrument of the bar codes where the matching points are located, and performing surgical instrument three-dimensional reconstruction.
2. The binocular vision-based ultrasonic surgery aided navigation system as claimed in claim 1, further comprising a coordinate system calibration device, wherein the coordinate system calibration device is a rectangular container, a checkerboard calibration plate is fixed on one side of the rectangular container, a plurality of nylon wires are fixed inside the rectangular container, and an ultrasonic probe placing rack is fixed above the rectangular container, wherein:
the ultrasonic probe placing rack is used for fixing the ultrasonic probe in the rectangular container;
the nylon wire is used for ultrasonic imaging of the nylon wire transversely placed in the purified water by the ultrasonic probe;
the rectangular container is used for filling purified water to submerge the nylon thread;
and the chessboard pattern calibration plate is used for carrying out image acquisition on the chessboard pattern calibration plate by the camera.
3. the binocular vision based ultrasonic surgery aided navigation system as claimed in claim 2, wherein the binocular vision positioning device is further used for acquiring images of the checkerboard calibration plate by the camera when positioning the coordinate system, and transmitting the acquired camera calibration images to the image workstation.
4. the binocular vision based ultrasonic surgery aided navigation system as claimed in claim 2, wherein the ultrasonic diagnostic apparatus is further configured to, when performing coordinate system positioning, perform ultrasonic imaging on a nylon thread transversely placed in purified water by the ultrasonic probe, and transmit ultrasonic calibration imaging to the image workstation.
5. the binocular vision based ultrasound surgery assisted navigation system of claim 1 or 2, wherein the image workstation further comprises:
The coordinate transformation relation acquisition module is used for carrying out three-dimensional reconstruction according to the camera calibration image and the ultrasonic calibration imaging to obtain coordinate values of the mark points of the camera calibration image and the ultrasonic calibration imaging in respective coordinate systems, and calculating the transformation relation between the two-camera three-dimensional coordinate system and the ultrasonic three-dimensional coordinate system by using the coordinate values of the mark points and the coordinate values of the mark points in the world coordinate system;
and the three-dimensional image fusion module is used for displaying the surgical instrument image and the focus ultrasonic imaging in the same three-dimensional coordinate system by utilizing the transformation relation between the two-camera three-dimensional coordinate system and the ultrasonic probe three-dimensional coordinate system.
6. The binocular vision based ultrasonic surgery assisted navigation system of claim 5, wherein the coordinate transformation relation obtaining module comprises:
the world three-dimensional coordinate system establishing unit is used for selecting one vertex of the coordinate system calibration device as a coordinate origin to establish a world three-dimensional coordinate system;
the shooting coordinate transformation relation acquisition unit is used for selecting at least three angular points on the chessboard pattern calibration plate as mark points and measuring and acquiring the coordinates of the mark points in a world three-dimensional coordinate system; then, obtaining coordinate values of the mark points under the double-camera three-dimensional coordinate system by utilizing the parallax and the camera parameters of the same mark points on the camera calibration image, and calculating a conversion relation between the two coordinate systems through the coordinate values of the mark points under the world three-dimensional coordinate system and the double-camera three-dimensional coordinate system to define the conversion relation as a first conversion relation;
the ultrasonic coordinate transformation relation acquisition unit is used for selecting bright spots generated by at least three nylon lines on the ultrasonic calibration imaging as mark points, establishing an ultrasonic probe three-dimensional coordinate system by taking the origin of the ultrasonic calibration imaging coordinate system as the origin and taking the longitudinal axis of the image plane, the transverse axis of the image plane and the normal direction of the image plane as three coordinate axes, and calculating coordinate values of the mark points under the ultrasonic probe three-dimensional coordinate system; meanwhile, finding the actual position point of the same mark point on the nylon line, measuring the coordinate value of the position point under the world three-dimensional coordinate system, and calculating the conversion relation between the two coordinate systems through the coordinate values of the mark points under the world three-dimensional coordinate system and the ultrasonic probe three-dimensional coordinate system, which is defined as the second conversion relation;
and a coordinate transformation relation determination unit for deriving a transformation relation between the two-camera three-dimensional coordinate system and the ultrasonic probe three-dimensional coordinate system from the first transformation relation and the second transformation relation.
7. the binocular vision-based ultrasonic surgery assisted navigation system of claim 1, wherein the internal and external parameters of the camera are calculated by the Zhang Zhengyou planar template method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710190310.8A CN106952347B (en) | 2017-03-28 | 2017-03-28 | Ultrasonic surgery auxiliary navigation system based on binocular vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106952347A CN106952347A (en) | 2017-07-14 |
CN106952347B true CN106952347B (en) | 2019-12-17 |
Family
ID=59473216
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710190310.8A Expired - Fee Related CN106952347B (en) | 2017-03-28 | 2017-03-28 | Ultrasonic surgery auxiliary navigation system based on binocular vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106952347B (en) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108830905A (en) * | 2018-05-22 | 2018-11-16 | 苏州敏行医学信息技术有限公司 | Binocular calibration and positioning method for simulated medical instruments and virtual simulation medical teaching system
CN109118545B (en) * | 2018-07-26 | 2021-04-16 | 深圳市易尚展示股份有限公司 | Three-dimensional imaging system calibration method and system based on rotating shaft and binocular camera |
CN108969099B (en) * | 2018-07-26 | 2020-02-18 | 艾瑞迈迪医疗科技(北京)有限公司 | Correction method, surgical navigation system, electronic device and storage medium |
CN108992084B (en) * | 2018-09-07 | 2023-08-01 | 广东工业大学 | Method for imaging by using combination of CT system and ultrasonic system and CT-ultrasonic inspection equipment |
CN109171808A (en) * | 2018-09-07 | 2019-01-11 | 东南大学 | Three-dimension ultrasonic imaging system based on measuring three-dimensional profile |
CN109493386B (en) * | 2018-11-26 | 2022-05-06 | 刘伟民 | Surgical instrument transmission device based on image recognition and control method |
CN109480906A (en) * | 2018-12-28 | 2019-03-19 | 无锡祥生医疗科技股份有限公司 | Ultrasonic transducer navigation system and supersonic imaging apparatus |
EP3986279A4 (en) * | 2019-06-24 | 2023-06-28 | Dm1 Llc | Optical system and apparatus for instrument projection and tracking |
CN110532905B (en) * | 2019-08-13 | 2022-11-22 | 齐霄强 | Auxiliary operation and control method for operation images of self-loading and unloading mechanism |
CN113048878B (en) * | 2019-12-27 | 2023-08-29 | 苏州因确匹电子科技有限公司 | Optical positioning system and method and multi-view three-dimensional reconstruction system and method |
CN111260765B (en) * | 2020-01-13 | 2023-04-28 | 浙江未来技术研究院(嘉兴) | Dynamic three-dimensional reconstruction method for microsurgery field |
CN113436265A (en) * | 2020-03-08 | 2021-09-24 | 天津理工大学 | Binocular calibration device for thermal infrared imager and visible light camera and use method thereof |
CN113313754A (en) * | 2020-12-23 | 2021-08-27 | 南京凌华微电子科技有限公司 | Bone saw calibration method and system in surgical navigation |
CN112704514B (en) * | 2020-12-24 | 2021-11-02 | 重庆海扶医疗科技股份有限公司 | Focus positioning method and focus positioning system |
CN112833788B (en) * | 2021-01-07 | 2022-07-08 | 深圳许多米科技有限公司 | Gun body positioning method, device, equipment, simulation gun and readable storage medium |
CN112734776B (en) * | 2021-01-21 | 2023-04-18 | 中国科学院深圳先进技术研究院 | Minimally invasive surgical instrument positioning method and system |
CN112971991B (en) * | 2021-02-05 | 2022-11-11 | 上海电气集团股份有限公司 | Method and device for controlling movement of mechanical arm system |
CN113648060B (en) * | 2021-05-14 | 2024-02-27 | 上海交通大学 | Ultrasonic guided soft tissue deformation tracking method, device, storage medium and system |
CN115252992B (en) * | 2022-07-28 | 2023-04-07 | 北京大学第三医院(北京大学第三临床医学院) | Trachea cannula navigation system based on structured light stereoscopic vision |
CN117237268A (en) * | 2022-11-18 | 2023-12-15 | 杭州海康慧影科技有限公司 | Ultrasonic image processing method and device, electronic equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102860841A (en) * | 2012-09-25 | 2013-01-09 | 陈颀潇 | Aided navigation system and method of puncture operation under ultrasonic image |
CN103705307A (en) * | 2013-12-10 | 2014-04-09 | 中国科学院深圳先进技术研究院 | Surgical navigation system and medical robot |
CN104933718A (en) * | 2015-06-23 | 2015-09-23 | 广东省自动化研究所 | Physical coordinate positioning method based on binocular vision |
CN105942959A (en) * | 2016-06-01 | 2016-09-21 | 安翰光电技术(武汉)有限公司 | Capsule endoscope system and three-dimensional imaging method thereof |
Non-Patent Citations (1)
Title |
---|
Research on Key Technologies of Interventional Surgical Navigation Guided by Ultrasound Images; Zhu Yongjie; China Master's Theses Full-text Database; 2017-03-15 (No. 3); pp. 11-16 *
Also Published As
Publication number | Publication date |
---|---|
CN106952347A (en) | 2017-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106952347B (en) | Ultrasonic surgery auxiliary navigation system based on binocular vision | |
CN107456278B (en) | Endoscopic surgery navigation method and system | |
US10593052B2 (en) | Methods and systems for updating an existing landmark registration | |
CN109464196B (en) | Surgical navigation system adopting structured light image registration and registration signal acquisition method | |
US5999840A (en) | System and method of registration of three-dimensional data sets | |
US11986252B2 (en) | ENT image registration | |
US5792147A (en) | Video-based systems for computer assisted surgery and localisation | |
CN101474075B (en) | Navigation system of minimal invasive surgery | |
KR101572487B1 (en) | System and Method For Non-Invasive Patient-Image Registration | |
CN107689045B (en) | Image display method, device and system for endoscope minimally invasive surgery navigation | |
US20070276234A1 (en) | Systems and Methods for Intraoperative Targeting | |
US7369694B2 (en) | Method and medical device for the automatic determination of coordinates of images of marks in a volume dataset | |
US20110033100A1 (en) | Registration of three-dimensional image data to 2d-image-derived data | |
CN113974830B (en) | Surgical navigation system for ultrasonic guided thyroid tumor thermal ablation | |
CN110279467A (en) | Ultrasound image under optical alignment and information fusion method in the art of puncture biopsy needle | |
CN106236264B (en) | Gastrointestinal surgery navigation method and system based on optical tracking and image matching | |
CN114092480B (en) | Endoscope adjusting device, surgical robot and readable storage medium | |
CN114373003B (en) | Binocular vision-based passive infrared marking surgical instrument registration method | |
CN108113629B (en) | Hard tube endoscope rotation angle measuring method and device | |
EP3292835B1 (en) | Ent image registration | |
CN108143501B (en) | Anatomical projection method based on body surface vein features | |
US9386908B2 (en) | Navigation using a pre-acquired image | |
CN108836377A (en) | A kind of method of collecting device for outline and registered placement | |
Wang et al. | Ultrasound tracking using probesight: camera pose estimation relative to external anatomy by inverse rendering of a prior high-resolution 3d surface map | |
CN220193149U (en) | Surgical instrument calibration device for surgical navigation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20191217 ||