CN106691500B - Ultrasonic puncture guide line imaging method based on automatic identification of puncture needle tip


Info

Publication number
CN106691500B
Authority
CN
China
Prior art keywords
puncture
ultrasonic
image
calculating
optical flow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510437528.XA
Other languages
Chinese (zh)
Other versions
CN106691500A (en
Inventor
李凯 (Li Kai)
郑荣琴 (Zheng Rongqin)
任杰 (Ren Jie)
黄翰 (Huang Han)
Current Assignee
Third Affiliated Hospital Sun Yat Sen University
Original Assignee
Third Affiliated Hospital Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Third Affiliated Hospital, Sun Yat-sen University
Priority to CN201510437528.XA
Publication of CN106691500A
Application granted
Publication of CN106691500B

Landscapes

  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses an ultrasonic puncture guide line imaging method based on automatic identification of the puncture needle tip, which comprises the following steps: (a) reading an ultrasonic video image; (b) selecting a tracking discrimination area in the ultrasonic video image; (c) identifying a strong-echo image part in the tracking discrimination area; (d) obtaining the length direction of the strong-echo image; (e) drawing an extension line of the length direction of the strong-echo image in the ultrasonic video image and outputting the video image; (f) repeating steps (c) to (e) to update the position of the extension line in the video image. The method can effectively track the position of the puncture needle and draw the predicted line according to its actual position and state, so that the predicted line changes promptly and accurately as the needle position is adjusted, helping to improve the accuracy and flexibility of the puncture operation.

Description

Ultrasonic puncture guide line imaging method based on automatic identification of puncture needle tip
Technical Field
The invention relates to the field of computer vision, and in particular to an auxiliary imaging method for assisting medical operations.
Background
Ultrasound is a real-time, simple and accurate imaging method. It is not only widely applied in the diagnosis of clinical diseases, but is also the main imaging means for guiding lesion needle biopsy and tumor ablation treatment. During ultrasound-guided biopsy and ablation, ultrasound is mainly used to monitor and guide the puncture needle to the target position.
At present there are three ways of performing ultrasound-guided puncture. (1) Guidance by a puncture guide frame: as shown in fig. 1, a needle groove 4 for guiding the puncture needle 3 is arranged on the puncture guide frame 2, and the guide frame 2 is provided with a clamp for mounting it on the ultrasound probe 1. The moving track of the puncture needle 3 is constrained by the needle groove 4, so the track of the needle 3 is fixed relative to the ultrasound image. Therefore, according to the positional relationship between the needle and the ultrasound probe, a guide line showing the movement trajectory can be generated directly on the image, as shown by the arrow in fig. 2: the operator places the target lesion on the puncture line and then inserts the puncture needle 3 through the needle groove 4 of the guide frame 2 to puncture to the target position. This kind of guidance has two main drawbacks: a. the actual moving track of the needle depends on the actual mounting position of the guide frame, and when the frame is mounted inconsistently with the preset position, the guide line no longer indicates the needle's direction correctly; b. the movement of the needle is limited by the needle groove, while the lesion position may shift during puncture (due to changes in the patient's body position, the patient's respiration, tissue displacement caused by the puncture itself, and so on), so the needle direction needs to be adjusted in real time during the procedure; the needle groove obviously restricts such adjustment and thereby affects puncture accuracy.
Therefore, this method is mostly suitable for beginners or for punctures of ordinary difficulty, and is not suitable for punctures that require high accuracy.
(2) Guidance by a puncture probe: a needle groove capable of guiding the puncture needle is built into the ultrasound probe itself, so that the needle's moving track is fixed in the imaging picture; a guide line showing the track is generated directly during imaging and the operator follows it to complete the puncture. This method solves the problem of inaccurate mounting between the puncture guide frame and the ultrasound probe, but it still does not allow flexible adjustment of the needle direction, so puncture accuracy is not improved.
(3) Freehand puncture: the puncture needle is not fixed by a needle groove but is held only by hand, so the operation is relatively simple and convenient. The needle angle is flexible during freehand puncture and easy to adjust when the target shifts, so the puncture precision can be high. However, there is no puncture line for indication during freehand puncture: the operator must judge entirely by experience whether the needle will reach the target along the current direction, and must continuously adjust the insertion angle during needle insertion to achieve an accurate puncture. The disadvantage of freehand puncture is therefore its high dependence on experience.
Disclosure of Invention
In view of the above, the present invention provides an auxiliary method that allows flexible adjustment of the puncture needle while accurately indicating its movement track.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows: the ultrasonic puncture guide line imaging method based on the automatic identification of the puncture needle tip comprises the following steps:
(a) reading an ultrasonic video image;
(b) selecting a tracking judgment area in an ultrasonic video image;
(c) identifying a strong echo image part in the tracking discrimination area;
(d) drawing an extension line of the length direction of the strong echo image in the ultrasonic video image, and outputting the video image;
(e) repeating steps (c) to (d) to update the position of the extension line in the video image;
In this way the puncture needle can be identified in the ultrasonic image and a movement prediction line drawn according to the trend of the needle's position, so that the puncture operation can be assisted by the prediction line, improving flexibility.
A further technical scheme is that in step (c), the strong-echo image part is identified by an image feature point acquisition algorithm.
Specifically, in step (c), the strong-echo image part of the user-selected area is extracted by the FAST feature point detection algorithm.
Still further, the step (c) comprises:
(c-1) calculating inter-frame motion using an optical flow method and predicting the motion direction;
(c-2) matching the target feature points in the previous frame according to the motion trend of the video.
This effectively reduces the influence of noise and makes the analysis of the strong-echo image more accurate.
Step (c-1) calculates the inter-frame motion using optical flow. Compared with other motion prediction techniques, the optical flow method does not require a stable background, which avoids the influence of image jitter in the video on the drawing of the prediction line. The motion direction is predicted through analysis of the optical flow field.
A still further technical solution is that step (c-1) includes:
(c-11) calculating an optical flow field and counting the direction and amplitude of the optical flows in the field;
(c-12) clustering the optical flows using the ISODATA clustering method;
(c-13) if the clustering result of step (c-12) contains more than one class, regarding the class containing the largest number of optical flows as the flows caused by the puncture, regarding the remaining classes as noise, and removing that noise from the optical flow field;
(c-14) from the screened optical flow field obtained in step (c-13), calculating the mean direction and amplitude of the optical flows; if the mean amplitude is greater than a minimum amplitude threshold, updating the motion direction and amplitude, otherwise keeping the original motion direction and amplitude.
Step (c-2) matches the feature points of the corresponding region according to the motion trend of the optical flow field, and includes:
(c-21) delimiting the area where the target may fall according to the motion direction and amplitude;
(c-22) calculating the feature points of the region in step (c-21);
(c-23) calculating the difference degree between each feature point obtained in step (c-22) and the feature point of the target in the previous frame; if the difference degree is smaller than a maximum difference threshold, the two are considered matched;
(c-24) calculating the Euclidean distance and direction between every two matched feature points obtained in step (c-23);
(c-25) calculating the variances of the Euclidean distances and directions obtained in step (c-24); if the variances are smaller than a maximum matching variance threshold, selecting the point with the strongest intensity among the matched feature points and updating the target position.
This further overcomes the defect of unstable output and improves the noise resistance of the method.
The ultrasonic puncture guide line imaging method based on automatic identification of the puncture needle tip can effectively track the position of the puncture needle, draw the predicted line according to its actual position and state, and make the predicted line change promptly and accurately as the needle position is adjusted, helping to improve the accuracy and flexibility of the puncture operation.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the invention may be more clearly understood and implemented according to the description, and to make the above and other objects, features and advantages of the invention more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic diagram of a puncture guide frame based on the prior art.
Fig. 2 is a schematic diagram of a prior art ultrasound image based on a puncture guide.
FIG. 3 is a flow chart of the present invention.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve its predetermined objects, and their effects, the embodiments, structures, features and effects of the invention are described in detail below with reference to the accompanying drawings and preferred embodiments:
as shown in fig. 3, the ultrasonic puncture guide line imaging method based on automatic identification of the puncture needle tip of the invention comprises the following steps:
(a) reading an ultrasonic video image;
(b) selecting a tracking judgment area in an ultrasonic video image;
(c) identifying a strong echo image part in the tracking discrimination area;
(d) drawing an extension line of the length direction of the strong echo image in the ultrasonic video image, and outputting the video image;
(e) repeating steps (c) to (d) to update the position of the extension line in the video image;
during the ultrasound procedure, the trend of the puncture needle (that is, the prediction line) can be displayed on the ultrasound image according to the needle's actual position and angle, which helps the operating doctor adjust the needle angle and position in time and improves the flexibility and accuracy of the operation.
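Step (d), drawing the extension line, reduces to a small geometry problem: extend a ray from the estimated needle tip along the identified length direction until it first hits an image border. A minimal sketch in Python (the function name and parameters are illustrative, not from the patent; the returned endpoint could then be passed to a drawing routine such as OpenCV's `cv2.line`):

```python
def extension_line(tip, direction, width, height):
    """Return the end point of the prediction line drawn from `tip`
    along `direction`, clipped to the image rectangle.
    `tip` is (x, y); `direction` is a (dx, dy) vector (any scale)."""
    x, y = float(tip[0]), float(tip[1])
    dx, dy = float(direction[0]), float(direction[1])
    # Parametric distances t at which the ray reaches each image border.
    ts = []
    if dx > 0:
        ts.append((width - 1 - x) / dx)
    elif dx < 0:
        ts.append(-x / dx)
    if dy > 0:
        ts.append((height - 1 - y) / dy)
    elif dy < 0:
        ts.append(-y / dy)
    if not ts:
        return (int(round(x)), int(round(y)))  # zero direction: no line
    t = min(ts)  # first border the ray hits
    return (int(round(x + t * dx)), int(round(y + t * dy)))
```

Redrawing this line each frame with the updated tip and direction implements the update in step (e).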
In step (c), the strong-echo image part is extracted by the FAST feature point detection algorithm. The needle tip and shaft of the puncture needle appear as obvious strong echoes in the ultrasound image, and their gray values can be clearly distinguished from other structures (such as muscle and other tissue). The FAST algorithm determines whether a target point is a feature point by judging how much it stands out from the surrounding points, so the possible positions of the puncture needle can be extracted quickly and effectively with the FAST feature point detector. Specifically, in this embodiment, the FAST feature point is computed as follows:
N_FAST(x, y) = Σ_{(x′, y′) ∈ circle(x, y)} [ |I(x, y) − I(x′, y′)| > ε ]

where circle(x, y) is the set of points at a distance of c pixels from the point (x, y); customarily c takes the value 3, in which case circle(x, y) contains 16 pixel points. I(x, y) is the gray value of the point (x, y), and I(x′, y′) is the gray value of a point (x′, y′) among those 16 pixel points. ε is a preset gray difference value that can be set according to the imaging capability of the specific ultrasound equipment, and [·] equals 1 when its condition holds and 0 otherwise. The formula counts the number N_FAST of pixels, among the 16 points at distance 3 around (x, y), whose gray difference from (x, y) is larger than ε. If N_FAST is greater than t_c, the point is regarded as a feature point, otherwise it is not; in this embodiment t_c is set to 12.
The FAST algorithm is only one embodiment of feature extraction; in actual implementation, other operators may be used to extract feature points of the strong-echo image, or the strong-echo image may be identified directly by extracting its boundary (e.g., with a gradient operator).
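The FAST-style test described above, with c = 3 (a 16-point Bresenham circle) and t_c = 12, can be sketched as follows. Note this is the patent's count-only variant: the classical FAST detector additionally requires the differing pixels to form a contiguous arc, which is not implemented here.

```python
import numpy as np

# The 16 offsets of the radius-3 Bresenham circle used by FAST (c = 3).
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_feature_point(img, x, y, eps=20, t_c=12):
    """Count the circle pixels whose gray difference from the center
    exceeds eps; the point is a feature point if the count exceeds t_c."""
    center = int(img[y, x])  # cast avoids uint8 wrap-around in the subtraction
    n_fast = sum(1 for dx, dy in CIRCLE
                 if abs(int(img[y + dy, x + dx]) - center) > eps)
    return n_fast > t_c
```

Here `eps` stands in for the gray threshold ε, whose concrete value depends on the ultrasound equipment; 20 is only a placeholder.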
Specifically, in order to achieve a better fitting effect, step (c) comprises:
(c-1) calculating inter-frame motion using an optical flow method and predicting the motion direction;
(c-2) matching the target feature points in the previous frame according to the motion trend of the video.
This effectively reduces the influence of noise and makes the analysis of the strong-echo image more accurate.
Step (c-1) calculates the inter-frame motion using optical flow. Compared with other motion prediction techniques, the optical flow method does not require a stable background, which avoids the influence of image jitter in the video on the drawing of the prediction line. The motion direction is predicted through analysis of the optical flow field.
A still further technical solution is that step (c-1) includes:
(c-11) calculating an optical flow field and counting the direction and amplitude of the optical flows in the field;
(c-12) clustering the optical flows using the ISODATA clustering method;
(c-13) if the clustering result of step (c-12) contains more than one class, regarding the class containing the largest number of optical flows as the flows caused by the puncture, regarding the remaining classes as noise, and removing that noise from the optical flow field;
(c-14) from the screened optical flow field obtained in step (c-13), calculating the mean direction and amplitude of the optical flows; if the mean amplitude is greater than a minimum amplitude threshold, updating the motion direction and amplitude, otherwise keeping the original motion direction and amplitude, where the minimum amplitude threshold is denoted t_m.
Step (c-1) thus comprises cluster analysis of the optical flow field and analysis of its motion trend. A video is essentially the continuous playback of many image frames; if there is much noise in the video, optical flows with different directions and amplitudes appear and disturb the analysis of the motion trend of the field. Introducing cluster analysis of the optical flow field therefore effectively reduces the influence of video noise on the drawing of the prediction line.
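Steps (c-11) to (c-14) can be sketched as below. This is a simplified stand-in under stated assumptions: a plain 2-means clustering replaces the ISODATA method named in the patent (ISODATA additionally splits and merges clusters), the flow field is assumed already computed (e.g., with an OpenCV dense-flow routine), and the threshold value t_m is illustrative.

```python
import numpy as np

def dominant_motion(flow_vectors, t_m=0.5, iters=10):
    """Cluster per-pixel optical flow vectors, keep the largest cluster as
    the puncture-induced motion, and return its mean (direction_rad,
    magnitude) if the magnitude exceeds the minimum amplitude threshold
    t_m; otherwise return None, meaning the previous estimate is kept."""
    v = np.asarray(flow_vectors, dtype=float)   # shape (n, 2): (dx, dy)
    centers = v[[0, len(v) - 1]].copy()         # deterministic 2-means seeds
    for _ in range(iters):
        d2 = ((v[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = v[labels == j].mean(axis=0)
    biggest = np.bincount(labels, minlength=len(centers)).argmax()
    kept = v[labels == biggest]                 # smaller clusters = noise
    mean_vec = kept.mean(axis=0)
    magnitude = float(np.hypot(mean_vec[0], mean_vec[1]))
    if magnitude > t_m:
        return float(np.arctan2(mean_vec[1], mean_vec[0])), magnitude
    return None
```

Returning None for the below-threshold case mirrors step (c-14)'s rule of keeping the original motion direction and amplitude.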
Step (c-2) matches the feature points of the corresponding region according to the motion trend of the optical flow field, and includes:
(c-21) delimiting the area where the target may fall according to the motion direction and amplitude;
(c-22) calculating the feature points of the region in step (c-21); specifically, these feature points may likewise be computed with the FAST algorithm;
(c-23) calculating the difference degree between each feature point obtained in step (c-22) and the feature point of the target in the previous frame; if the difference degree is smaller than a maximum difference threshold, the two are considered matched;
(c-24) calculating the Euclidean distance and direction between every two matched feature points obtained in step (c-23);
(c-25) calculating the variances of the Euclidean distances and directions obtained in step (c-24); if the variances are smaller than a maximum matching variance threshold, selecting the point with the strongest intensity among the matched feature points and updating the target position; here the maximum matching variance threshold is t_σ = 5.
In step (c-23), given the feature point located at (x_1, y_1) in the current frame and the feature point located at (x_2, y_2) in the previous frame, the difference degree is calculated as:

Diff = Σ_{i=−N}^{N} Σ_{j=−N}^{N} w(i, j) · |f_k(x_1 + i, y_1 + j) − f_{k−1}(x_2 + i, y_2 + j)|

where f_k(x, y) is the intensity value of the pixel at point (x, y) in the k-th frame, N is the neighborhood size, set to the number of pixels occupied by the needle in the image, and w is a weight matrix of size (2N + 1) × (2N + 1), calculated as follows:

[weight-matrix formula for w, reproduced only as an image in the original publication]

The maximum difference threshold t_diff is set to 15; if the difference degree Diff of two feature points is less than t_diff, the two are considered matched, otherwise they are considered not matched.
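The matching test of step (c-23) can be sketched as follows, under stated assumptions: the difference degree is taken as a weighted sum of absolute gray differences over the (2N+1)×(2N+1) neighborhoods, and since the patent's weight-matrix formula is reproduced only as an image, a normalized Gaussian is used as a hypothetical stand-in for w.

```python
import numpy as np

def gaussian_weights(n):
    """Hypothetical (2N+1)x(2N+1) weight matrix, normalized to sum to 1.
    The patent's exact formula for w is not recoverable from the text;
    a centered Gaussian is a common stand-in for such a weighting."""
    ax = np.arange(-n, n + 1)
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2.0 * max(n, 1) ** 2))
    return g / g.sum()

def difference_degree(cur, prev, p1, p2, n):
    """Weighted sum of absolute gray differences between the (2N+1)^2
    neighborhood of p1 in the current frame and of p2 in the previous."""
    (x1, y1), (x2, y2) = p1, p2
    a = cur[y1 - n:y1 + n + 1, x1 - n:x1 + n + 1].astype(float)
    b = prev[y2 - n:y2 + n + 1, x2 - n:x2 + n + 1].astype(float)
    return float((gaussian_weights(n) * np.abs(a - b)).sum())

def is_match(cur, prev, p1, p2, n, t_diff=15.0):
    """Two feature points match when their difference degree is below t_diff."""
    return difference_degree(cur, prev, p1, p2, n) < t_diff
```

With the weights normalized to 1, identical neighborhoods give Diff = 0 and a uniform intensity shift of d gives Diff = d, which makes the role of t_diff = 15 easy to see.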
In step (c-24), the distance and direction between two matched feature points are calculated as:

d = √((x_1 − x_2)² + (y_1 − y_2)²)

θ = arctan((y_2 − y_1) / (x_2 − x_1))

where (x_1, y_1) and (x_2, y_2) are the coordinates of the two matched feature points.
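Steps (c-24) and (c-25) can be sketched together as follows; the function name and the None return convention (meaning "keep the previous target") are illustrative, and np.arctan2 is used for the direction so the quadrant is unambiguous.

```python
import numpy as np

def select_target(points, intensities, t_sigma=5.0):
    """Compute pairwise Euclidean distances and directions between matched
    feature points; if both variances are below the maximum matching
    variance threshold t_sigma, return the strongest point as the new
    target position, otherwise return None (keep the previous target)."""
    pts = np.asarray(points, dtype=float)
    if len(pts) == 1:
        return tuple(pts[0])
    dists, angles = [], []
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            dx, dy = pts[j] - pts[i]
            dists.append(np.hypot(dx, dy))     # Euclidean distance
            angles.append(np.arctan2(dy, dx))  # direction in radians
    if np.var(dists) < t_sigma and np.var(angles) < t_sigma:
        best = int(np.argmax(intensities))     # strongest matched point
        return tuple(pts[best])
    return None
```

Consistent matches (small spread in distance and direction) pass the variance test; scattered, mutually inconsistent matches are rejected, which is the anti-noise behavior the text describes.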
The above embodiments are only preferred embodiments of the present invention, and the protection scope of the present invention is not limited thereby, and any insubstantial changes and substitutions made by those skilled in the art based on the present invention are within the protection scope of the present invention.

Claims (2)

1. The ultrasonic puncture guide line imaging method based on the automatic identification of the puncture needle tip is characterized in that: the method comprises the following steps:
(a) reading an ultrasonic video image;
(b) selecting a tracking judgment area in an ultrasonic video image;
(c) identifying a strong echo image part in the tracking discrimination area;
(d) drawing an extension line of the length direction of the strong echo image in the ultrasonic video image, and outputting the video image;
(e) repeating steps (c) to (d) to update the position of the extension line in the video image;
said step (c) comprises:
(c-1) calculating inter-frame motion using an optical flow method; predicting a motion direction;
(c-2) matching the target feature points in the previous frame according to the motion trend of the video;
the step (c-2) comprises:
(c-21) delimiting an area where the target may fall according to the moving direction and amplitude;
(c-22) calculating feature points of the region in the step (c-21);
(c-23) calculating the difference degree between the feature point obtained in the step (c-22) and the feature point of the target in the previous frame of image, and if the difference degree is smaller than the maximum difference threshold value, determining that the two are matched;
(c-24) calculating the Euclidean distance and direction between every two matched feature points obtained in the step (c-23);
(c-25) calculating the variance of the Euclidean distance and the direction between the matched feature points obtained in the step (c-24), and if the variance is smaller than the maximum matching variance threshold value, selecting the point with the strongest intensity from the matched feature points and updating the target position;
in step (c-23), given the feature point located at (x_1, y_1) in the current frame and the feature point located at (x_2, y_2) in the previous frame, the difference degree is calculated as:

Diff = Σ_{i=−N}^{N} Σ_{j=−N}^{N} w(i, j) · |f_k(x_1 + i, y_1 + j) − f_{k−1}(x_2 + i, y_2 + j)|

where f_k(x, y) is the intensity value of the pixel at point (x, y) in the k-th frame; N is the neighborhood size, set to the number of pixels occupied by the needle in the image; w is calculated as follows:

[weight-matrix formula for w, reproduced only as an image in the original publication]

the maximum difference threshold t_diff is set to 15; if the difference degree Diff of two feature points is less than t_diff, the two are considered matched, otherwise they are considered not matched.
2. The method of claim 1, characterized in that the step (c-1) comprises:
(c-11) calculating an optical flow field and counting the direction and amplitude of the optical flows in the field;
(c-12) clustering the optical flows using the ISODATA clustering method;
(c-13) if the clustering result of step (c-12) contains more than one class, regarding the class containing the largest number of optical flows as the flows caused by the puncture, regarding the remaining classes as noise, and removing that noise from the optical flow field;
(c-14) from the screened optical flow field obtained in step (c-13), calculating the mean direction and amplitude of the optical flows; if the mean amplitude is greater than a minimum amplitude threshold, updating the motion direction and amplitude, otherwise keeping the original motion direction and amplitude.
CN201510437528.XA 2015-07-23 2015-07-23 Ultrasonic puncture guide line imaging method based on automatic identification of puncture needle tip Expired - Fee Related CN106691500B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510437528.XA CN106691500B (en) 2015-07-23 2015-07-23 Ultrasonic puncture guide line imaging method based on automatic identification of puncture needle tip

Publications (2)

Publication Number Publication Date
CN106691500A CN106691500A (en) 2017-05-24
CN106691500B true CN106691500B (en) 2020-06-23

Family

ID=58894747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510437528.XA Expired - Fee Related CN106691500B (en) 2015-07-23 2015-07-23 Ultrasonic puncture guide line imaging method based on automatic identification of puncture needle tip

Country Status (1)

Country Link
CN (1) CN106691500B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3636158A1 (en) * 2018-10-10 2020-04-15 Koninklijke Philips N.V. Image guidance for implanted lead extraction

Citations (2)

Publication number Priority date Publication date Assignee Title
CN102961166A (en) * 2011-08-31 2013-03-13 通用电气公司 Method for detecting and tracing needle
CN103781426A (en) * 2011-09-08 2014-05-07 株式会社日立医疗器械 Ultrasound diagnostic device and ultrasound image display method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9226729B2 (en) * 2010-09-28 2016-01-05 Fujifilm Corporation Ultrasound diagnostic system, ultrasound image generation apparatus, and ultrasound image generation method


Non-Patent Citations (1)

Title
Moving Object Detection and Tracking Technology Based on the Optical Flow Method; Pei Qiaona; Master's thesis, North China University of Technology; 2009-12-31; pp. 3-4, 6-12, 14, 27-30, 48-49 *

Also Published As

Publication number Publication date
CN106691500A (en) 2017-05-24

Similar Documents

Publication Publication Date Title
US9612142B2 (en) Method and system for measuring flow through a heart valve
WO2018035942A1 (en) Automatic tracking apparatus and method for tip of flexible puncture needle
US8303502B2 (en) Method and apparatus for tracking points in an ultrasound image
WO2019100212A1 (en) Ultrasonic system and method for planning ablation
US9256947B2 (en) Automatic positioning of imaging plane in ultrasonic imaging
US9237929B2 (en) System for guiding a medical instrument in a patient body
US11284855B2 (en) Ultrasound needle positioning system and ultrasound needle positioning method utilizing convolutional neural networks
JP5486449B2 (en) Ultrasonic image generating apparatus and method of operating ultrasonic image generating apparatus
US20100022871A1 (en) Device and method for guiding surgical tools
KR20030058423A (en) Method and apparatus for observing biopsy needle and guiding the same toward target object in three-dimensional ultrasound diagnostic system using interventional ultrasound
EP2730229B1 (en) Ultrasound system and method for providing guide line of needle
US10163239B2 (en) Computer-aided diagnostic apparatus and method based on diagnostic intention of user
CN107595390B (en) Real-time matching fusion method of ultrasonic image and CT image
RU2012103004A (en) METHOD AND DEVICE FOR TRACKING IN THE MEDICAL PROCEDURE
JP2011505951A (en) Robot ultrasound system with fine adjustment and positioning control using a feedback responsive to the acquired image data
EP2967424B1 (en) System and methods for processing a biopsy sample
US11666203B2 (en) Using a camera with an ENT tool
TW201347737A (en) Breast ultrasound scanning and diagnosis aid system
US20170084022A1 (en) Real Time Image Based Risk Assessment for an Instrument along a Path to a Target in an object
CN109620303A (en) A kind of lung's aided diagnosis method and device
CN106691500B (en) Ultrasonic puncture guide line imaging method based on automatic identification of puncture needle tip
CN114886521A (en) Device and method for determining the position of a puncture needle
US8663110B2 (en) Providing an optimal ultrasound image for interventional treatment in a medical system
US9538983B2 (en) Device for guiding a medical imaging probe and method for guiding such a probe
US20230177681A1 (en) Method for determining an ablation region based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200623