CN112137693B - Imaging method and device for four-dimensional ultrasonic guided puncture


Info

Publication number
CN112137693B
CN112137693B
Authority
CN
China
Prior art keywords
information
rendering
dimensional
target tissue
puncture needle
Prior art date
Legal status
Active
Application number
CN202010935904.9A
Other languages
Chinese (zh)
Other versions
CN112137693A (en)
Inventor
魏芳
丁浩
邢锐桐
孙瑞超
李彬
陈晶
Current Assignee
Shenzhen Lanying Medical Technology Co ltd
Original Assignee
Shenzhen Lanying Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Lanying Medical Technology Co ltd filed Critical Shenzhen Lanying Medical Technology Co ltd
Priority to CN202010935904.9A
Publication of CN112137693A
Application granted
Publication of CN112137693B

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34: Trocars; Puncturing needles
    • A61B17/3403: Needle locating or guiding means
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833: Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/0841: Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48: Diagnostic techniques
    • A61B8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34: Trocars; Puncturing needles
    • A61B17/3403: Needle locating or guiding means
    • A61B2017/3413: Needle locating or guiding means guided by ultrasound

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Image Generation (AREA)

Abstract

The application provides an imaging method and device for four-dimensional ultrasound-guided puncture. The method acquires real-time ultrasonic volume data and generates a real-time three-dimensional ultrasonic image to assist medical staff in clinical guided puncture; the ultrasonic volume data are a set of multiple temporally continuous two-dimensional ultrasonic image data. The method comprises the following steps: determining overall spatial information in the ultrasonic volume data, and generating three-dimensional rendering space information according to the overall spatial information; determining puncture needle information in the ultrasonic volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information; determining target tissue area information in the ultrasonic volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information; and generating a three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information. The resulting three-dimensional ultrasonic image clearly and intuitively displays the spatial position of the puncture needle and the spatial anatomical structure of the target and surrounding tissues.

Description

Imaging method and device for four-dimensional ultrasonic guided puncture
Technical Field
The application relates to the field of medical detection, in particular to a four-dimensional ultrasonic guided puncture imaging method and device.
Background
In clinical examination and diagnosis, puncture is mainly used to confirm a diagnosis, estimate the prognosis or stage of a disease, and assist decisions on treatment schemes. Because human tissue structures differ from patient to patient, performing puncture safely and effectively is a major difficulty for doctors. With imaging technology as an auxiliary means, the puncture process and the surrounding tissue structures can be displayed visually, which greatly improves the accuracy, safety and efficiency of puncture, strengthens the confidence of clinicians, and reduces injury to patients. Compared with other imaging technologies (such as CT and MRI), ultrasound-guided puncture has the advantages of real-time visualization, safety, absence of radiation, portability and low cost, and shows great potential in clinical applications: ultrasound can guide vascular puncture, nerve block, pain management, biopsy puncture, interventional therapy and the like, allowing the target to be found safely and accurately.
However, the current ultrasound-guided puncture technique has limitations. When the puncture needle and the ultrasonic beam through the target tissue are not in the same plane, the needle tip sometimes cannot be seen, so the doctor needs considerable skill in coordinating the probe and the puncture needle. In addition, the strength of the echo signal reflected by the puncture needle depends on the emission angle of the ultrasonic beam, and specular reflection from the needle can blur the image of the needle tip and hinder the doctor's observation.
The techniques currently proposed to overcome these deficiencies depend greatly on the professional background and operating skill of the sonographer, and have the following disadvantages. When the puncture needle and the 2D image of the target tissue are not in the same plane, an insufficiently clear needle image makes it difficult for the doctor to judge the position of the needle tip relative to the target tissue. Although real-time three-dimensional imaging can directly capture puncture needle information in the scanning space, no satisfactory visualization method yet exists that clearly distinguishes the puncture needle from the surrounding tissue and intuitively conveys its spatial position; the common practice of displaying needle-tip information on several two-dimensional section views still fails to convey the spatial relationship. These disadvantages all create difficulties for clinical tasks such as dynamically tracking the insertion track of the puncture needle in real time, guiding the needle body to the target along the shortest path, and preventing the needle tip from damaging important tissues and organs.
Disclosure of Invention
In view of the above, the present application is directed to providing a four-dimensional ultrasound guided puncture imaging method and apparatus that overcomes or at least partially solves the above problems, comprising:
an imaging method of four-dimensional ultrasonic guided puncture is applied to assist medical staff in performing clinical guided puncture by acquiring real-time ultrasonic volume data to generate a real-time three-dimensional ultrasonic image; wherein the ultrasound volume data is a set of temporally continuous multiple two-dimensional ultrasound image data;
the method comprises the following steps:
determining integral space information in the ultrasonic volume data, and generating three-dimensional rendering space information according to the integral space information;
determining puncture needle information in the ultrasonic volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information;
determining target tissue area information in the ultrasonic volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information;
and generating the three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
Further, the step of determining overall spatial information in the ultrasound volume data and generating three-dimensional rendering spatial information according to the overall spatial information includes:
setting global gray scale information in the ultrasonic volume data as the overall space information;
and generating the three-dimensional rendering spatial information according to the color value and the opacity of each voxel point in the overall spatial information.
Further, the step of determining puncture needle information in the ultrasound volume data and generating three-dimensional rendering puncture needle information according to the puncture needle information includes:
determining a first position coordinate corresponding to a voxel point of which the gray scale information is greater than a preset threshold T in the ultrasonic volume data;
generating a spherical coordinate system corresponding to the ultrasonic volume data according to three-dimensional Hough transform;
determining a spherical coordinate parameter which is larger than a preset threshold th in the spherical coordinate system, and determining a second position coordinate of a voxel point of the ultrasonic volume data corresponding to the spherical coordinate parameter which is larger than the preset threshold th;
determining the puncture needle information according to the first position coordinate and the second position coordinate;
performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information, and performing gray scale inhibition rendering on the rest voxel points in the ultrasonic volume data;
and generating the three-dimensional rendering puncture needle information according to the voxel points corresponding to the puncture needle information after the gray scale enhancement rendering and the rest voxel points in the ultrasonic volume data after the gray scale suppression rendering.
Further, the step of performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information and performing gray scale suppression rendering on the remaining voxel points in the ultrasound volume data includes:
performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information according to the color values and the opacities of the voxel points corresponding to the puncture needle information;
and performing gray scale suppression rendering on the rest voxel points in the ultrasonic volume data according to the color values and the opacities of the rest voxel points in the ultrasonic volume data.
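A minimal sketch of this gray-scale enhancement and suppression rendering, assuming a simple linear gray-to-opacity mapping (the function name, the normalization, and the boost/suppress factors are illustrative assumptions, not specified by the patent):

```python
import numpy as np

def render_opacity(volume, needle_mask, boost=2.0, suppress=0.2):
    """Assign per-voxel opacity: boosted inside the needle mask and
    suppressed elsewhere, as one way to realize the gray-scale
    enhancement/suppression rendering described above.
    The linear mapping and the factors are assumptions."""
    base = volume.astype(float) / volume.max()  # normalize gray to [0, 1]
    # Enhance voxels belonging to the needle, attenuate the rest.
    alpha = np.where(needle_mask, base * boost, base * suppress)
    return np.clip(alpha, 0.0, 1.0)
```

The same mechanism applied with different masks and factors would serve for the blood-vessel and target-tissue enhancement described later in the document.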
Further, the step of determining target tissue region information in the ultrasound volume data and generating three-dimensional rendering target tissue region information according to the target tissue region information includes:
eliminating voxel points corresponding to the puncture needle information in the ultrasonic volume data;
acquiring a target tissue type, and determining gray scale information and texture characteristics of the target tissue region according to the target tissue type;
determining the position information of a blood vessel area, a target tissue area and a noise area according to the gray scale information and the texture characteristics of the target tissue area;
performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel region and the target tissue region, and performing gray scale inhibition rendering on the voxel points corresponding to the noise region;
and generating the three-dimensional rendering target tissue area information according to the voxel points corresponding to the blood vessel area and the target tissue area after gray scale enhancement rendering and the voxel points corresponding to the noise area after gray scale suppression rendering.
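The region partition above can be sketched with a crude gray-level rule (the thresholds, and the use of gray level alone without the texture features the patent also relies on, are assumptions):

```python
import numpy as np

def classify_regions(volume, vessel_max=30, tissue_min=30, tissue_max=180):
    """Partition voxels into vessel, target-tissue and noise regions.

    A crude stand-in for the gray-scale + texture analysis described
    above: anechoic (dark) voxels are treated as vessel/cavity, a
    mid-gray band as target tissue, and everything else as noise.
    All thresholds are illustrative.
    Returns three boolean masks over the volume."""
    vessel = volume < vessel_max
    tissue = (volume >= tissue_min) & (volume <= tissue_max)
    noise = ~(vessel | tissue)
    return vessel, tissue, noise
```

In practice the patent's approach selects the gray-scale and texture criteria according to the acquired target tissue type rather than fixed bands.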
Further, the step of performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel region and the target tissue region and performing gray scale suppression rendering on the voxel points corresponding to the noise region includes:
performing gray scale enhancement rendering on voxel points corresponding to the blood vessel area and the target tissue area according to the color values and the opacities of the voxel points corresponding to the blood vessel area and the target tissue area;
and performing gray scale inhibition rendering on the voxel point corresponding to the noise area according to the color value and the opacity of the voxel point corresponding to the noise area.
Further, the step of generating the three-dimensional ultrasound image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information, and the three-dimensional rendering target tissue region information includes:
performing weighted fusion of the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information to generate the three-dimensional ultrasonic image; wherein the fusion weight of the three-dimensional rendering puncture needle information is greater than that of either the three-dimensional rendering space information or the three-dimensional rendering target tissue area information.
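The weighted fusion above can be sketched as a pixel-wise blend of the three rendered images; the concrete weight values here are assumptions, the patent only requires that the needle weight dominate:

```python
import numpy as np

def fuse_renderings(space_img, needle_img, tissue_img,
                    w_space=0.2, w_needle=0.5, w_tissue=0.3):
    """Pixel-wise weighted fusion of the three rendered images.

    The needle weight is kept larger than either of the other two,
    matching the requirement that the needle layer dominate the
    fusion. The weight values themselves are illustrative."""
    assert w_needle > w_space and w_needle > w_tissue
    total = w_space + w_needle + w_tissue
    return (w_space * space_img + w_needle * needle_img
            + w_tissue * tissue_img) / total
```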
An imaging device for four-dimensional ultrasound-guided puncture is applied to assist medical staff in performing clinical guided puncture by acquiring real-time ultrasound volume data to generate a real-time three-dimensional ultrasound image; wherein the ultrasound volume data is a set of temporally continuous multiple two-dimensional ultrasound image data;
the method specifically comprises the following steps:
the three-dimensional rendering space information generating module is used for determining the whole space information in the ultrasonic volume data and generating three-dimensional rendering space information according to the whole space information;
the three-dimensional rendering puncture needle information generating module is used for determining puncture needle information in the ultrasonic volume data and generating three-dimensional rendering puncture needle information according to the puncture needle information;
the three-dimensional rendering target tissue area information generating module is used for determining target tissue area information in the ultrasonic volume data and generating three-dimensional rendering target tissue area information according to the target tissue area information;
and the three-dimensional ultrasonic image generation module is used for generating the three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
A computer device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the method of imaging of four-dimensional ultrasound guided puncture as described above.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of imaging of a four-dimensional ultrasound-guided puncture as described above.
The application has the following advantages:
in the embodiment of the application, the three-dimensional rendering space information is generated by determining the whole space information in the ultrasonic volume data and according to the whole space information; determining puncture needle information in the ultrasonic volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information; determining target tissue area information in the ultrasonic volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information; and generating the three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information. Through analyzing and processing ultrasonic volume data, puncture needle information and target tissue area information can be automatically identified, puncture needle information and target tissue area are respectively enhanced and rendered, then global space information is rendered, and then intermediate results after rendering are fused to obtain a clear real-time three-dimensional ultrasonic image for ultrasonic guided puncture, so that the three-dimensional ultrasonic image can clearly and visually display the space position of the puncture needle and the space anatomical structure of a target object and surrounding tissues, and the space position of the puncture needle relative to the target tissue can be dynamically tracked and positioned without depending on other auxiliary positioning technologies, thereby shortening the puncture path.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings needed to be used in the description of the present application will be briefly introduced below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive labor.
Fig. 1 is a flowchart illustrating steps of a four-dimensional ultrasound-guided puncture imaging method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a volume rendering algorithm model of an imaging method for four-dimensional ultrasound-guided puncture according to an embodiment of the present application;
fig. 3 is a block diagram of an imaging apparatus for four-dimensional ultrasound-guided puncture according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It should be noted that, in any embodiment of the present invention, before performing the method steps disclosed in the present invention, the method further includes a series of steps: acquiring front-end data of an ultrasonic probe; and carrying out three-dimensional reconstruction calculation processing on the acquired front-end data, and carrying out three-dimensional post-processing on the front-end data after the three-dimensional reconstruction calculation processing so as to obtain the ultrasonic volume data in the method.
It should be noted that, in any embodiment of the present invention, the specific process of ultrasound-guided puncture may be: the medical staff places the volume probe at the examination part of the target to be punctured and preliminarily sets the puncture position and the puncture direction of the needle point; after obtaining a complete front end data containing puncture needle information, carrying out three-dimensional reconstruction calculation and three-dimensional post-processing on the volume data; and then the processed ultrasonic volume data is subjected to the steps of the method of the invention to finally obtain the three-dimensional ultrasonic image. The medical staff can monitor the anatomical structure of the target tissue area and the dynamic path of the puncture needle in real time according to the visual three-dimensional ultrasonic image, safely and effectively guide the needle point to reach the appointed target, and avoid or weaken unnecessary injury to the patient.
Referring to fig. 1, an imaging method of four-dimensional ultrasound-guided puncture provided by an embodiment of the present application is illustrated, which is applied to assist a medical care provider in performing clinical-guided puncture by acquiring real-time ultrasound volume data to generate a real-time three-dimensional ultrasound image; wherein the ultrasound volume data is a set of temporally continuous multiple two-dimensional ultrasound image data;
the method comprises the following steps:
s110, determining integral space information in the ultrasonic volume data, and generating three-dimensional rendering space information according to the integral space information;
s120, determining puncture needle information in the ultrasonic volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information;
s130, determining target tissue area information in the ultrasonic volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information;
s140, generating the three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
In the embodiment of the application, the three-dimensional rendering space information is generated by determining the whole space information in the ultrasonic volume data and according to the whole space information; determining puncture needle information in the ultrasonic volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information; determining target tissue area information in the ultrasonic volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information; and generating the three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information. Through analyzing and processing ultrasonic volume data, puncture needle information and target tissue area information can be automatically identified, puncture needle information and target tissue area are respectively enhanced and rendered, then global space information is rendered, and then intermediate results after rendering are fused to obtain a clear real-time three-dimensional ultrasonic image for ultrasonic guided puncture, so that the three-dimensional ultrasonic image can clearly and visually display the space position of the puncture needle and the space anatomical structure of a target object and surrounding tissues, and the space position of the puncture needle relative to the target tissue can be dynamically tracked and positioned without depending on other auxiliary positioning technologies, thereby shortening the puncture path.
Next, an imaging method of the four-dimensional ultrasound-guided puncture in the present exemplary embodiment will be further described.
As described in step S110, overall spatial information in the ultrasound volume data is determined, and three-dimensional rendering spatial information is generated according to the overall spatial information.
In an embodiment of the present invention, a specific process of "determining the whole spatial information in the ultrasound volume data and generating the three-dimensional rendering spatial information according to the whole spatial information" in step S110 may be further described with reference to the following description.
Setting global gray scale information in the ultrasonic volume data as the overall space information;
and generating the three-dimensional rendering spatial information according to the color value and the opacity of each voxel point in the overall spatial information.
It should be noted that the above steps mainly perform volume rendering calculation on the global gray scale information in the ultrasonic volume data to obtain the three-dimensional rendering space information. The algorithms involved in the rendering calculation include, but are not limited to, the ray casting algorithm, the ray marching algorithm, and the ray tracing algorithm.
Referring to fig. 2, as an example, a volume rendering calculation process performed in generating the three-dimensional rendering spatial information according to the color value and opacity of each voxel point in the overall spatial information is as follows:
setting the spatial position of an observation plane and emitting a ray from a pixel point of the observation plane, as shown in fig. 2; the ray is projected into the volume data space, and all n voxel points it passes through (a voxel point is the volume-data counterpart of a pixel point) are composited, each voxel point being characterized by a color value (color) and an opacity (alpha).
assuming that the ray passes through the i-th voxel point, the fused computational expression for opacity is as follows:

α_out(i) = α_out(i−1) + α(i) · (1 − α_out(i−1))    (1)

where α(i) denotes the opacity of the i-th voxel point; α_out(i−1) denotes the fused opacity of all voxel points from the 1st point to the (i−1)-th point; and α_out(i) denotes the fused opacity of all voxel points from the 1st point to the i-th point.
The computational expression for rendering color is as follows:

C_out(i) = C_out(i−1) + C(i) · α(i) · (1 − α_out(i−1))    (2)

where C(i) denotes the color value of the i-th voxel point; C_out(i−1) denotes the accumulated color of all voxel points from the 1st point to the (i−1)-th point; and C_out(i) denotes the accumulated color of all voxel points from the 1st point to the i-th point.
In this way, the n voxel points along the ray are visited sequentially from near to far (or from far to near) in the ray projection direction, and the compositing calculation combines their information to obtain the color value of the pixel point on the observation plane.
The above process is repeatedly performed on the voxel points in the global gray scale information in the ultrasound volume data, so as to obtain the color value of each corresponding pixel point on the observation plane, and therefore, the image information of the observation plane is the three-dimensional rendering result of the global gray scale information of the ultrasound volume data, that is, the three-dimensional rendering space information.
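The per-ray compositing of equations (1) and (2) can be sketched as follows (the function name and the early-termination cutoff are implementation choices, not from the patent):

```python
def composite_ray(colors, alphas):
    """Front-to-back compositing along one ray, per equations (1) and (2).

    colors, alphas: per-voxel color values and opacities, ordered from
    the observation plane into the volume.
    Returns the accumulated color and opacity for the ray's pixel."""
    c_acc, a_acc = 0.0, 0.0
    for c, a in zip(colors, alphas):
        # Each voxel contributes in proportion to the remaining transparency.
        c_acc += c * a * (1.0 - a_acc)
        a_acc += a * (1.0 - a_acc)
        if a_acc >= 0.999:  # early ray termination once nearly opaque
            break
    return c_acc, a_acc
```

Running this once per observation-plane pixel, over the rays cast through the volume, yields the three-dimensional rendering space information described above.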
In step S120, puncture needle information in the ultrasound volume data is determined, and three-dimensional rendering puncture needle information is generated according to the puncture needle information.
It should be noted that a typical puncture needle has a simple structural morphology: its acoustic signature is a strong echo signal, and its shape is a straight line. The target tissue region has a more complex morphology and may contain cysts, tumors, blood vessels, nerves and other structures; cavities such as cysts and blood vessels produce weak echo signals and appear as hollow structures. Therefore, the puncture needle information and the target tissue region information can be identified in the ultrasonic volume data through the differences in echo intensity and structural morphology between them.
Wherein, the determination of the puncture needle information can be based on a gray scale information value judgment method; or according to a three-dimensional Hough transform straight line detection method; it may also be a decision classification method combined with a maximum search.
In an embodiment of the present invention, a specific process of "determining the puncture needle information in the ultrasound volume data and generating three-dimensional rendering puncture needle information according to the puncture needle information" in step S120 may be further described with reference to the following description.
Determining a first position coordinate corresponding to a voxel point of which the gray scale information is greater than a preset threshold T in the ultrasonic volume data;
it should be noted that, because the puncture needle represents a straight line with a higher gray value in the ultrasound volume data, the identification of the spatial position of the puncture needle can be performed by feature extraction, wherein the puncture needle feature extraction method can be a threshold T which can be set for medical staff, an initial value of the threshold T is generally set as an average value of the first 20 maximum values of the gray scale values of the voxel points in the ultrasound volume data, and if the puncture needle information displayed visually is not obvious, the medical staff can manually adjust the value of the threshold T through an operation interface provided by the device.
The position coordinates of the voxel points whose gray-scale information is greater than the threshold T are taken as candidate puncture needle information, namely the first position coordinates. However, the puncture needle information obtained in this way has a large position ambiguity error, so the first position coordinates serve only as a coarse extraction step in the puncture needle determination process.
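The coarse extraction step can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the toy volume, and the `top_k` parameter are assumptions.

```python
import numpy as np

def coarse_needle_extraction(volume, top_k=20):
    """Coarse extraction of candidate needle voxels (illustrative)."""
    # Initial threshold T: average of the top_k largest gray values.
    flat = np.sort(volume.ravel())
    T = flat[-top_k:].mean()
    # First position coordinates: voxels whose gray value exceeds T.
    first_coords = np.argwhere(volume > T)
    return T, first_coords

# Toy volume: dim background plus a bright diagonal "needle" whose
# brightness increases along its length.
rng = np.random.default_rng(0)
vol = rng.random((32, 32, 32)) * 0.2
for i in range(32):
    vol[i, i, i] = 0.9 + 0.003 * i
T, coords = coarse_needle_extraction(vol)
print(round(float(T), 4), len(coords))  # 0.9645 10
```

Only the brightest part of the toy needle survives the threshold, which is exactly why the patent treats this as a coarse step to be refined later.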
Generating a spherical coordinate system corresponding to the ultrasonic volume data according to three-dimensional Hough transform;
As an example, the calculation procedure for determining the spherical coordinate system is as follows:
A three-dimensional rectangular coordinate system space is set for the ultrasound volume data. Taking a point p as an example, p is a voxel point in the rectangular coordinate system whose spatial position is (x, y, z), and the corresponding spherical coordinate position in the parameter space is (r, θ, φ), where r represents the distance from the coordinate origin to point p, θ represents the angle between the z-axis and the line from the coordinate origin to point p, and φ represents the angle between the x-axis and the projection onto the xy-plane of the line from the coordinate origin to point p. The equation expression mapping a straight line of the three-dimensional rectangular coordinate system to the spherical coordinate system is:
r = x·sin θ·cos φ + y·sin θ·sin φ + z·cos θ    (3)
The voxel points in the ultrasonic volume data are represented as corresponding trigonometric function curves in the spherical coordinate system, and the straight lines in the ultrasonic volume data are represented as points in the spherical coordinate system.
The spherical coordinate parameter space count N(r, θ, φ) is initialized to zero, where (r, θ, φ) is the position parameter of a straight line of the ultrasonic volume data in the spherical coordinate system, and N(r, θ, φ) is used to count the number of voxel points in the ultrasonic volume data corresponding to the spherical coordinate parameter (r, θ, φ).
For each voxel point (x, y, z) among all voxel points in the ultrasonic volume data, the spherical coordinate parameters (r, θ, φ) satisfying formula (3) are screened out in the spherical coordinate parameter space and recorded as N(r, θ, φ) = N(r, θ, φ) + 1; in this way the counts N(r, θ, φ) for formula (3) over the whole spherical coordinate parameter space are accumulated.
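The voting procedure above can be sketched with a coarse discretization of the (r, θ, φ) parameter space. The grid sizes, accumulator layout, and test points below are illustrative assumptions, not part of the patent:

```python
import numpy as np

def hough_vote_3d(points, n_r=64, n_theta=18, n_phi=36, r_max=64.0):
    """Accumulate votes N(r, theta, phi) using the mapping of
    formula (3): r = x*sin(t)*cos(p) + y*sin(t)*sin(p) + z*cos(t)."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    phis = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
    acc = np.zeros((n_r, n_theta, n_phi), dtype=np.int64)
    for x, y, z in points:
        for it, t in enumerate(thetas):
            for ip, p in enumerate(phis):
                r = (x * np.sin(t) * np.cos(p)
                     + y * np.sin(t) * np.sin(p)
                     + z * np.cos(t))
                ir = int(round(r / r_max * (n_r - 1)))
                if 0 <= ir < n_r:
                    acc[ir, it, ip] += 1  # N(r, theta, phi) += 1
    return acc

# Ten voxel points on the diagonal x = y = z vote coherently: the
# best-supported parameter cell collects one vote per point.
pts = [(i, i, i) for i in range(10)]
acc = hough_vote_3d(pts)
print(int(acc.max()))  # 10
```

A needle-like run of collinear voxels thus concentrates votes in a few parameter cells, which the subsequent threshold th can pick out.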
Determining the spherical coordinate parameters whose counts are greater than a preset threshold th in the spherical coordinate system, and determining the second position coordinates of the voxel points of the ultrasonic volume data corresponding to those spherical coordinate parameters;
it should be noted that, in order to filter out straight lines that interfere with the puncture needle information, the local maximum of the accumulated counts N(r, θ, φ) can be calculated and set as the threshold th; the spherical coordinate parameters satisfying N(r, θ, φ) > th are then screened out, and the coordinate position of each voxel point corresponding to this subset of spherical coordinate parameters is searched in the three-dimensional rectangular coordinate system space of the ultrasonic volume data, namely the second position coordinates.
Determining the puncture needle information according to the first position coordinate and the second position coordinate;
it should be noted that the first position coordinates and the second position coordinates are intersected, screening out the voxel points recorded in both sets; the coordinate positions of these voxel points in the three-dimensional rectangular coordinate system space of the ultrasonic volume data are obtained and set as the puncture needle information. Because the intersection combines position information obtained by two different methods, the accuracy of the obtained puncture needle information is improved.
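The intersection step can be sketched as a simple set operation on voxel coordinates; the sample coordinate lists are illustrative:

```python
def intersect_coords(first_coords, second_coords):
    """Keep only voxel coordinates present in both the coarse (first)
    and Hough-based (second) candidate sets."""
    common = set(map(tuple, first_coords)) & set(map(tuple, second_coords))
    return sorted(common)

first = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]
second = [(4, 5, 6), (7, 8, 9), (0, 0, 0)]
print(intersect_coords(first, second))  # [(4, 5, 6), (7, 8, 9)]
```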
Performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information, and performing gray scale suppression rendering on the remaining voxel points in the ultrasonic volume data;
it should be noted that the above steps mainly perform a volume rendering calculation over the global gray-scale information of the ultrasonic volume data in which the puncture needle information has been directionally gray-scale enhanced; that is, the three-dimensional rendering puncture needle information is obtained from the set of voxel points corresponding to the puncture needle information after gray scale enhancement rendering together with the remaining voxel points of the ultrasonic volume data after gray scale suppression rendering. The rendering calculation involves algorithms including, but not limited to, the ray casting algorithm, the ray marching algorithm, and the ray tracing algorithm.
In an embodiment of the present invention, a specific process of performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information and performing gray scale suppression rendering on the remaining voxel points in the ultrasound volume data may be further described with reference to the following description.
Performing gray scale enhancement rendering on the voxel point corresponding to the puncture needle information according to the color value and the opacity of the voxel point corresponding to the puncture needle information;
and performing gray scale suppression rendering on the rest voxel points in the ultrasonic volume data according to the color values and the opacities of the rest voxel points in the ultrasonic volume data.
It should be noted that the processes adopted for gray scale enhancement rendering of the voxel points corresponding to the puncture needle information and gray scale suppression rendering of the remaining voxel points in the ultrasonic volume data are substantially similar to those in the example given in the foregoing embodiment for generating the three-dimensional rendering spatial information; the calculation formulas adopted are formula (1) and formula (2), and for the relevant calculation details, refer to the example in that embodiment.
And generating the three-dimensional rendering puncture needle information according to the voxel points corresponding to the puncture needle information after the gray scale enhancement rendering and the rest voxel points in the ultrasonic volume data after the gray scale suppression rendering.
It should be noted that gray scale enhancement rendering of the voxel points corresponding to the puncture needle information improves the display recognizability of the needle in the ultrasonic volume data, while gray scale suppression rendering of the remaining voxel points reduces the recognizability of everything other than the needle, further highlighting its display. By enhancing and suppressing different voxel points, the puncture needle is displayed more distinctly in the three-dimensional ultrasound image output after subsequent data fusion, and its display precision in that image is also improved.
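The enhancement/suppression idea can be sketched with standard front-to-back compositing along one ray. The patent's own formulas (1) and (2) are referenced but not reproduced in this section, so the compositing form and the gain factor below are assumptions:

```python
def composite_ray(samples, gain=1.0):
    """samples: list of (color, opacity) along one ray, front to back;
    gain > 1 enhances a structure, gain < 1 suppresses it."""
    C, A = 0.0, 0.0
    for c, a in samples:
        a = min(1.0, a * gain)
        C += (1.0 - A) * a * c   # accumulate color front to back
        A += (1.0 - A) * a       # accumulate opacity
        if A >= 0.999:           # early ray termination
            break
    return C, A

needle = [(0.9, 0.5), (0.95, 0.5)]
enhanced, _ = composite_ray(needle, gain=1.6)
suppressed, _ = composite_ray(needle, gain=0.4)
print(round(enhanced, 3), round(suppressed, 3))  # 0.872 0.332
```

The same samples composited with a larger gain yield a brighter result, which is the mechanism behind making the needle stand out against suppressed surroundings.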
Determining target tissue region information in the ultrasound volume data, and generating three-dimensional rendering target tissue region information according to the target tissue region information, as described in the step S130.
In an embodiment of the present invention, the specific process of "determining the target tissue region information in the ultrasound volume data and generating the three-dimensional rendering target tissue region information according to the target tissue region information" in step S130 can be further described with reference to the following description.
Eliminating voxel points corresponding to the puncture needle information in the ultrasonic volume data;
in addition, as described in the embodiment for generating the three-dimensional rendering puncture needle information, the morphological structure of the puncture needle differs from the structural features of the target tissue region; therefore, when generating the three-dimensional rendering target tissue region information, the voxel points corresponding to the puncture needle information are first removed from the ultrasonic volume data, and the remaining ultrasonic volume data is then analyzed, which improves the efficiency of the subsequent analysis process. The remaining ultrasonic volume data may be analyzed by identifying each tissue in it and enhancing the structural features of each tissue respectively.
Specifically, the feature identification and classification method can be a simple feature extraction based on gray-scale distribution, in which a settable threshold is provided to the medical personnel and tissues are classified according to that threshold; it can also be a three-dimensional edge detection method, a machine learning classification method, or the like.
Acquiring a target tissue type, and determining gray scale information and texture characteristics of the target tissue region according to the target tissue type;
it should be noted that, because the blood vessel lumen is a weak echo signal while the vessel wall is a strong echo, the vessel appears morphologically as a cavity structure; suspected lesion tissue is generally represented as a dense sheet-shaped high-echo region. The gray-scale information and texture features of the lesion region and the blood vessel region therefore differ markedly.
As an example, in a clinical diagnosis of liver-guided puncture, a target tissue region will contain a liver blood vessel and a suspected pathological liver tissue, and the blood vessel, the suspected pathological liver tissue and a noise region can be easily identified in the ultrasound volume data according to the difference between gray scale information and texture features.
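A toy rule-based classifier illustrates how gray-scale and texture statistics could separate the three region types. The mean-gray/variance features and all thresholds here are illustrative assumptions, not values from the patent:

```python
def classify_region(mean_gray, variance):
    """Classify a segmented tissue block from simple statistics."""
    if mean_gray < 0.2:
        return "vessel"   # weak echo, cavity-like structure
    if mean_gray > 0.7 and variance < 0.05:
        return "lesion"   # dense, sheet-shaped high-echo region
    return "noise"

print(classify_region(0.10, 0.30))  # vessel
print(classify_region(0.80, 0.02))  # lesion
print(classify_region(0.50, 0.20))  # noise
```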
Determining the position information of a blood vessel area, a target tissue area and a noise area according to the gray scale information and the texture characteristics of the target tissue area;
as an example, the determining process may specifically be:
roughly identifying the approximate outlines and spatial positions of the blood vessel tissues and the suspected lesion tissues in a three-dimensional image segmentation mode according to the ultrasonic volume data from which the puncture needle information is removed;
then, the spatial positions are further refined using a machine learning classification method (for example, the K-Nearest Neighbor (KNN) algorithm, a simple classifier which assigns a voxel point to the class held by the majority of its K nearest neighboring voxel points in the ultrasonic volume data). The specific calculation steps are as follows:
generating a three-dimensional edge detection operator, carrying out three-dimensional filtering processing on the ultrasonic volume data, and marking edge information;
connecting the edge voxels to obtain a three-dimensional connected surface, and dividing the volume into a plurality of tissue block regions by combining the gray-scale information of the edge voxels;
performing a KNN decision on the voxel points of each tissue block region: the distance between two voxel points is calculated to judge whether they are similar and to avoid matching errors, where the distance is calculated as:
d = √((x₁ − x₂)² + (y₁ − y₂)² + (z₁ − z₂)²)    (4)
In the formula, (x₁, y₁, z₁) and (x₂, y₂, z₂) are the spatial coordinates of the two voxel points in the ultrasonic volume data.
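The KNN decision over the Euclidean distance of formula (4) can be sketched as follows; the labelled voxels and the k value are illustrative:

```python
import math
from collections import Counter

def euclidean(p, q):
    """Formula (4): distance between two voxel points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def knn_label(point, labelled_voxels, k=3):
    """Assign the majority label of the k nearest labelled voxels."""
    nearest = sorted(labelled_voxels,
                     key=lambda lv: euclidean(point, lv[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

voxels = [((0, 0, 0), "vessel"), ((1, 0, 0), "vessel"),
          ((0, 1, 0), "vessel"), ((9, 9, 9), "lesion"),
          ((9, 8, 9), "lesion")]
print(euclidean((0, 0, 0), (3, 4, 0)))  # 5.0
print(knn_label((1, 1, 0), voxels))     # vessel
```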
Performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel region and the target tissue region, and performing gray scale suppression rendering on the voxel points corresponding to the noise region, as described in the following steps;
in an embodiment of the present invention, a specific process of performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel region and the target tissue region and performing gray scale suppression rendering on the voxel points corresponding to the noise region may be further described in conjunction with the following description.
Performing gray scale enhancement rendering on voxel points corresponding to the blood vessel area and the target tissue area according to color values and opacities of the voxel points corresponding to the blood vessel area and the target tissue area;
it should be noted that, in the process of performing enhanced volume rendering on the various tissues, when a projected ray passes through a voxel point in the volume data, the structural characteristics of that voxel are judged first: if it belongs to blood vessel tissue, blood vessel enhancement is applied; if it belongs to suspected lesion tissue, lesion tissue enhancement is applied; and if it belongs to a noise region, noise suppression is applied.
Among them, since the blood vessel has a low gray-scale value and appears as a dark region, the brightness of the vessel can be increased to highlight its shape and position. Specifically, in the volume rendering calculation, the color value of any voxel point in the vessel is replaced by the maximum value (generally 1 or 255) minus the actual color value of that voxel point, and the rendering calculation expression for the color value is then:
C_i = C_{i-1} + (1 − A_{i-1})·α_i·(c_max − c_i)    (5)
In the formula, c_i denotes the color value of the i-th voxel point and c_max its maximum value (generally 1 or 255); α_i and A_{i-1} denote the opacity of the i-th voxel point and the opacity accumulated over the first (i−1) points, as in formulas (1) and (2); C_{i-1} represents the rendering calculation result of the color values of all voxel points from the 1st point to the (i−1)-th point; C_i represents the rendering calculation result of the color values of all voxel points from the 1st point to the i-th point.
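A minimal sketch of the inverted-color compositing for vessel voxels follows. The opacity-accumulation form is taken from standard front-to-back compositing (which the patent cites as formulas (1) and (2)), so treat it as an assumption:

```python
def vessel_composite(samples, c_max=1.0):
    """Front-to-back compositing where each vessel color is replaced
    by (c_max - c_i), brightening dark vessel lumens."""
    C, A = 0.0, 0.0
    for c, alpha in samples:
        C += (1.0 - A) * alpha * (c_max - c)  # inverted color value
        A += (1.0 - A) * alpha                # accumulated opacity
    return C

# Dark lumen samples (low gray values) become bright after inversion.
dark_lumen = [(0.1, 0.5), (0.05, 0.5)]
print(round(vessel_composite(dark_lumen), 4))  # 0.6875
```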
It should be noted that the process adopted for gray scale enhancement rendering of the voxel points corresponding to the target tissue region is substantially similar to that in the example given in the foregoing embodiment for generating the three-dimensional rendering spatial information; the calculation formulas adopted are formula (1) and formula (2), and for the relevant calculation details, refer to the example in that embodiment.
And performing gray scale suppression rendering on the voxel point corresponding to the noise area according to the color value and the opacity of the voxel point corresponding to the noise area.
It should be noted that the process adopted for gray scale suppression rendering of the voxel points corresponding to the noise region is substantially similar to that in the example given in the foregoing embodiment for generating the three-dimensional rendering spatial information; the calculation formulas adopted are formula (1) and formula (2), and for the relevant calculation details, refer to the example in that embodiment.
Generating the three-dimensional rendering target tissue area information according to the voxel points corresponding to the blood vessel area and the target tissue area after gray scale enhancement rendering and the voxel points corresponding to the noise area after gray scale suppression rendering, as described in the following steps.
It should be noted that gray scale enhancement rendering of the voxel points corresponding to the blood vessel region and the target tissue region improves their display recognizability in the ultrasonic volume data, while gray scale suppression rendering of the voxel points corresponding to the noise region reduces the impact of noise in the ultrasonic volume data and improves the display accuracy of the synthesized three-dimensional ultrasound image. By excluding the voxel points corresponding to the puncture needle information from this rendering process, the display recognizability of the blood vessel region and the target tissue region is improved without affecting that of the puncture needle, and the content of the synthesized three-dimensional ultrasound image is preserved to the maximum extent.
In step S140, the three-dimensional ultrasound image is generated according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information, and the three-dimensional rendering target tissue region information.
In an embodiment of the present invention, a specific process of "generating the three-dimensional ultrasound image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information, and the three-dimensional rendering target tissue region information" in step S140 may be further described with reference to the following description.
Performing weighted fusion on the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue region information to generate a three-dimensional ultrasonic image; and the fusion weight of the three-dimensional rendering puncture needle information is greater than any one of the three-dimensional rendering space information and the three-dimensional rendering target tissue region information.
It should be noted that the three-dimensional rendering result of the overall spatial information (i.e., the three-dimensional rendering space information), the needle-enhanced three-dimensional rendering result (i.e., the three-dimensional rendering puncture needle information), and the target-tissue-enhanced three-dimensional rendering result (i.e., the three-dimensional rendering target tissue region information) are fused. The fused rendering of the three anatomical structures contains both the needle enhancement information and the enhancement information of each structural feature of the target tissue region, yielding a three-dimensional ultrasound image that also conveys the relative spatial position relationship between the puncture needle and the target tissue region.
The fusion processing method can set the fusion weights of the three intermediate rendering results according to the ray-casting distance: the closer to the observation plane, the larger the fusion weight values of the puncture needle and the target tissue region; conversely, the farther from the observation plane, the smaller the fusion weight values. In the volume rendering calculation, the three weights may also be set according to the gray-scale information or the structural features of the voxel points, and the weight corresponding to the puncture needle may be made greater than the other weights in order to highlight the display of the needle.
For example: the opacity and color values calculated by ray casting in S110, the opacity and color values calculated by ray casting in S120, and the opacity and color values calculated by ray casting in S130 are weighted and fused, where the weight-setting rule takes into account both the three-dimensional stereoscopic effect and the image resolution of the puncture needle and the target tissue region.
As an example, the weight w₁ of the color values of the puncture needle information obtained in S120 and the weight w₂ of the color values of the target tissue region information obtained in S130 may be set close to each other, while the weight of the overall spatial information obtained in S110 is w₃; in this way the puncture needle and the target tissue information are taken into account at the same time, and the spatial three-dimensional information is also retained. When setting the weights w₁ and w₂, the difference between their values should not be too large; otherwise, the image resolution of the information with the larger weight value is strengthened while the image resolution of the information with the smaller weight value is weakened, which affects the detail display recognition of the synthesized three-dimensional ultrasound image.
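The weighted fusion can be sketched per pixel of the three intermediate renderings. The specific weight values below are illustrative assumptions; the patent only requires the needle weight to be the largest of the three:

```python
def fuse_pixel(space, needle, tissue, w=(0.25, 0.40, 0.35)):
    """Weighted fusion of the three rendering results for one pixel;
    the needle weight w[1] must be the largest to highlight the needle."""
    w_space, w_needle, w_tissue = w
    assert w_needle > w_space and w_needle > w_tissue
    return w_space * space + w_needle * needle + w_tissue * tissue

# One pixel: dim spatial context, bright needle, mid-bright tissue.
print(round(fuse_pixel(0.2, 0.9, 0.6), 4))  # 0.62
```

Note that w₁ (needle) and w₂ (tissue) are kept close to each other here, matching the guidance that their difference should not be too large.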
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
Referring to fig. 3, an imaging apparatus for four-dimensional ultrasound-guided puncture provided by an embodiment of the present application is illustrated, and the apparatus is applied to assist a medical care provider in performing clinical-guided puncture by acquiring real-time ultrasound volume data to generate a real-time three-dimensional ultrasound image; wherein the ultrasound volume data is a set of temporally continuous multiple two-dimensional ultrasound image data;
the method specifically comprises the following steps:
a three-dimensional rendering space information generating module 310, configured to determine overall space information in the ultrasound volume data, and generate three-dimensional rendering space information according to the overall space information;
a three-dimensional rendering puncture needle information generating module 320, configured to determine puncture needle information in the ultrasound volume data, and generate three-dimensional rendering puncture needle information according to the puncture needle information;
a three-dimensional rendering target tissue region information generating module 330, configured to determine target tissue region information in the ultrasound volume data, and generate three-dimensional rendering target tissue region information according to the target tissue region information;
the three-dimensional ultrasound image generating module 340 is configured to generate the three-dimensional ultrasound image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information, and the three-dimensional rendering target tissue region information.
In an embodiment of the present invention, the three-dimensional rendering space information generating module 310 includes:
the whole space information setting submodule is used for setting the whole gray scale information in the ultrasonic volume data as the whole space information;
and the three-dimensional rendering space information generating submodule is used for generating the three-dimensional rendering space information according to the color value and the opacity of each voxel point in the whole space information.
In an embodiment of the present invention, the three-dimensional rendering puncture needle information generating module 320 includes:
the first position coordinate determination submodule is used for determining a first position coordinate corresponding to a voxel point of which the gray scale information is greater than a preset threshold value T in the ultrasonic volume data;
the spherical coordinate system generation submodule is used for generating a spherical coordinate system corresponding to the ultrasonic volume data according to three-dimensional Hough transformation;
the second position coordinate determining submodule is used for determining the spherical coordinate parameters whose counts are greater than a preset threshold th in the spherical coordinate system, and determining the second position coordinates of the voxel points of the ultrasonic volume data corresponding to those spherical coordinate parameters;
the puncture needle information determining submodule is used for determining puncture needle information according to the first position coordinate and the second position coordinate;
the first voxel point rendering submodule is used for performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information and performing gray scale suppression rendering on the remaining voxel points in the ultrasonic volume data;
and the three-dimensional rendering puncture needle information generation submodule is used for generating the three-dimensional rendering puncture needle information according to the voxel points corresponding to the puncture needle information after the gray scale enhancement rendering and the rest voxel points in the ultrasonic volume data after the gray scale suppression rendering.
In an embodiment of the present invention, the first voxel point rendering sub-module includes:
the first gray scale enhancement rendering submodule is used for carrying out gray scale enhancement rendering on the voxel point corresponding to the puncture needle information according to the color value and the opacity of the voxel point corresponding to the puncture needle information;
and the first gray scale suppression rendering submodule is used for performing gray scale suppression rendering on the remaining voxel points in the ultrasonic volume data according to the color values and opacities of the remaining voxel points in the ultrasonic volume data.
In an embodiment of the present invention, the three-dimensional rendering target tissue region information generating module 330 includes:
the puncture needle information removing sub-module is used for removing voxel points corresponding to the puncture needle information in the ultrasonic volume data;
the gray scale information and textural feature determination submodule is used for acquiring a target tissue type and determining the gray scale information and textural features of the target tissue area according to the target tissue type;
the position information determining submodule is used for determining the position information of the blood vessel region, the target tissue region, and the noise region according to the gray scale information and the texture features of the target tissue region;
a second voxel point rendering submodule for performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel region and the target tissue region and performing gray scale suppression rendering on the voxel points corresponding to the noise region;
and the three-dimensional rendering target tissue area information generating submodule is used for generating the three-dimensional rendering target tissue area information according to the voxel points corresponding to the blood vessel area and the target tissue area after gray scale enhancement rendering and the voxel points corresponding to the noise area after gray scale suppression rendering.
In an embodiment of the present invention, the second voxel rendering sub-module includes:
the second gray scale enhancement rendering submodule is used for performing gray scale enhancement rendering on voxel points corresponding to the blood vessel area and the target tissue area according to the color values and the opacities of the voxel points corresponding to the blood vessel area and the target tissue area;
and the second gray scale inhibition rendering submodule is used for performing gray scale inhibition rendering on the voxel point corresponding to the noise area according to the color value and the opacity of the voxel point corresponding to the noise area.
In an embodiment of the present invention, the three-dimensional ultrasound image generation module 340 includes:
the three-dimensional ultrasonic image generation submodule is used for performing weighted fusion on the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information and generating the three-dimensional ultrasonic image; and the fusion weight of the three-dimensional rendering puncture needle information is greater than any one of the three-dimensional rendering space information and the three-dimensional rendering target tissue region information.
Referring to fig. 4, a computer device of an imaging method for four-dimensional ultrasound-guided puncture according to the present invention is shown, which may specifically include the following:
the computer device 12 described above is embodied in the form of a general purpose computing device, and the components of the computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (commonly referred to as a "hard disk drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The memory may include at least one program product having a set (e.g., at least one) of program modules 42, with the program modules 42 configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules 42, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, a camera, etc.), with one or more devices that enable a healthcare worker to interact with computer device 12, and/or with any devices (e.g., a network card, a modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network (e.g., the Internet), via network adapter 20. As shown, the network adapter 20 communicates with the other modules of the computer device 12 over the bus 18. It should be appreciated that although not shown in FIG. 4, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units 16, external disk drive arrays, RAID systems, tape drives, and data backup storage systems 34, etc.
The processing unit 16 executes programs stored in the system memory 28 to perform various functional applications and data processing, such as implementing a four-dimensional ultrasound-guided puncture imaging method provided by an embodiment of the present invention.
That is, the processing unit 16, when executing the program, implements: determining overall spatial information in the ultrasound volume data, and generating three-dimensional rendering space information according to the overall spatial information; determining puncture needle information in the ultrasound volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information; determining target tissue area information in the ultrasound volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information; and generating the three-dimensional ultrasound image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
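The four-step pipeline above lends itself to a direct volume-rendering sketch. The Python fragment below is illustrative only and is not the patented implementation: the linear transfer function, the `gain` parameter, and the front-to-back compositing loop are simplifying assumptions standing in for the patent's per-component color/opacity rendering.

```python
import numpy as np

def transfer_function(volume, gain=1.0):
    """Map each voxel's grey level to a colour value and an opacity.
    A linear ramp is used here purely for illustration; real scanners
    use tuned lookup tables."""
    grey = volume.astype(np.float32) / volume.max()
    opacity = np.clip(grey * gain, 0.0, 1.0)
    return grey, opacity  # greyscale colour, per-voxel opacity

def composite(colour, opacity):
    """Front-to-back alpha compositing along axis 0 (the viewing ray)."""
    acc_c = np.zeros(colour.shape[1:], dtype=np.float32)
    acc_a = np.zeros(colour.shape[1:], dtype=np.float32)
    for c, a in zip(colour, opacity):
        acc_c += (1.0 - acc_a) * a * c
        acc_a += (1.0 - acc_a) * a
    return acc_c

# Render one scene component from a toy 8x8x8 volume.
volume = np.random.default_rng(0).integers(0, 256, (8, 8, 8))
colour, opacity = transfer_function(volume, gain=0.3)
image = composite(colour, opacity)
```

In the method described here, such a renderer would be applied three times (whole space, needle, target tissue) before the components are fused into one image.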
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the imaging method of four-dimensional ultrasound-guided puncture as provided in all embodiments of the present application:
That is, the program, when executed by the processor, implements: determining overall spatial information in the ultrasound volume data, and generating three-dimensional rendering space information according to the overall spatial information; determining puncture needle information in the ultrasound volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information; determining target tissue area information in the ultrasound volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information; and generating the three-dimensional ultrasound image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
Any combination of one or more computer-readable media may be employed. The computer-readable medium may be a computer-readable storage medium or a computer-readable signal medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the healthcare worker's computer, partly on the healthcare worker's computer, as a stand-alone software package, partly on the healthcare worker's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the healthcare worker's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another identical element in a process, method, article, or terminal apparatus that comprises the element.
The imaging method and device for four-dimensional ultrasound-guided puncture provided by the present application have been introduced in detail above. Specific examples are used herein to explain the principle and implementation of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (7)

1. An imaging method for four-dimensional ultrasound-guided puncture, characterized in that the method is applied to generate a real-time visualized three-dimensional ultrasound image by acquiring real-time ultrasound volume data, so as to assist medical staff in performing clinical guided puncture; wherein the ultrasound volume data is a set of multiple temporally continuous two-dimensional ultrasound image data;
the method comprises the following steps:
determining overall spatial information in the ultrasound volume data, and generating three-dimensional rendering space information according to the overall spatial information; specifically, setting global gray scale information in the ultrasound volume data as the overall spatial information; and generating the three-dimensional rendering space information according to the color value and the opacity of each voxel point in the overall spatial information;
determining puncture needle information in the ultrasound volume data, and generating three-dimensional rendering puncture needle information according to the puncture needle information; specifically, determining first position coordinates of voxel points whose gray scale information is greater than a preset threshold T in the ultrasound volume data; generating a spherical coordinate system corresponding to the ultrasound volume data according to a three-dimensional Hough transform; determining spherical coordinate parameters greater than a preset threshold th in the spherical coordinate system, and determining second position coordinates of the voxel points of the ultrasound volume data corresponding to the spherical coordinate parameters greater than the preset threshold th; determining the puncture needle information according to the first position coordinates and the second position coordinates; performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information, and performing gray scale suppression rendering on the remaining voxel points in the ultrasound volume data; and generating the three-dimensional rendering puncture needle information according to the voxel points corresponding to the puncture needle information after gray scale enhancement rendering and the remaining voxel points in the ultrasound volume data after gray scale suppression rendering;
determining target tissue area information in the ultrasound volume data, and generating three-dimensional rendering target tissue area information according to the target tissue area information; specifically, removing the voxel points corresponding to the puncture needle information from the ultrasound volume data; acquiring a target tissue type, and determining gray scale information and texture characteristics of the target tissue area according to the target tissue type; determining position information of a blood vessel area, a target tissue area and a noise area according to the gray scale information and the texture characteristics of the target tissue area; performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel area and the target tissue area, and performing gray scale suppression rendering on the voxel points corresponding to the noise area; and generating the three-dimensional rendering target tissue area information according to the voxel points corresponding to the blood vessel area and the target tissue area after gray scale enhancement rendering and the voxel points corresponding to the noise area after gray scale suppression rendering;
and generating the three-dimensional ultrasound image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
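As a non-normative illustration of the needle-localisation steps recited in claim 1 (a brightness threshold T, a line detector, and intersection of the two candidate sets), the sketch below substitutes a PCA straight-line fit for the claimed three-dimensional Hough transform with its spherical-coordinate accumulator; the thresholds, the `axis_tol` parameter, and the toy volume are all assumptions introduced for the example.

```python
import numpy as np

def detect_needle(volume, T=200, axis_tol=2.0):
    """Needle localisation sketch. Claim 1 intersects bright voxels
    (grey level > T) with voxels selected by a 3D Hough transform
    (spherical-coordinate accumulator > th); here the Hough step is
    replaced by a PCA straight-line fit to the bright-voxel cloud,
    which recovers the same dominant linear structure."""
    pts = np.argwhere(volume > T).astype(np.float32)  # first candidate set
    if len(pts) < 2:
        return np.empty((0, 3))
    centre = pts.mean(axis=0)
    # Principal axis of the candidate cloud approximates the needle axis.
    _, _, vt = np.linalg.svd(pts - centre, full_matrices=False)
    axis = vt[0]
    # Keep only candidates within axis_tol voxels of the fitted axis,
    # discarding bright speckle far from the needle.
    offs = pts - centre
    along = offs @ axis
    dist = np.linalg.norm(offs - np.outer(along, axis), axis=1)
    return pts[dist < axis_tol]

# Toy volume: a bright diagonal "needle" embedded in dim speckle.
rng = np.random.default_rng(1)
vol = rng.integers(0, 100, (32, 32, 32))
for i in range(32):
    vol[i, i, i] = 255
needle_voxels = detect_needle(vol, T=200)
```

The recovered voxel set would then receive gray scale enhancement rendering, with the rest of the volume suppressed, as the claim describes.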
2. The method of claim 1, wherein the step of performing a gray scale enhancement rendering of the voxel points corresponding to the puncture needle information and a gray scale suppression rendering of the remaining voxel points in the ultrasound volume data comprises:
performing gray scale enhancement rendering on the voxel point corresponding to the puncture needle information according to the color value and the opacity of the voxel point corresponding to the puncture needle information;
and performing gray scale suppression rendering on the remaining voxel points in the ultrasound volume data according to the color values and the opacities of the remaining voxel points in the ultrasound volume data.
3. The method of claim 1, wherein the step of performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel area and the target tissue area and performing gray scale suppression rendering on the voxel points corresponding to the noise area comprises:
performing gray scale enhancement rendering on voxel points corresponding to the blood vessel area and the target tissue area according to the color values and the opacities of the voxel points corresponding to the blood vessel area and the target tissue area;
and performing gray scale suppression rendering on the voxel points corresponding to the noise area according to the color value and the opacity of the voxel points corresponding to the noise area.
4. The method of claim 1, wherein the step of generating the three-dimensional ultrasound image based on the three-dimensional rendering space information, the three-dimensional rendering puncture needle information, and the three-dimensional rendering target tissue region information comprises:
performing weighted fusion on the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information to generate the three-dimensional ultrasound image; wherein the fusion weight of the three-dimensional rendering puncture needle information is greater than the fusion weight of either the three-dimensional rendering space information or the three-dimensional rendering target tissue area information.
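The weighted fusion of claim 4 can be sketched in a few lines; the specific weights below are hypothetical and only honour the claim's constraint that the needle weight exceed each of the other two.

```python
import numpy as np

def fuse_renderings(space_img, needle_img, tissue_img,
                    w_space=0.25, w_needle=0.5, w_tissue=0.25):
    """Weighted fusion of the three rendered components. The needle
    weight must exceed the other two so the needle stays conspicuous
    in the fused image; these particular values are illustrative."""
    assert w_needle > w_space and w_needle > w_tissue
    total = w_space + w_needle + w_tissue
    return (w_space * space_img + w_needle * needle_img
            + w_tissue * tissue_img) / total

# Toy 2x2 component images with constant intensities.
space = np.full((2, 2), 0.2)
needle = np.full((2, 2), 1.0)
tissue = np.full((2, 2), 0.4)
fused = fuse_renderings(space, needle, tissue)
```

With these weights the bright needle component dominates the fused pixel values, which is the visual effect the claim is after.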
5. An imaging device for four-dimensional ultrasound-guided puncture, characterized in that the device is applied to generate a real-time visualized three-dimensional ultrasound image by acquiring real-time ultrasound volume data, so as to assist medical staff in performing clinical guided puncture; wherein the ultrasound volume data is a set of multiple temporally continuous two-dimensional ultrasound image data;
the method specifically comprises the following steps:
the three-dimensional rendering space information generating module is used for determining the whole space information in the ultrasonic volume data and generating three-dimensional rendering space information according to the whole space information; specifically, global gray scale information in the ultrasound volume data is set as the global spatial information; generating the three-dimensional rendering spatial information according to the color value and the opacity of each voxel point in the overall spatial information;
the three-dimensional rendering puncture needle information generating module is used for determining puncture needle information in the ultrasonic volume data and generating three-dimensional rendering puncture needle information according to the puncture needle information; specifically, determining a first position coordinate corresponding to a voxel point of which the gray scale information is greater than a preset threshold T in the ultrasonic volume data; generating a spherical coordinate system corresponding to the ultrasonic volume data according to three-dimensional Hough transform; determining a spherical coordinate parameter which is larger than a preset threshold th in the spherical coordinate system, and determining a second position coordinate of a voxel point of the ultrasonic volume data corresponding to the spherical coordinate parameter which is larger than the preset threshold th; determining the puncture needle information according to the first position coordinate and the second position coordinate; performing gray scale enhancement rendering on the voxel points corresponding to the puncture needle information, and performing gray scale inhibition rendering on the rest voxel points in the ultrasonic volume data; generating three-dimensional rendering puncture needle information according to voxel points corresponding to the puncture needle information after gray scale enhancement rendering and other voxel points in the ultrasonic volume data after gray scale suppression rendering;
the three-dimensional rendering target tissue area information generating module is used for determining target tissue area information in the ultrasonic volume data and generating three-dimensional rendering target tissue area information according to the target tissue area information; specifically, voxel points corresponding to the puncture needle information in the ultrasonic volume data are removed; acquiring a target tissue type, and determining gray scale information and texture characteristics of the target tissue region according to the target tissue type; determining the position information of a blood vessel area, a target tissue area and a noise area according to the gray scale information and the texture characteristics of the target tissue area; performing gray scale enhancement rendering on the voxel points corresponding to the blood vessel region and the target tissue region, and performing gray scale inhibition rendering on the voxel points corresponding to the noise region; generating the three-dimensional rendering target tissue area information according to the voxel points corresponding to the blood vessel area and the target tissue area after gray scale enhancement rendering and the voxel points corresponding to the noise area after gray scale suppression rendering;
and the three-dimensional ultrasonic image generation module is used for generating the three-dimensional ultrasonic image according to the three-dimensional rendering space information, the three-dimensional rendering puncture needle information and the three-dimensional rendering target tissue area information.
6. A computer device comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the method of any one of claims 1 to 4.
7. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 4.
CN202010935904.9A 2020-09-08 2020-09-08 Imaging method and device for four-dimensional ultrasonic guided puncture Active CN112137693B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010935904.9A CN112137693B (en) 2020-09-08 2020-09-08 Imaging method and device for four-dimensional ultrasonic guided puncture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010935904.9A CN112137693B (en) 2020-09-08 2020-09-08 Imaging method and device for four-dimensional ultrasonic guided puncture

Publications (2)

Publication Number Publication Date
CN112137693A CN112137693A (en) 2020-12-29
CN112137693B true CN112137693B (en) 2023-01-03

Family

ID=73889988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010935904.9A Active CN112137693B (en) 2020-09-08 2020-09-08 Imaging method and device for four-dimensional ultrasonic guided puncture

Country Status (1)

Country Link
CN (1) CN112137693B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114968952B (en) * 2022-05-11 2023-06-16 沈阳东软智能医疗科技研究院有限公司 Medical image data compression method, rendering method, device and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103889337A (en) * 2012-10-23 2014-06-25 株式会社东芝 Ultrasonic diagnostic device and ultrasonic diagnostic device control method
CN104434273A (en) * 2014-12-16 2015-03-25 深圳市开立科技有限公司 Enhanced display method, device and system of puncture needle
CN106236133A (en) * 2015-06-12 2016-12-21 三星麦迪森株式会社 For the method and apparatus showing ultrasonoscopy
CN110087553A (en) * 2017-05-24 2019-08-02 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic device and its three-dimensional ultrasound pattern display methods
CN111281423A (en) * 2018-12-07 2020-06-16 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic image optimization method and ultrasonic imaging equipment


Also Published As

Publication number Publication date
CN112137693A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
US9495794B2 (en) Three-dimensional image display apparatus, method, and program
US10347033B2 (en) Three-dimensional image display apparatus, method, and program
CN107111875B (en) Feedback for multi-modal auto-registration
CN107809955B (en) Real-time collimation and ROI-filter localization in X-ray imaging via automatic detection of landmarks of interest
Wei et al. Segmentation of lung lobes in high-resolution isotropic CT images
JP2015047506A (en) Method and apparatus for registering medical images
US20190192229A1 (en) System and method for guiding invasive medical treatment procedures based upon enhanced contrast-mode ultrasound imaging
JP2019528881A (en) Apparatus and method for detecting interventional tools
WO2016064921A1 (en) Automatic detection of regions of interest in 3d space
CN111145160A (en) Method, device, server and medium for determining coronary artery branch where calcified area is located
WO2019104241A1 (en) System and method for guiding invasive medical treatment procedures based upon enhanced contrast-mode ultrasound imaging
Cao et al. Automated catheter detection in volumetric ultrasound
CN112137693B (en) Imaging method and device for four-dimensional ultrasonic guided puncture
CN109919953B (en) Method, system and apparatus for carotid intima-media thickness measurement
CN113012118B (en) Image processing method and image processing apparatus
US7653225B2 (en) Method and system for ground glass nodule (GGN) segmentation with shape analysis
CN112116623B (en) Image segmentation method and device
Kalaiselvi et al. Brain tumor boundary detection by edge indication map using Bi-Modal fuzzy histogram thresholding technique from MRI T2-weighted scans
Klein et al. Visual computing for medical diagnosis and treatment
Pourtaherian et al. Benchmarking of state-of-the-art needle detection algorithms in 3D ultrasound data volumes
WO2021081839A1 (en) Vrds 4d-based method for analysis of condition of patient, and related products
JP2001216517A (en) Object recognition method
Memiş et al. Fast and accurate registration of the proximal femurs in bilateral hip joint images by using the random sub-sample points
CN108366779B (en) Device and method for detecting tool
Sakellarios et al. IVUS image processing methodologies

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No.103, baguang District Service Center, No.2 BaiShaWan Road, baguang community, Kuiyong street, Dapeng New District, Shenzhen, Guangdong 518000

Applicant after: Shenzhen Lanying Medical Technology Co.,Ltd.

Address before: 518000 1st floor, building B, jingchengda Industrial Park, Keji 4th Road, Langxin community, Shiyan street, Bao'an District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN LANYUN MEDICAL IMAGE CO.,LTD.

GR01 Patent grant