CN111340861A - Prostate nuclear magnetic ultrasonic image registration fusion method - Google Patents
- Publication number
- CN111340861A (application number CN202010079085.2A)
- Authority
- CN
- China
- Prior art keywords
- image
- prostate
- contour
- nuclear magnetic
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 210000002307 prostate Anatomy 0.000 title claims abstract description 82
- 230000011218 segmentation Effects 0.000 claims abstract description 38
- 238000002604 ultrasonography Methods 0.000 claims abstract description 29
- 238000013178 mathematical model Methods 0.000 claims abstract description 17
- 238000012549 training Methods 0.000 claims abstract description 16
- 238000007637 random forest analysis Methods 0.000 claims abstract description 9
- 230000006870 function Effects 0.000 claims description 27
- 238000004422 calculation algorithm Methods 0.000 claims description 16
- 238000005457 optimization Methods 0.000 claims description 10
- 230000002068 genetic effect Effects 0.000 claims description 5
- 238000005070 sampling Methods 0.000 claims description 4
- 238000006243 chemical reaction Methods 0.000 claims description 3
- 239000011159 matrix material Substances 0.000 claims description 3
- 230000009466 transformation Effects 0.000 claims description 3
- 238000013519 translation Methods 0.000 claims description 3
- 238000002372 labelling Methods 0.000 claims description 2
- 210000004907 gland Anatomy 0.000 claims 1
- 238000010801 machine learning Methods 0.000 claims 1
- 238000003384 imaging method Methods 0.000 abstract description 5
- 238000004195 computer-aided diagnosis Methods 0.000 abstract description 2
- 238000003709 image segmentation Methods 0.000 abstract 1
- 238000000034 method Methods 0.000 description 17
- 230000004927 fusion Effects 0.000 description 6
- 206010060862 Prostate cancer Diseases 0.000 description 5
- 208000000236 Prostatic Neoplasms Diseases 0.000 description 5
- 230000008569 process Effects 0.000 description 5
- 230000008859 change Effects 0.000 description 4
- 238000003745 diagnosis Methods 0.000 description 4
- 108010072866 Prostate-Specific Antigen Proteins 0.000 description 3
- 102100038358 Prostate-specific antigen Human genes 0.000 description 3
- 238000001574 biopsy Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 239000000203 mixture Substances 0.000 description 2
- 230000005311 nuclear magnetism Effects 0.000 description 2
- 208000012661 Dyskinesia Diseases 0.000 description 1
- 208000015592 Involuntary movements Diseases 0.000 description 1
- 206010028980 Neoplasm Diseases 0.000 description 1
- 201000011510 cancer Diseases 0.000 description 1
- 230000006835 compression Effects 0.000 description 1
- 238000007906 compression Methods 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 230000003902 lesion Effects 0.000 description 1
- 238000002595 magnetic resonance imaging Methods 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000017311 musculoskeletal movement, spinal reflex action Effects 0.000 description 1
- 238000013188 needle biopsy Methods 0.000 description 1
- 208000023958 prostate neoplasm Diseases 0.000 description 1
- 210000000664 rectum Anatomy 0.000 description 1
- 230000029058 respiratory gaseous exchange Effects 0.000 description 1
- 239000000523 sample Substances 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/12—Computing arrangements based on biological models using genetic models
- G06N3/126—Evolutionary algorithms, e.g. genetic algorithms or genetic programming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30081—Prostate
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Biomedical Technology (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Genetics & Genomics (AREA)
- Physiology (AREA)
- Robotics (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a registration and fusion method for prostate nuclear magnetic and ultrasound images, belonging to the field of computer-aided diagnosis. The method first trains an active appearance model of the prostate ultrasound image for the segmentation task and establishes a random-forest-based mathematical model of a boundary driving function to realize automatic segmentation of the ultrasound image; it then extracts shape feature vectors from the contour of the preoperatively segmented nuclear magnetic image and of the automatically segmented ultrasound image, and performs feature matching and image registration. The method can adapt to the imaging parameters of different equipment manufacturers, improves image registration and puncture efficiency, and effectively solves the problem that manual segmentation and registration of ultrasound images is inefficient because of their low imaging quality.
Description
Technical Field
The invention belongs to the field of computer-aided diagnosis, and particularly relates to a prostate nuclear magnetic ultrasonic image registration fusion method, which is used for segmenting the ultrasound image and registering the ultrasound and nuclear magnetic images.
Background
Currently, the diagnosis of prostate cancer relies primarily on the prostate-specific antigen (PSA) test, followed by needle biopsy, which is clinically used as the gold standard for definitive diagnosis of prostate cancer because of the poor specificity of PSA. Transrectal-ultrasound-guided prostate biopsy is the most common clinical method for diagnosing prostate cancer owing to its real-time capability, low cost and ease of operation. However, because of the low imaging quality of transrectal ultrasound images, it is difficult to accurately locate the malignant tumor region from these images; the false negative rate of 6-core prostate biopsy under transrectal ultrasound guidance is as high as 30%. On the other hand, multi-parametric magnetic resonance imaging (mpMRI) is the currently accepted optimal imaging technique for diagnosing prostate cancer, and can accurately locate a suspicious lesion region, thereby enabling targeted puncture. To improve the detection rate of prostate cancer, magnetic resonance/transrectal ultrasound (MR/TRUS) fusion-guided targeted puncture technology has emerged, which aims to register and fuse the preoperative MR image with the real-time TRUS image so as to improve puncture precision. However, the prostate morphology in the preoperative MR is not consistent with that in the real-time TRUS, owing to inflation of the endorectal coil or intrarectal gas during MR scanning, patient breathing and involuntary movements, and compression of the prostate gland by the ultrasound probe in the rectum. To compensate for these shape changes, deformable registration of the MR/TRUS images is typically required.
Because of the low signal-to-noise ratio of ultrasound images and the complex gray-scale relationship between MR and TRUS images, it is difficult to locate and register corresponding structural features on the MR/TRUS images from gray-scale features alone, so registration fusion of prostate MR/TRUS is usually based on deformable registration of the segmented prostate surfaces. Existing registration methods mainly comprise gray-scale-based methods and segmentation-surface-based methods. The former focus on similarity measures for structural regions under the complex gray-scale changes of MR/TRUS images; the latter segment the MR/TRUS images and register them through specific marker points or directly on the whole prostate surface. However, since automatic segmentation of the prostate is a very challenging task, existing fusion methods are mainly based on manual or semi-automatic segmentation of the prostate MR/TRUS images and on rigid or non-rigid registration using manually placed or special physiological feature points.
To fully exploit the advantages of prostate MR/TRUS images in tumor diagnosis and puncture guidance, the preoperative MR image of the patient is manually segmented and the puncture region is marked; during real-time puncture guidance with the prostate TRUS image, the preoperatively labeled MR information is registered to the TRUS image. That is, image fusion and puncture guidance adopt a registration from the prostate region manually segmented on the preoperative MR image to the prostate region automatically segmented on the intraoperative TRUS image. Considering the influence of changes in prostate shape, size and texture gray scale on the segmentation of the TRUS image, an active appearance model of the prostate TRUS image is established with a hybrid method based on shape and gray-scale variation for automatic segmentation, and the corresponding contour regions on the prostate MR/TRUS images are registered and fused with a thin plate spline.
Disclosure of Invention
Aiming at the problem that manual segmentation and registration of ultrasound images is inefficient owing to their low imaging quality, the invention provides a prostate MR/TRUS image automatic segmentation method based on supervised learning, used for non-rigid registration with the preoperative nuclear magnetic image. Based on a supervised learning method, the invention estimates the prostate contour segmentation parameters of the ultrasound image by establishing a boundary driving function and automatically segments the ultrasound image by applying an active appearance model. In the registration process, a shape context operator is established based on the contour feature points of the prostate nuclear magnetic and ultrasound images, the KM algorithm is used for feature matching, and the localization error of the contour feature points is introduced as the regularization factor of the thin plate spline for image registration.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a registration and fusion method of prostate nuclear magnetic ultrasonic images comprises the following steps:
step 1: manually labeling the precise contour of the prostate region on each prostate ultrasound image in a training set of prostate ultrasound images, wherein the precise contour of the prostate region comprises a plurality of label points, and training an active appearance model of the prostate based on the label points;
step 2: performing channel alignment on the prostate ultrasound image training set of step 1 by using the ECC algorithm, and eliminating the scale error; extracting training features F = (X, V) of the prostate region, wherein X = (x_pixel, y_pixel) represents the image coordinates of a pixel point and V = (mean, std) represents the mean value and standard deviation of the 3 × 3 neighborhood centered on X; training a two-class random forest classifier of the prostate image region on these features to obtain a trained random forest model;
step 3: completing automatic segmentation of a new ultrasound image, comprising:
step 3.1: classifying a new ultrasound image by using the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image;
step 3.2: establishing a mathematical model of a boundary driving function of the pre-segmented binary image region obtained in the step 3.1, and performing parameter optimization on the mathematical model of the boundary driving function by using a genetic algorithm to obtain segmentation attitude parameters of the prostate region on the ultrasonic image;
step 3.3: performing contour segmentation in the active appearance model trained in the step 1 based on the attitude parameters calculated in the step 3.2 to obtain an automatic segmentation result of the prostate contour of the ultrasonic image;
step 4: extracting the automatic segmentation result of the prostate contour of the ultrasound image obtained in step 3 and the shape context features of the contour boundary of the preoperative manually segmented nuclear magnetic image, and solving for the matched contour points with the KM algorithm to obtain the matching relationship between contour point pairs on the nuclear magnetic image and the ultrasound image;
step 5: according to the matching relationship of the contour point pairs obtained in step 4, interpolating the contour points having the matching relationship by using a thin plate spline, and finally carrying out image registration based on the thin plate spline (TPS) function fitted by the interpolation.
In a specific embodiment, in step 3.2, the boundary contour of the prostate region in the pre-segmented binary image obtained in step 3.1 is taken as the target posture of the mean shape S̄, and a boundary driving function f is established:

f = (1/n) Σ_{i=1}^{n} G_i (1)

G_i = outside_i − inside_i (2)

in the formulas: n represents the number of label points of the mean shape S̄; G_i represents the boundary gradient value in the normal direction at the i-th (i = 1, 2, …, n) label point; and outside_i and inside_i represent the mean pixel values sampled along the normal on the side facing away from the centroid and on the side facing towards the centroid, respectively.

To relate the boundary driving function f to the mean shape S̄, the shape S(t) of the mean shape S̄ at the target posture t is defined (c represents the number of contour label points, and c is preferably 40), and the conversion relationship is calculated by the following formula:

S(t) = scale · R(θ) · S̄ + (x_t, y_t)^T (3)

in the formula: R(θ) represents the affine (rotation) transformation matrix; (x_t, y_t) are the image coordinates of the contour centroid at the target posture t; S(t) represents the set of points under the posture t; θ represents the rotation angle, which is positive when rotated clockwise from the x-axis; scale represents the scaling factor; and S(k) represents the set of points in a random state k.

At any label point (x_i, y_i) under the target posture t, a line segment of length 2D is taken along the normal with the point as its centre. Let p_ij be the gray value of the pixel at distance j from the point along the normal, with image coordinates (x_ij, y_ij):

x_ij = x_i + j/√(1 + k_i²) (4)

y_ij = y_i + j·k_i/√(1 + k_i²) (5)

in the formula: k_i represents the slope of the normal direction; 2D is taken as the length of the normal segment; and j = −D, …, −1, 0, 1, …, D represents the different sampling distances. According to this mathematical model, when the target posture changes, the value of the boundary driving function f fluctuates within the range [−255, 0], and the contour iterates to the optimal solution only when f → −255, at which point the contour coincides to the maximum extent with the contour of the pre-segmented binary image of the prostate. Taking the posture parameters of the contour in a random state k as the optimization target parameters, a mathematical model with the boundary driving function f as the optimization objective is established:

(x_k*, y_k*, θ_k*, scale_k*) = arg min f(x_k, y_k, θ_k, scale_k) (6)

in the formula: the posture parameters (x_k, y_k, θ_k, scale_k) respectively represent the translation, rotation and scale factors in a random state k; the initial values x_p, y_p, α and s are calculated from the minimum circumscribed rectangle parameters of the pre-segmented binary image, where (x_p, y_p) are the image coordinates of the centroid of the contour of the pre-segmented binary image, w_1, h_1, w_2, h_2 respectively represent the widths and lengths of the minimum circumscribed rectangles, and α represents the rotation angle of the circumscribed rectangle, as shown in FIG. 5.
In a particular embodiment, in step 4, points p_i and q_j (i, j = 1, 2, …, n_pt) respectively represent the contour points on the preoperative manually segmented nuclear magnetic image and the contour points on the transrectal ultrasound image obtained by automatic segmentation in step 3, with corresponding shape context vectors h_i and h_j.

The feature descriptor of the shape context operator is divided into num = num_r · num_c subintervals, denoted k_rc (1 ≤ r ≤ num_r, 1 ≤ c ≤ num_c); the number of contour points falling in subspace k_rc is counted and denoted N_rc; then h = [N_{1,1}, N_{1,2}, …, N_{num_r,num_c}].

The matching cost C_ij of point p_i and point q_j satisfies:

C_ij = (1/2) Σ_{k=1}^{num} [h_i(k) − h_j(k)]² / [h_i(k) + h_j(k)] (7)

An assignment π(q_i) is defined so that the total cost H(π) of the point set satisfies:

H(π) = Σ_i C(p_i, q_π(i)) (8)

The cost matching problem is solved with the KM algorithm to obtain the contour point pair matching relationship of the prostate region on the ultrasound image and the nuclear magnetic image.
The invention has the advantages that:
The invention provides a method for automatic segmentation of prostate ultrasound images and registration fusion of nuclear magnetic and ultrasound images, which can improve the positive rate of prostate tumor puncture biopsy and reduce the number of repeat punctures, giving higher diagnostic efficiency than traditional methods.
Drawings
Fig. 1 is a flowchart of a registration and fusion method for prostate nuclear magnetic ultrasound images provided in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a two-class classifier of a trained random forest model;
FIG. 3 is a binary image pre-segmented by a random forest model;
FIG. 4 is a schematic diagram of a mathematical model of a boundary drive function of a pre-segmented binary image;
FIG. 5 is a circumscribed rectangle of the target and mean profiles under the result of the pre-segmentation of the prostate;
FIG. 6 is a diagram illustrating shape context sampling of boundary contours;
FIG. 7 shows the automatic segmentation result of a prostate ultrasound image;
FIG. 8 is a contour point matching result of a nuclear magnetic ultrasound image;
fig. 9 shows the nuclear magnetic ultrasound image registration result.
Detailed Description
The following further describes a specific embodiment of the present invention with reference to the drawings and technical solutions.
Fig. 1 is a flowchart of the registration and fusion method of prostate nuclear magnetic ultrasound images according to an embodiment of the present invention. Referring to fig. 1, the method estimates the prostate contour segmentation parameters of the ultrasound image by establishing a mathematical model of a boundary driving function based on supervised learning, and performs automatic segmentation of the ultrasound image by applying an Active Appearance Model (AAM). In the registration process, a shape context operator is established based on the contour feature points of the prostate nuclear magnetic and ultrasound images, the KM algorithm is used for feature matching, and the localization error of the contour feature points is introduced as the regularization factor of the thin plate spline for image registration.
Referring to fig. 1, in the present embodiment, the registration and fusion method for prostate nuclear magnetic ultrasound images includes the following steps:
Step 1: based on a training set of prostate ultrasound images, an ultrasound expert marks the precise contour of the prostate region on each image; the precise contour of the prostate region comprises a plurality of label points, and an Active Appearance Model (AAM) of the prostate is trained based on these label points.
Specifically, in this embodiment, the precise contour of each prostate ultrasound image in the training set of step 1 carries 40 label points, and the label points on the precise contours of the multiple prostate ultrasound images are used to train the active appearance model of the prostate (see Cootes T, Edwards G, Taylor C. Active appearance models [J]. IEEE Trans. Pattern Analysis and Machine Intelligence, 2001, 23 (6): 681-685.).
Step 2: the ECC (Enhanced Correlation Coefficient) algorithm is applied to align the channels of the prostate ultrasound image training set and eliminate the scale error. The training features F = (X, V) of the prostate region are extracted, where X = (x_pixel, y_pixel) represents the image coordinates of a pixel point and V = (mean, std) represents the mean value and standard deviation of the 3 × 3 neighborhood centered on X; a two-class random forest classifier of the prostate image region is then trained on these features to obtain the trained random forest model, as shown in fig. 2.
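The per-pixel training feature F = (X, V) described above can be sketched as follows; this is a minimal illustration only, and the edge-padding choice and function name are assumptions not taken from the patent:

```python
import numpy as np

def pixel_features(img):
    """Per-pixel training features F = (X, V): image coordinates plus
    mean/std of the 3x3 neighborhood centered on the pixel. Edge
    padding is an assumption; the patent does not specify border
    handling."""
    h, w = img.shape
    pad = np.pad(img.astype(np.float64), 1, mode="edge")
    feats = np.empty((h * w, 4))
    idx = 0
    for y in range(h):
        for x in range(w):
            win = pad[y:y + 3, x:x + 3]       # 3x3 window around (x, y)
            feats[idx] = (x, y, win.mean(), win.std())
            idx += 1
    return feats

# Tiny example: a bright 2x2 patch on a dark background.
img = np.zeros((5, 5), dtype=np.uint8)
img[2:4, 2:4] = 255
F = pixel_features(img)
```

Each row of F is one training sample (x_pixel, y_pixel, mean, std) for the random forest.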
Step 3: completing the automatic segmentation of a new ultrasound image, comprising the steps of: step 3.1: classifying the new ultrasound image with the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image; step 3.2: establishing a mathematical model of the boundary driving function for the pre-segmented binary image region obtained in step 3.1, and performing parameter optimization on it with a genetic algorithm to obtain the posture parameters of the prostate region on the ultrasound image; step 3.3: performing contour segmentation in the active appearance model trained in step 1, based on the posture parameters calculated in step 3.2.
Specifically, the method comprises the following steps: step 3.1: the new ultrasound image is classified (two-class classification) with the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image, as shown in fig. 3.
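The two-class pre-segmentation of step 3.1 can be illustrated with scikit-learn's random forest on toy stand-in features; the feature values and class structure below are fabricated purely for illustration, while the real model is trained on the features of step 2:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Toy stand-in data: "prostate" pixels bright, background dark;
# columns mirror step 2's (x, y, 3x3 mean, 3x3 std) features.
n = 400
labels = rng.integers(0, 2, n)
feats = np.column_stack([
    rng.uniform(0, 64, n),                  # x_pixel
    rng.uniform(0, 64, n),                  # y_pixel
    labels * 180 + rng.normal(0, 10, n),    # neighborhood mean
    rng.uniform(5, 15, n),                  # neighborhood std
])

rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(feats, labels)
pred = rf.predict(feats)                    # per-pixel binary pre-segmentation
```

Reshaping `pred` back to the image grid yields the pre-segmented binary image of fig. 3.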
Step 3.2: based on the pre-segmented binary image obtained in step 3.1, a mathematical model of the boundary driving function of the pre-segmented binary image region is established, and parameter optimization is performed on the mathematical model of the boundary driving function with a genetic algorithm to obtain the posture parameters of the prostate region on the ultrasound image.
In particular, the boundary contour of the prostate region in the pre-segmented binary image obtained in step 3.1 is taken as the target posture of the mean shape S̄, and a boundary driving function f is established:

f = (1/n) Σ_{i=1}^{n} G_i (1)

G_i = outside_i − inside_i (2)

in the formulas: n represents the number of label points of the mean shape S̄; G_i represents the boundary gradient value in the normal direction at the i-th (i = 1, 2, …, n) label point; and outside_i and inside_i represent the mean pixel values sampled along the normal on the side facing away from the centroid and on the side facing towards the centroid, respectively, as shown in fig. 4.

To relate the boundary driving function f to the mean shape S̄, the shape S(t) of the mean shape S̄ at the target posture t is defined, and the conversion relationship is calculated by the following formula:

S(t) = scale · R(θ) · S̄ + (x_t, y_t)^T (3)

in the formula: R(θ) represents the affine (rotation) transformation matrix; (x_t, y_t) are the image coordinates of the contour centroid at the target posture t; S(t) represents the set of points under the posture t; θ represents the rotation angle, which is positive when rotated clockwise from the x-axis; scale represents the scaling factor; and S(k) represents the set of points in a random state k.

At any label point (x_i, y_i) under the target posture t, a line segment of length 2D (2D = 20 in this example) is taken along the normal with the point as its centre. Let p_ij be the gray value of the pixel at distance j from the point along the normal, with image coordinates (x_ij, y_ij):

x_ij = x_i + j/√(1 + k_i²) (4)

y_ij = y_i + j·k_i/√(1 + k_i²) (5)

in the formula: k_i represents the slope of the normal direction; 2D is taken as the length of the normal segment; and j = −D, …, −1, 0, 1, …, D represents the different sampling distances. According to this mathematical model, when the target posture changes, the value of the boundary driving function f fluctuates within the range [−255, 0], and the contour iterates to the optimal solution only when f → −255, at which point the contour coincides to the maximum extent with the contour of the pre-segmented binary image of the prostate. Taking the posture parameters of the contour in a random state k as the optimization target parameters, a mathematical model with the boundary driving function f as the optimization objective is established:

(x_k*, y_k*, θ_k*, scale_k*) = arg min f(x_k, y_k, θ_k, scale_k) (6)
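As a concrete check on the boundary driving function f and its gradient terms G_i = outside_i − inside_i (equation (2)), the sketch below evaluates f for a contour placed exactly on the boundary of a synthetic binary disk, where f should reach −255. The averaging convention and sampling step are assumptions made for this sketch, not taken from the patent:

```python
import numpy as np

def boundary_drive(binary, pts, centroid, d=5):
    """Evaluate the boundary driving function on a binary image:
    G_i = outside_i - inside_i, with outside_i / inside_i the mean
    gray values sampled along the normal away from / towards the
    centroid, and f the mean of the G_i (assumed convention)."""
    gs = []
    for p in pts:
        n = p - centroid
        n = n / np.linalg.norm(n)                     # outward normal direction
        outside = inside = 0.0
        for j in range(1, d + 1):
            xo, yo = np.round(p + j * n).astype(int)  # away from centroid
            xi, yi = np.round(p - j * n).astype(int)  # towards centroid
            outside += binary[yo, xo]
            inside += binary[yi, xi]
        gs.append((outside - inside) / d)
    return float(np.mean(gs))

# Synthetic pre-segmented binary image: a white disk (255) on black (0).
h = w = 101
yy, xx = np.mgrid[0:h, 0:w]
disk = (((xx - 50) ** 2 + (yy - 50) ** 2) <= 30 ** 2).astype(float) * 255.0
angles = np.linspace(0, 2 * np.pi, 40, endpoint=False)
contour = np.column_stack([50 + 30 * np.cos(angles), 50 + 30 * np.sin(angles)])
f = boundary_drive(disk, contour, np.array([50.0, 50.0]))
```

With the contour on the disk boundary, the outward samples are 0 and the inward samples are 255, so f = −255, the optimum described in the text.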
Step 3.3: based on steps 3.1 and 3.2, equation (6) is solved with a genetic algorithm, and the calculated posture parameters are substituted into the active appearance model trained in step 1 to solve for the contour (see Cootes T, Edwards G, Taylor C. Active appearance models [J]. IEEE Trans. Pattern Analysis and Machine Intelligence, 2001, 23 (6): 681-685.), obtaining the automatic segmentation result of the prostate contour of the ultrasound image shown in fig. 7, which is used for the subsequent registration.
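The genetic-algorithm optimization over the four posture parameters can be sketched generically as below; the selection, crossover and mutation operators are standard textbook choices rather than the patent's (unspecified) GA configuration, and the quadratic objective is a toy stand-in for equation (6):

```python
import numpy as np

def ga_minimize(fn, bounds, pop=40, gens=100, seed=0):
    """Generic genetic algorithm: tournament selection, blend
    crossover, Gaussian mutation, elitism. Operator choices are
    illustrative assumptions."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    P = rng.uniform(lo, hi, (pop, len(bounds)))
    best, best_f = P[0].copy(), float("inf")
    for _ in range(gens):
        fit = np.array([fn(p) for p in P])
        k = int(np.argmin(fit))
        if fit[k] < best_f:
            best, best_f = P[k].copy(), float(fit[k])
        # tournament selection of the parent pool
        i, j = rng.integers(0, pop, (2, pop))
        parents = P[np.where(fit[i] < fit[j], i, j)]
        # blend crossover with a reversed copy of the pool
        alpha = rng.uniform(size=(pop, 1))
        children = alpha * parents + (1.0 - alpha) * parents[::-1]
        # Gaussian mutation scaled to each parameter's range
        children += rng.normal(0.0, 0.02 * (hi - lo), children.shape)
        P = np.clip(children, lo, hi)
        P[0] = best  # elitism: keep the best individual found so far
    return best

# Toy stand-in for the boundary-driving objective f(x, y, theta, scale)
# with a known minimum at (10, 20, 0.3, 1.5).
target = np.array([10.0, 20.0, 0.3, 1.5])
obj = lambda p: float(np.sum((p - target) ** 2))
best = ga_minimize(obj, [(0, 50), (0, 50), (-np.pi, np.pi), (0.5, 3.0)])
```

In the method itself, `fn` would evaluate the boundary driving function for a candidate posture (x_k, y_k, θ_k, scale_k), initialized from the circumscribed-rectangle parameters.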
Step 4: the automatic segmentation result of the prostate contour of the ultrasound image obtained in step 3 and the shape context features of the contour boundary of the preoperative manually segmented nuclear magnetic (MR) image are extracted, and the KM algorithm is used to solve for the matched contour points (i.e., model feature points), obtaining the matching relationship between contour point pairs on the nuclear magnetic image and the ultrasound image that underpins the registration of the images, as shown in fig. 8.
In particular, let point p_i and point q_j (i, j = 1, 2, …, n_pt) respectively denote the contour points on the preoperative manually segmented nuclear magnetic image and the contour points on the transrectal ultrasound image obtained by automatic segmentation in step 3, with corresponding shape context vectors h_i and h_j. As shown in fig. 6, the feature descriptor of the shape context operator is divided into num = num_r · num_c subintervals (num_r = 5, num_c = 12 in this embodiment), denoted k_rc (1 ≤ r ≤ num_r, 1 ≤ c ≤ num_c); the number of contour points falling in subspace k_rc is counted and denoted N_rc; then h_j = [N_{1,1}, N_{1,2}, …, N_{num_r,num_c}].
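A log-polar shape context histogram with num_r = 5 radial and num_c = 12 angular bins, as in this embodiment, can be sketched as follows; the radial bin limits are illustrative assumptions:

```python
import numpy as np

def shape_context(points, idx, num_r=5, num_c=12):
    """Log-polar shape context histogram of contour point `idx`:
    counts of the remaining points per (radius, angle) bin. Bin
    limits are illustrative assumptions."""
    p = points[idx]
    others = np.delete(points, idx, axis=0)
    diff = others - p
    r = np.linalg.norm(diff, axis=1)
    theta = np.mod(np.arctan2(diff[:, 1], diff[:, 0]), 2.0 * np.pi)
    # log-spaced radial edges, relative to the mean distance from p
    r_edges = np.logspace(np.log10(0.125), np.log10(2.0), num_r + 1) * r.mean()
    rb = np.clip(np.searchsorted(r_edges, r) - 1, 0, num_r - 1)
    cb = np.minimum((theta / (2.0 * np.pi / num_c)).astype(int), num_c - 1)
    hist = np.zeros((num_r, num_c), dtype=int)
    for a, b in zip(rb, cb):
        hist[a, b] += 1
    return hist.ravel()   # h = [N_{1,1}, N_{1,2}, ..., N_{num_r,num_c}]

# 40 contour points on a unit circle, matching the 40 label points.
angles = np.linspace(0, 2 * np.pi, 40, endpoint=False)
pts = np.column_stack([np.cos(angles), np.sin(angles)])
h0 = shape_context(pts, 0)
```

Every other contour point falls into exactly one of the 60 bins, so the histogram entries sum to n_pt − 1.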
The matching cost C_ij of point p_i and point q_j satisfies:

C_ij = (1/2) Σ_{k=1}^{num} [h_i(k) − h_j(k)]² / [h_i(k) + h_j(k)] (7)

An assignment π(q_i) is defined so that the total cost H(π) of the point set satisfies:

H(π) = Σ_i C(p_i, q_π(i)) (8)
through the equations (7-8), a matching cost function between contour points on the nuclear magnetic image and the ultrasonic image can be established, the cost matching problem can be solved by adopting a KM algorithm, and a contour point pair matching relation of a prostate area on the ultrasonic image and the nuclear magnetic image is obtained, as shown in FIG. 8.
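The cost-matrix and assignment step can be sketched with the standard chi-square shape-context cost and SciPy's Hungarian solver (`linear_sum_assignment`), which solves the same optimal-assignment problem as the KM algorithm; the toy histograms are fabricated for illustration:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def chi2_cost(H1, H2):
    """Pairwise chi-square matching cost between two sets of shape
    context histograms (one histogram per row)."""
    a = H1[:, None, :].astype(float)
    b = H2[None, :, :].astype(float)
    denom = a + b
    denom[denom == 0] = 1.0      # empty bins contribute zero cost
    return 0.5 * (((a - b) ** 2) / denom).sum(axis=2)

# Toy histograms: the TRUS set is a shuffled copy of the MR set,
# so the optimal assignment should recover the shuffle exactly.
H_mr = np.array([[3, 0, 1], [0, 4, 0], [1, 1, 2]])
H_us = H_mr[[2, 0, 1]]
C = chi2_cost(H_mr, H_us)
rows, cols = linear_sum_assignment(C)   # Hungarian / KM assignment
```

`cols[i]` gives the ultrasound contour point matched to MR contour point i, i.e. the assignment π minimizing the total cost H(π).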
Step 5: according to the matching relationship of the contour point pairs obtained in step 4, Thin Plate Splines (TPS) are applied to interpolate the feature points having a matching relationship, and finally image registration is performed based on the TPS function fitted by the interpolation (see Bookstein F L. Principal warps: thin-plate splines and the decomposition of deformations [J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 1989, 11 (6): 567-585.); the result is shown in FIG. 9.
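Thin-plate-spline interpolation of the matched point pairs can be sketched with SciPy's `RBFInterpolator`; its `smoothing` parameter plays a role analogous to the regularization factor derived from contour-point localization error, though that mapping is our analogy rather than the patent's exact formulation, and the matched pairs below are synthetic:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Toy matched contour point pairs: MR points and TRUS counterparts
# related by a known affine map (a stand-in for real matched pairs).
rng = np.random.default_rng(1)
mr_pts = rng.uniform(0.0, 100.0, (40, 2))
A = np.array([[1.1, 0.05], [-0.05, 0.9]])
trus_pts = mr_pts @ A + np.array([3.0, -2.0])

# Thin-plate-spline fit mapping MR coordinates to TRUS coordinates;
# smoothing=0 interpolates the matched pairs exactly, while a positive
# value would relax the fit in proportion to localization uncertainty.
tps = RBFInterpolator(mr_pts, trus_pts, kernel="thin_plate_spline", smoothing=0.0)
mapped = tps(mr_pts)
```

Evaluating `tps` on a full pixel grid of the MR image would produce the dense deformation field used to warp the MR image onto the TRUS image.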
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: modifications of the technical solutions described in the embodiments or equivalent replacements of some or all technical features may be made without departing from the scope of the technical solutions of the embodiments of the present invention.
Claims (3)
1. A prostate nuclear magnetic ultrasound image registration and fusion method, characterized by comprising the following steps:
step 1: manually labeling the precise contour of the prostate region on each prostate ultrasound image in a training set of prostate ultrasound images, wherein the precise contour of the prostate region comprises a plurality of label points, and training an active appearance model of the prostate based on the label points;
step 2: performing channel alignment on the prostate ultrasound image training set of step 1 by using the ECC algorithm to eliminate scale error; extracting training features F = (X, V) of the prostate region, wherein X = (x_pixel, y_pixel) denotes the image coordinates of a pixel point and V = (mean, std) denotes the mean value and standard deviation of a neighborhood centered on X; and training a two-class classifier of the prostate image region by using a random forest machine learning model to obtain a trained random forest model;
step 3: completing automatic segmentation of a new ultrasound image, comprising:
step 3.1: classifying a new ultrasound image by using the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image;
step 3.2: establishing a mathematical model of a boundary driving function for the pre-segmented binary image region obtained in step 3.1, and performing parameter optimization on the mathematical model of the boundary driving function by using a genetic algorithm to obtain the segmentation pose parameters of the prostate region on the ultrasound image;
step 3.3: performing contour segmentation with the active appearance model trained in step 1 based on the pose parameters calculated in step 3.2 to obtain the automatic segmentation result of the prostate contour of the ultrasound image;
step 4: extracting the shape context features of the ultrasound-image prostate contour obtained in step 3 and of the contour boundary of the preoperative manually segmented nuclear magnetic image, and solving the matched contour points with the KM algorithm to obtain the matching relation between contour point pairs on the nuclear magnetic image and the ultrasound image;
step 5: according to the matching relation of the contour point pairs obtained in step 4, interpolating the contour points having the matching relation by using thin plate splines, and finally performing image registration based on the interpolation-fitted thin plate spline (TPS) function.
2. The prostate nuclear magnetic ultrasound image registration fusion method according to claim 1, characterized in that in step 3.2, the boundary contour of the prostate region of the pre-segmented binary image obtained in step 3.1 is taken as the target pose t_target of the mean shape X̄, and a boundary driving function f is established as

f = (1/n) · Σ_{i=1}^{n} G_i   (1)
G_i = outside_i − inside_i   (2)
in the formulas: n represents the number of label points of the mean shape X̄; G_i represents the boundary gradient value in the normal direction at the i-th (i = 1, 2, …, n) label point; outside_i and inside_i represent the sums of the pixel values sampled along the normal on the side away from the centroid and on the side towards the centroid, respectively;
to establish the relation between the boundary driving function f and the mean shape X̄, the shape X_t of the mean shape X̄ at the target pose t_target is defined, with c representing the number of contour label points, and the conversion relation is calculated by the following formula:

X_k = scale · [cos θ, −sin θ; sin θ, cos θ] · X̄ + [x_k, y_k]^T   (3)

in the formula: scale · [cos θ, −sin θ; sin θ, cos θ] is the affine transformation matrix; (x_t, y_t) are the image coordinates of the centroid at the target pose t_target; X_t represents the point set at pose t; θ represents the rotation angle, taken positive when rotating clockwise from the x-axis; scale represents the scaling factor; X_k represents the point set at a random state k;
at any label point (x_i, y_i) under the target pose t_target, a line segment of length 2D is taken along the normal, centered at the point; let p_ij denote the gray value of the pixel at distance j from the point along the normal, with image coordinates (x_ij, y_ij); then:

x_ij = x_i + j / √(1 + k_n²),   y_ij = y_i + j · k_n / √(1 + k_n²)   (4)

in the formula: k_n represents the slope of the normal direction, 2D is the length of the normal, and j = −D, …, −1, 0, 1, …, D represents the different sampling distances; according to this mathematical model, when the target pose t_target changes, the boundary driving function f fluctuates within the range [−255, 0], and only when f → −255 does the contour iterate to the optimal solution, at which point the contour X_t overlaps the contour of the prostate pre-segmented binary image to the maximum extent; taking the pose parameters of the contour in a random state k as the optimization target parameters, a mathematical model with the boundary driving function f as the optimization objective is established:
min f(x_k, y_k, θ_k, scale_k)

in the formula: the pose parameters (x_k, y_k, θ_k, scale_k) respectively represent the translation, rotation angle and scale factor in the random state k; x_p, y_p, α and s can be calculated from the minimum circumscribed rectangle parameters of the mean shape X̄ and of the pre-segmented binary image, wherein (x_p, y_p) are the image coordinates of the centroid of the contour of the pre-segmented binary image, w_1, h_1, w_2, h_2 respectively represent the widths and lengths of the minimum circumscribed rectangles, and α denotes the rotation angle of the circumscribed rectangle.
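A toy version of the boundary driving function of claim 2 and its pose-parameter search can be sketched as follows. The patent optimizes with a genetic algorithm; SciPy's `differential_evolution` is used here as an evolutionary stand-in, and the circular mean shape, image size, and sampling depth D are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Mean shape: unit circle sampled at n label points (stand-in for the AAM shape).
n = 32
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
mean_shape = np.stack([np.cos(t), np.sin(t)], axis=1)

# Pre-segmented binary image: a filled disk of radius 20 centered at (32, 32).
yy, xx = np.mgrid[0:64, 0:64]
binary = (((xx - 32) ** 2 + (yy - 32) ** 2) <= 20 ** 2) * 255.0

def drive(pose, D=3):
    """Boundary driving function f: mean normal gradient G_i, in [-255, 0]."""
    x, y, theta, scale = pose
    c, s = np.cos(theta), np.sin(theta)
    pts = scale * mean_shape @ np.array([[c, s], [-s, c]]) + [x, y]
    normals = (pts - [x, y]) / np.linalg.norm(pts - [x, y], axis=1, keepdims=True)
    g = 0.0
    for i in range(n):
        out = inn = 0.0
        for j in range(1, D + 1):          # sample along the normal, both sides
            p_out = np.clip(np.rint(pts[i] + j * normals[i]).astype(int), 0, 63)
            p_in = np.clip(np.rint(pts[i] - j * normals[i]).astype(int), 0, 63)
            out += binary[p_out[1], p_out[0]]
            inn += binary[p_in[1], p_in[0]]
        g += (out - inn) / D               # G_i = outside_i - inside_i
    return g / n

# Evolutionary search over pose (x_k, y_k, theta_k, scale_k); f -> -255 at the fit.
best = differential_evolution(drive,
                              bounds=[(10, 54), (10, 54), (-0.5, 0.5), (5, 30)],
                              seed=0, popsize=12, maxiter=50)
```

At the true pose (32, 32, 0, 20) every outward sample lands on background and every inward sample on foreground, so f reaches its minimum of −255, which is the convergence criterion described in claim 2.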
3. The prostate nuclear magnetic ultrasound image registration fusion method according to claim 1 or 2, characterized in that in step 4, points p_i and q_j (i, j = 1, 2, …, n_pt) respectively represent contour points on the preoperative manually segmented nuclear magnetic image and contour points on the transrectal ultrasound image obtained by the automatic segmentation in step 3, with corresponding shape context vectors h_i and h_j;
the feature descriptor of the shape context operator is divided into num = num_r · num_c sub-intervals, denoted k_rc (1 ≤ r ≤ num_r, 1 ≤ c ≤ num_c); the number of contour points in sub-interval k_rc is counted and denoted N_rc; then h_j = [N_{1,1}, N_{1,2}, …, N_{r,c}];
the matching cost C_ij of point p_i and point q_j satisfies

C_ij = C(p_i, q_j) = (1/2) · Σ_{k=1}^{num} [h_i(k) − h_j(k)]² / [h_i(k) + h_j(k)]

and a permutation π(q_i) is defined so that the total cost H(π) of the point set satisfies

H(π) = Σ_{i=1}^{n_pt} C(p_i, q_{π(i)});
and solving the cost matching problem by adopting a KM algorithm to obtain the contour point pair matching relation of the prostate area on the ultrasonic image and the nuclear magnetic image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010079085.2A CN111340861B (en) | 2020-02-03 | 2020-02-03 | Prostate nuclear magnetic ultrasonic image registration fusion method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111340861A true CN111340861A (en) | 2020-06-26 |
CN111340861B CN111340861B (en) | 2022-09-27 |
Family
ID=71183376
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010079085.2A Active CN111340861B (en) | 2020-02-03 | 2020-02-03 | Prostate nuclear magnetic ultrasonic image registration fusion method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111340861B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102178526A (en) * | 2011-05-12 | 2011-09-14 | 上海交通大学医学院附属新华医院 | Ultrasonic and nuclear magnetic resonance image fusion transluminal registration device and method |
CN110363802A (en) * | 2018-10-26 | 2019-10-22 | 西安电子科技大学 | Prostate figure registration system and method based on automatic segmentation and pelvis alignment |
Non-Patent Citations (2)
Title |
---|
NI Dong et al.: "Prostate targeted biopsy system based on MRI-TRUS fusion", Journal of Shenzhen University (Science and Engineering) * 
HUANG Jianbo et al.: "Research on prostate ultrasound image segmentation based on a feature learning framework", Biomedical Engineering and Clinical Medicine * 
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116580820A (en) * | 2023-07-13 | 2023-08-11 | 卡本(深圳)医疗器械有限公司 | Intelligent trans-perineal prostate puncture anesthesia system based on multi-mode medical image |
CN116580820B (en) * | 2023-07-13 | 2023-12-22 | 卡本(深圳)医疗器械有限公司 | Intelligent trans-perineal prostate puncture anesthesia system based on multi-mode medical image |
Also Published As
Publication number | Publication date |
---|---|
CN111340861B (en) | 2022-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8425418B2 (en) | Method of ultrasonic imaging and biopsy of the prostate | |
CN110060774B (en) | Thyroid nodule identification method based on generative confrontation network | |
US11937973B2 (en) | Systems and media for automatically diagnosing thyroid nodules | |
CN110338840B (en) | Three-dimensional imaging data display processing method and three-dimensional ultrasonic imaging method and system | |
CN106890031B (en) | Marker identification and marking point positioning method and operation navigation system | |
CN111179227B (en) | Mammary gland ultrasonic image quality evaluation method based on auxiliary diagnosis and subjective aesthetics | |
CN110464459A (en) | Intervention plan navigation system and its air navigation aid based on CT-MRI fusion | |
WO2022141882A1 (en) | Lesion recognition model construction apparatus and system based on historical pathological information | |
Shen et al. | Optimized prostate biopsy via a statistical atlas of cancer spatial distribution | |
Li et al. | Learning image context for segmentation of prostate in CT-guided radiotherapy | |
CN112215844A (en) | MRI (magnetic resonance imaging) multi-mode image segmentation method and system based on ACU-Net | |
JP2007061607A (en) | Method for processing image including one object and one or more other objects, and system for processing image from image data | |
CN111340861B (en) | Prostate nuclear magnetic ultrasonic image registration fusion method | |
Pan et al. | Application of Three-Dimensional Coding Network in Screening and Diagnosis of Cervical Precancerous Lesions | |
CN110782434A (en) | Intelligent marking and positioning device for brain tuberculosis MRI image focus | |
CN109801276A (en) | A kind of method and device calculating ambition ratio | |
Hughes et al. | Robust alignment of prostate histology slices with quantified accuracy | |
Xu et al. | ROI-based intraoperative MR-CT registration for image-guided multimode tumor ablation therapy in hepatic malignant tumors | |
Ou et al. | Non-rigid registration between histological and MR images of the prostate: A joint segmentation and registration framework | |
Liu et al. | 3-D prostate MR and TRUS images detection and segmentation for puncture biopsy | |
CN110580697B (en) | Video image processing method and system for measuring thickness of fetal nape transparency from ultrasonic video image | |
CN111127404B (en) | Medical image contour rapid extraction method | |
CN105354842A (en) | Contour key point registration and identification method based on stable area | |
Kadoury et al. | A model-based registration approach of preoperative MRI with 3D ultrasound of the liver for Interventional guidance procedures | |
Kadoury et al. | Realtime TRUS/MRI fusion targeted-biopsy for prostate cancer: a clinical demonstration of increased positive biopsy rates |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||