CN111340861A - Prostate nuclear magnetic ultrasonic image registration fusion method - Google Patents


Info

Publication number
CN111340861A
Authority
CN
China
Prior art keywords
image
prostate
contour
nuclear magnetic
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010079085.2A
Other languages
Chinese (zh)
Other versions
CN111340861B (en)
Inventor
丛明
杨德勇
杜宇
刘冬
吴童
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Dahuazhongtian Technology Co ltd
Dalian University of Technology
First Affiliated Hospital of Dalian Medical University
Original Assignee
Dalian Dahuazhongtian Technology Co ltd
Dalian University of Technology
First Affiliated Hospital of Dalian Medical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Dahuazhongtian Technology Co Ltd, Dalian University of Technology and First Affiliated Hospital of Dalian Medical University
Priority to CN202010079085.2A
Publication of CN111340861A
Application granted
Publication of CN111340861B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/004: Artificial life, i.e. computing arrangements simulating life
    • G06N 3/008: Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/12: Computing arrangements based on biological models using genetic models
    • G06N 3/126: Evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/12: Edge-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10072: Tomographic images
    • G06T 2207/10088: Magnetic resonance imaging [MRI]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30004: Biomedical image processing
    • G06T 2207/30081: Prostate

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Genetics & Genomics (AREA)
  • Physiology (AREA)
  • Robotics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a registration and fusion method for prostate nuclear magnetic and ultrasound images, belonging to the field of computer-aided diagnosis. The method first trains an active appearance model of the prostate ultrasound image for the image segmentation task and establishes a mathematical model of a boundary driving function based on a random forest, so as to segment the ultrasound image automatically; it then extracts shape feature vectors from the contours of the preoperatively segmented nuclear magnetic image and of the automatically segmented ultrasound image, and performs feature matching and image registration. The method adapts to the imaging parameters of different equipment manufacturers, improves the efficiency of image registration and puncture, and effectively addresses the low efficiency of manual segmentation and registration caused by the poor imaging quality of ultrasound images.

Description

Prostate nuclear magnetic ultrasonic image registration fusion method
Technical Field
The invention belongs to the field of computer-aided diagnosis, and particularly relates to a prostate nuclear magnetic ultrasound image registration fusion method that segments the ultrasound image and registers it with the nuclear magnetic image.
Background
Currently, the diagnosis of prostate cancer relies primarily on the prostate-specific antigen (PSA) test, followed by needle biopsy; because of the poor specificity of PSA, biopsy is clinically used as the gold standard for the definitive diagnosis of prostate cancer. Transrectal-ultrasound-guided prostate biopsy is the most common clinical method for diagnosing prostate cancer, owing to its real-time operation, low cost and ease of use. However, because of the low imaging quality of rectal ultrasound images, it is difficult to locate the malignant tumor region accurately on these images: the false negative rate of 6-core prostate biopsy under rectal ultrasound guidance is as high as 30%. On the other hand, multi-parametric magnetic resonance imaging (mpMRI) is the currently accepted optimal imaging technique for diagnosing prostate cancer and can accurately locate a suspicious lesion region, enabling targeted puncture. To improve the detection rate of prostate cancer, magnetic resonance/transrectal ultrasound (MR/TRUS) fusion-guided targeted puncture has emerged, which registers and fuses the preoperative MR image with the real-time TRUS image to improve puncture precision. However, the prostate morphology in preoperative MR is not consistent with that in real-time TRUS, owing to inflation of the endorectal coil or gas in the rectum during MR scanning, the patient's breathing and involuntary movements, and compression of the prostate gland by the ultrasound probe in the rectum. To compensate for these shape changes, a deformable registration of the MR/TRUS images is typically required.
Because of the low signal-to-noise ratio of the ultrasound image and the complex gray-scale relationship between the MR and TRUS images, it is difficult to register the corresponding structural features on the MR/TRUS images from the image gray-scale features alone, so the registration and fusion of prostate MR/TRUS is usually based on deformable registration of the segmented prostate surfaces. Existing registration methods fall mainly into gray-scale-based methods and segmentation-surface-based methods. The former focus on similarity measures for structural regions under the complex gray-scale changes of the MR/TRUS images; the latter segment the MR/TRUS images and register and fuse them through specific marker points or directly on the whole prostate surface. However, since automatic segmentation of the prostate is a very challenging task, existing fusion methods rely mainly on manual or semi-automatic segmentation of the prostate MR/TRUS images and on rigid or non-rigid registration based on manually placed or special physiological feature points.
To fully exploit the advantages of the prostate MR/TRUS images in tumor diagnosis and puncture guidance, the patient's preoperative MR image is manually segmented and the puncture region is marked; during real-time puncture guidance on the prostate TRUS image, the MR information labeled before the operation is registered to the TRUS image. That is, image fusion and puncture guidance adopt a registration from the prostate region manually segmented on the preoperative MR image to the prostate region automatically segmented on the intraoperative TRUS image. Considering the influence of changes in prostate shape and size and of texture gray scale during segmentation, an active appearance model of the prostate TRUS image is established with a hybrid method based on shape and gray-scale variation for automatic segmentation of the image, and the corresponding contour regions on the prostate MR/TRUS images are registered and fused with a thin plate spline.
Disclosure of Invention
Aiming at the low efficiency of manual segmentation and registration of ultrasound images caused by their low imaging quality, the invention provides an automatic segmentation method for prostate MR/TRUS images based on supervised learning, used for non-rigid registration with the preoperative nuclear magnetic image. Based on a supervised learning method, the invention estimates the prostate contour segmentation parameters of the ultrasound image by establishing a boundary driving function, and segments the ultrasound image automatically with an active appearance model. In the registration process, a shape context operator is established from the contour feature points of the prostate nuclear magnetic and ultrasound images, the KM algorithm is used for feature matching, and the localization error of the contour feature points is introduced as the regularization factor of a thin plate spline for image registration.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a registration and fusion method of prostate nuclear magnetic ultrasonic images comprises the following steps:
step 1: manually labeling the precise contour of the prostate region on each prostate ultrasound image in a training set of prostate ultrasound images, wherein the precise contour of the prostate region comprises a plurality of label points, and training an active appearance model of the prostate based on the label points;
step 2: performing channel alignment on the prostate ultrasound image training set of step 1 with the ECC algorithm, and eliminating the scale error; extracting training features F = (X, V) of the prostate region, wherein X = (x_pixel, y_pixel) denotes the image coordinates of a pixel point and V = (mean, std) denotes the mean and standard deviation of the 3 × 3 neighborhood centered on X; training a two-class random forest classifier of the prostate region on these features to obtain a trained random forest model;
step 3: completing the automatic segmentation of a new ultrasound image, comprising:
step 3.1: classifying the new ultrasound image with the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image;
step 3.2: establishing a mathematical model of the boundary driving function for the pre-segmented binary image region obtained in step 3.1, and performing parameter optimization on it with a genetic algorithm to obtain the segmentation pose parameters of the prostate region on the ultrasound image;
step 3.3: performing contour segmentation with the active appearance model trained in step 1, based on the pose parameters calculated in step 3.2, to obtain the automatic segmentation result of the prostate contour of the ultrasound image;
step 4: extracting the shape context features of the contour boundaries of the automatic segmentation result of the prostate contour of the ultrasound image obtained in step 3 and of the preoperative manually segmented nuclear magnetic image, and solving for the matched contour points with the KM algorithm to obtain the matching relation between contour point pairs on the nuclear magnetic and ultrasound images;
step 5: according to the matching relation of the contour point pairs obtained in step 4, interpolating the contour points having a matching relation with a thin plate spline, and finally performing image registration based on the TPS (thin plate spline) function fitted by the interpolation.
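The five steps above can be sketched as a single pipeline. In the following Python skeleton every function name and body is a hypothetical stand-in (not from the patent); only the data flow between the steps is shown:

```python
import numpy as np

# Hypothetical stand-ins for steps 1-5; none of these bodies implement
# the real algorithms, they only mirror the inputs and outputs.
def train_models(train_images, label_points):
    """Steps 1-2: active appearance model + random-forest pixel classifier
    (stand-in: just store the mean labelled contour)."""
    return {"aam_mean_shape": label_points.mean(axis=0), "forest": None}

def auto_segment(models, us_image):
    """Step 3: RF pre-segmentation, genetic pose search, AAM contour fit
    (stand-in: return the stored mean contour)."""
    return models["aam_mean_shape"]

def match_points(mr_contour, us_contour):
    """Step 4: shape-context features + KM matching
    (stand-in: identity assignment)."""
    return np.arange(len(mr_contour))

def register(mr_contour, us_contour, matches):
    """Step 5: thin-plate-spline fit on matched pairs
    (stand-in: per-point displacement vectors)."""
    return us_contour[matches] - mr_contour

# Tiny synthetic run: 3 "training images" with 8-point contours.
train = np.stack([np.full((8, 2), float(i)) for i in range(3)])
models = train_models(train, train.copy())
us_contour = auto_segment(models, np.zeros((64, 64)))
mr_contour = us_contour + 2.0
disp = register(mr_contour, us_contour, match_points(mr_contour, us_contour))
```

Each stand-in would be replaced by the corresponding real component (AAM fitting, random forest, genetic pose search, shape context with KM matching, TPS fitting).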
In a specific embodiment, in step 3.2, the boundary contour of the prostate region of the pre-segmented binary image obtained in step 3.1 is taken as the target of the mean shape \bar{x} under a target pose \hat{t}, and a boundary driving function f is established:

f = \frac{1}{n} \sum_{i=1}^{n} G_i    (1)

G_i = outside_i - inside_i    (2)

where n denotes the number of label points of the mean shape \bar{x}; G_i denotes the boundary gradient value in the normal direction at the i-th (i = 1, 2, ..., n) label point; and outside_i and inside_i denote the mean pixel values sampled on the normal away from the centroid and towards the centroid, respectively (so that on a binary image with values {0, 255} each G_i, and hence f, lies in [-255, 0] near a valid contour).
To relate the boundary driving function f to the mean shape \bar{x}, the shape x_t of the mean shape \bar{x} under the target pose \hat{t} is defined (the contour has c label points; c is preferably 40), and the transformation relation is calculated as follows:

x_t = T_t(\bar{x}) = scale \cdot \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \bar{x} + \begin{pmatrix} x_c \\ y_c \end{pmatrix}    (3)

where T_t denotes the affine transformation; (x_c, y_c) are the centroid coordinates of the shape under the target pose \hat{t}; x_t denotes the set of points under pose t; \theta denotes the rotation angle, positive for clockwise rotation from the x-axis; scale denotes the scaling factor; and x_k, defined analogously, denotes the set of points in a random state k.
For any label point (x_i^t, y_i^t) under the target pose \hat{t}, a line segment of length 2D is taken along the normal centred at the point. Let p_{ij} be the gray value of the pixel at distance j on the normal, with image coordinates

x_{ij} = x_i^t + \frac{j}{\sqrt{1 + k_i^2}}    (4)

y_{ij} = y_i^t + \frac{j \, k_i}{\sqrt{1 + k_i^2}}    (5)

where k_i denotes the slope of the normal direction, 2D is the length of the normal segment, and j = -D, ..., -1, 0, 1, ..., D denotes the sampling distance. According to this mathematical model, when the target pose \hat{t} changes, the value of the boundary driving function f fluctuates in the range [-255, 0], and only when f → -255 does the contour iterate to the optimal solution; at that point the shape x_t coincides to the maximum extent with the contour of the pre-segmented binary image of the prostate. Taking the pose parameters of the contour in a random state k as the optimization variables, a mathematical model with the boundary driving function f as the optimization objective is established:

\min_{(x_k, y_k, \theta_k, scale_k)} f(x_k, y_k, \theta_k, scale_k)    (6)

where the pose parameters (x_k, y_k, \theta_k, scale_k) denote the translation, rotation and scale factors in the random state k. Their initial values (x_p, y_p, \alpha, s) can be calculated from the minimum circumscribed rectangle parameters of the mean shape \bar{x} and of the pre-segmented binary image: (x_p, y_p) are the image coordinates of the centroid of the contour of the pre-segmented binary image; w_1, h_1 and w_2, h_2 denote the widths and heights of the two minimum circumscribed rectangles; s = \sqrt{(w_2 h_2)/(w_1 h_1)}; and \alpha denotes the rotation angle of the circumscribed rectangle, as shown in FIG. 5.
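As a rough illustration of the boundary driving function of equations (1)-(2), the sketch below evaluates f for a contour on a synthetic pre-segmented binary image. The radial normal direction, the sample count D and the averaging over the D samples are illustrative assumptions, not details taken from the patent:

```python
import numpy as np

def boundary_drive(binary_img, contour, D=10):
    """Sketch of eqs. (1)-(2): at each contour point, average D pixels
    along the normal away from the centroid (outside_i) and D pixels
    towards it (inside_i); f is the mean of G_i = outside_i - inside_i,
    so on a {0, 255} binary image f approaches -255 when the contour
    sits exactly on the edge of the bright region."""
    h, w = binary_img.shape
    centroid = contour.mean(axis=0)
    g_values = []
    for p in contour:
        n = p - centroid
        n = n / (np.linalg.norm(n) + 1e-9)   # outward unit normal (radial)
        outside, inside = [], []
        for j in range(1, D + 1):
            xo, yo = np.round(p + j * n).astype(int)
            xi, yi = np.round(p - j * n).astype(int)
            if 0 <= xo < w and 0 <= yo < h:
                outside.append(binary_img[yo, xo])
            if 0 <= xi < w and 0 <= yi < h:
                inside.append(binary_img[yi, xi])
        g_values.append(np.mean(outside) - np.mean(inside))  # G_i
    return float(np.mean(g_values))                           # f

# Synthetic pre-segmented binary image: bright disc of radius 40 at (64, 64).
yy, xx = np.mgrid[0:128, 0:128]
img = (((xx - 64) ** 2 + (yy - 64) ** 2) <= 40 ** 2) * 255.0
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
on_edge = np.column_stack([64 + 40 * np.cos(theta), 64 + 40 * np.sin(theta)])
inside_c = np.column_stack([64 + 20 * np.cos(theta), 64 + 20 * np.sin(theta)])
f_edge, f_inside = boundary_drive(img, on_edge), boundary_drive(img, inside_c)
```

A contour lying on the disc boundary drives f towards -255, while a contour strictly inside the bright region leaves f near 0, which is why minimizing f pulls the pose towards the pre-segmented boundary.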
In a particular embodiment, in step 4, let points p_i and q_j (i, j = 1, 2, ..., n_pt) denote the contour points on the preoperative manually segmented nuclear magnetic image and the contour points on the transrectal ultrasound image obtained by automatic segmentation in step 3, respectively, with corresponding shape context vectors h_i and h_j.
The feature descriptor of the shape context operator is divided into num = num_r · num_c subintervals, denoted k_rc (1 ≤ r ≤ num_r, 1 ≤ c ≤ num_c); the number of contour points falling in subspace k_rc is counted and denoted N_rc; then h_j = [N_{1,1}, N_{1,2}, ..., N_{num_r,num_c}].
The matching cost C_ij of points p_i and q_j satisfies

C_{ij} = \frac{1}{2} \sum_{k=1}^{num} \frac{\left[h_i(k) - h_j(k)\right]^2}{h_i(k) + h_j(k)}    (7)

A permutation \pi(\cdot) is sought such that the total cost H(\pi) of the point set,

H(\pi) = \sum_{i} C\left(p_i, q_{\pi(i)}\right)    (8)

is minimized. The cost matching problem is solved with the KM algorithm to obtain the contour point pair matching relation of the prostate region on the ultrasound and nuclear magnetic images.
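The KM (Kuhn-Munkres, i.e. Hungarian) matching step can be sketched with SciPy's linear_sum_assignment. The three-point histograms below are toy data, and chi2_cost follows the form of the cost in equation (7):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def chi2_cost(h_i, h_j):
    """Matching cost of eq. (7): chi-square distance between two
    shape-context histograms (empty bins contribute zero)."""
    num = (h_i - h_j) ** 2.0
    den = h_i + h_j
    terms = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
    return 0.5 * float(terms.sum())

# Toy descriptors: 3 MR contour points and the same descriptors for the
# TRUS contour, listed in shuffled order.
H_mr = np.array([[4, 0, 1], [0, 5, 2], [1, 1, 6]], dtype=float)
H_us = H_mr[[2, 0, 1]]
C = np.array([[chi2_cost(a, b) for b in H_us] for a in H_mr])

# linear_sum_assignment implements the Hungarian / Kuhn-Munkres ("KM")
# method: it finds the permutation minimising the total cost H(pi) of
# eq. (8).
rows, cols = linear_sum_assignment(C)
```

Because the TRUS descriptors are an exact shuffle of the MR descriptors, the solver recovers the shuffle with zero total cost; with real, noisy descriptors the assignment minimizes the chi-square cost instead of zeroing it.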
The invention has the following advantages:
The proposed automatic segmentation of the prostate ultrasound image and registration-fusion of the nuclear magnetic and ultrasound images can improve the positive rate of prostate tumor puncture biopsy and reduce the number of repeated punctures, offering higher diagnostic efficiency than traditional methods.
Drawings
Fig. 1 is a flowchart of a registration and fusion method for prostate nuclear magnetic ultrasound images provided in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a two-class classifier of a trained random forest model;
FIG. 3 is a binary image pre-segmented by a random forest model;
FIG. 4 is a schematic diagram of a mathematical model of a boundary drive function of a pre-segmented binary image;
FIG. 5 is a circumscribed rectangle of the target and mean profiles under the result of the pre-segmentation of the prostate;
FIG. 6 is a diagram illustrating shape context sampling of boundary contours;
FIG. 7 is a graph of the segmentation result of an ultrasound image;
FIG. 8 is a contour point matching result of a nuclear magnetic ultrasound image;
fig. 9 shows the nuclear magnetic ultrasound image registration result.
Detailed Description
The following further describes a specific embodiment of the present invention with reference to the drawings and technical solutions.
Fig. 1 is a flowchart of the registration and fusion method for prostate nuclear magnetic ultrasound images according to an embodiment of the present invention. Referring to fig. 1, the method estimates the prostate contour segmentation parameters of the ultrasound image by establishing a mathematical model of a boundary driving function based on a supervised learning method, and performs automatic segmentation of the ultrasound image with an Active Appearance Model (AAM). In the registration process, a shape context operator is established from the contour feature points of the prostate nuclear magnetic and ultrasound images, the KM algorithm is used for feature matching, and the localization error of the contour feature points is introduced as the regularization factor of a thin plate spline for image registration.
Referring to fig. 1, in the present embodiment, the registration and fusion method for prostate nuclear magnetic ultrasound images includes the following steps:
step 1: the method comprises the steps of training a training set based on prostate ultrasonic images, marking a precise outline of a prostate area by an ultrasonic image expert, wherein the precise outline of the prostate area comprises a plurality of label points, and training an Active Appearance Model (Active Appearance Model) of the prostate based on the label points.
Specifically, in this embodiment, each of the prostate ultrasound images in the training set of prostate ultrasound images in step 1 has 40 label points on its precise contour, and the label points on the precise contours of the multiple prostate ultrasound images are used to train the active appearance model of the prostate (see, in particular, Cootes T, Edwards G, Taylor C. active appearance models [ J ]. IEEE trans. Pattern Analysis and Machine interest, 2001, 23 (6): 681 685.).
Step 2: and (3) applying an ECC (Enhanced Correlation Coefficient, EEC) algorithm to align the channels of the prostate ultrasound image training set, and eliminating the scale error. Extracting training features F ═ (X, V) of the prostate region, wherein X ═ Xpixel,ypixel) Image coordinates representing pixel points, and V ═ means (std) representing mean and standard deviation of 3 × 3 neighborhood centered on XModel) as shown in fig. 2.
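A minimal sketch of the step-2 feature extraction and random-forest training on a synthetic image; the image, the ground-truth labels and the forest size are illustrative assumptions, not the patent's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def pixel_features(img):
    """Training features F = (X, V) from step 2: image coordinates of each
    pixel plus mean and standard deviation of its 3 x 3 neighbourhood."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    feats = np.empty((h * w, 4))
    k = 0
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + 3, x:x + 3]
            feats[k] = (x, y, patch.mean(), patch.std())
            k += 1
    return feats

# Synthetic "ultrasound" slice: a bright noisy disc (prostate) on darker
# noise; the ground-truth mask plays the role of the expert labelling.
rng = np.random.default_rng(42)
yy, xx = np.mgrid[0:32, 0:32]
mask = ((xx - 16) ** 2 + (yy - 16) ** 2) <= 81
img = np.where(mask, 180.0, 60.0) + rng.normal(0, 10, (32, 32))

X = pixel_features(img)
y = mask.ravel().astype(int)
clf = RandomForestClassifier(n_estimators=30, random_state=0).fit(X, y)
pre_segmented = clf.predict(X).reshape(32, 32)   # binary pre-segmentation
accuracy = float((pre_segmented == mask).mean())
```

Predicting on every pixel of a new image with the trained forest yields the pre-segmented binary image used by step 3.1.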
Step 3: complete the automatic segmentation of a new ultrasound image, comprising step 3.1 (classify the new ultrasound image with the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image), step 3.2 (establish a mathematical model of the boundary driving function for the pre-segmented binary image region obtained in step 3.1 and optimize its parameters with a genetic algorithm to obtain the pose parameters of the prostate region on the ultrasound image), and step 3.3 (perform contour segmentation with the active appearance model trained in step 1, based on the pose parameters calculated in step 3.2).
Specifically, step 3.1: classify the new ultrasound image with the random forest model trained in step 2 to realize binary segmentation and obtain a pre-segmented binary image, as shown in fig. 3.
Step 3.2: based on the pre-segmented binary image obtained in step 3.1, establish a mathematical model of the boundary driving function for the pre-segmented binary image region, and optimize its parameters with a genetic algorithm to obtain the pose parameters of the prostate region on the ultrasound image.
In particular, the boundary contour of the prostate region of the pre-segmented binary image obtained in step 3.1 is taken as the target of the mean shape \bar{x} under a target pose \hat{t}, and a boundary driving function f is established:

f = \frac{1}{n} \sum_{i=1}^{n} G_i    (1)

G_i = outside_i - inside_i    (2)

where n denotes the number of label points of the mean shape \bar{x}; G_i denotes the boundary gradient value in the normal direction at the i-th (i = 1, 2, ..., n) label point; and outside_i and inside_i denote the mean pixel values sampled on the normal away from the centroid and towards the centroid, respectively, as shown in fig. 4.
To relate the boundary driving function f to the mean shape \bar{x}, the shape x_t of the mean shape \bar{x} under the target pose \hat{t} is defined, and the transformation relation is calculated as follows:

x_t = T_t(\bar{x}) = scale \cdot \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \bar{x} + \begin{pmatrix} x_c \\ y_c \end{pmatrix}    (3)

where T_t denotes the affine transformation; (x_c, y_c) are the centroid coordinates of the shape under the target pose \hat{t}; x_t denotes the set of points under pose t; \theta denotes the rotation angle, positive for clockwise rotation from the x-axis; scale denotes the scaling factor; and x_k, defined analogously, denotes the set of points in a random state k.
For any label point (x_i^t, y_i^t) under the target pose \hat{t}, a line segment of length 2D is taken along the normal centred at the point (2D = 20 in this example). Let p_{ij} be the gray value of the pixel at distance j on the normal, with image coordinates

x_{ij} = x_i^t + \frac{j}{\sqrt{1 + k_i^2}}    (4)

y_{ij} = y_i^t + \frac{j \, k_i}{\sqrt{1 + k_i^2}}    (5)

where k_i denotes the slope of the normal direction, 2D is the length of the normal segment, and j = -D, ..., -1, 0, 1, ..., D denotes the sampling distance. According to this mathematical model, when the target pose \hat{t} changes, the value of the boundary driving function f fluctuates in the range [-255, 0], and only when f → -255 does the contour iterate to the optimal solution; at that point the shape x_t coincides to the maximum extent with the contour of the pre-segmented binary image of the prostate. Taking the pose parameters of the contour in a random state k as the optimization variables, a mathematical model with the boundary driving function f as the optimization objective is established:

\min_{(x_k, y_k, \theta_k, scale_k)} f(x_k, y_k, \theta_k, scale_k)    (6)

where the pose parameters (x_k, y_k, \theta_k, scale_k) denote the translation, rotation and scale factors in the random state k. Their initial values (x_p, y_p, \alpha, s) can be calculated from the minimum circumscribed rectangle parameters of the mean shape \bar{x} and of the pre-segmented binary image: (x_p, y_p) are the image coordinates of the centroid of the contour of the pre-segmented binary image; w_1, h_1 and w_2, h_2 denote the widths and heights of the two minimum circumscribed rectangles; s = \sqrt{(w_2 h_2)/(w_1 h_1)}; and \alpha denotes the rotation angle of the circumscribed rectangle, as shown in FIG. 5.
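The pose search of equation (6) can be sketched as follows. Since the full boundary driving function needs the trained models, a toy least-squares objective stands in for f here, and a mutation-only evolutionary search with elitism and annealed Gaussian mutation (no crossover) stands in for the patent's genetic algorithm:

```python
import numpy as np

def transform(pts, pose):
    """Pose transform in the spirit of eq. (3): rotate by theta, scale
    and translate a point set about its centroid."""
    tx, ty, theta, scale = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    ctr = pts.mean(axis=0)
    return (pts - ctr) @ (scale * R.T) + ctr + np.array([tx, ty])

# Toy objective standing in for f: squared distance between the posed
# mean shape and a target contour (the real method evaluates the boundary
# driving function on the pre-segmented binary image instead).
angles = np.linspace(0, 2 * np.pi, 40, endpoint=False)
mean_shape = np.column_stack([np.cos(angles), np.sin(angles)])
true_pose = np.array([12.0, -7.0, 0.3, 2.5])
target = transform(mean_shape, true_pose)

def objective(pose):
    return float(((transform(mean_shape, pose) - target) ** 2).sum())

# Minimal evolutionary search over the pose parameters of eq. (6).
rng = np.random.default_rng(1)
lo = np.array([-20.0, -20.0, -1.0, 0.5])
hi = np.array([20.0, 20.0, 1.0, 4.0])
pop = rng.uniform(lo, hi, (60, 4))
for gen in range(150):
    scores = np.array([objective(p) for p in pop])
    elite = pop[np.argsort(scores)[:10]]           # keep the 10 fittest
    sigma = 0.1 * (hi - lo) * 0.97 ** gen          # shrinking mutation
    children = elite[rng.integers(0, 10, 50)] + rng.normal(0.0, 1.0, (50, 4)) * sigma
    pop = np.vstack([elite, np.clip(children, lo, hi)])
best = min(pop, key=objective)
```

The search bounds here play the role of the initial values (x_p, y_p, alpha, s) derived from the minimum circumscribed rectangles, which would center and narrow the search box in the real method.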
Step 3.3: based on steps 3.1 and 3.2, equation (6) is solved with a genetic algorithm, and the calculated pose parameters are substituted into the active appearance model trained in step 1 (see Cootes T, Edwards G, Taylor C. Active appearance models [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(6): 681-685) to obtain the automatic segmentation result of the prostate contour of the ultrasound image shown in FIG. 7, which is used for the subsequent registration.
Step 4: extract the shape context features of the contour boundaries of the automatic segmentation result of the ultrasound prostate contour obtained in step 3 and of the preoperative manually segmented nuclear magnetic (MR) image, and solve for the matched contour points (i.e., model feature points) with the KM algorithm to obtain the matching relation between contour point pairs on the nuclear magnetic and ultrasound images, as shown in fig. 8.
Specifically, let points p_i and q_j (i, j = 1, 2, ..., n_pt) denote the contour points on the preoperative manually segmented nuclear magnetic image and the contour points on the transrectal ultrasound image obtained by automatic segmentation in step 3, respectively, with corresponding shape context vectors h_i and h_j. As shown in fig. 6, the feature descriptor of the shape context operator is divided into num = num_r · num_c subintervals (num_r = 5 and num_c = 12 in this embodiment), denoted k_rc (1 ≤ r ≤ num_r, 1 ≤ c ≤ num_c); the number of contour points falling in subspace k_rc is counted and denoted N_rc; then h_j = [N_{1,1}, N_{1,2}, ..., N_{num_r,num_c}].
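A sketch of building one shape-context histogram with num_r = 5 and num_c = 12; the log-polar bin edges and the mean-distance normalization are illustrative choices, not details taken from the patent:

```python
import numpy as np

def shape_context(points, i, num_r=5, num_c=12):
    """Log-polar shape-context histogram h_i of one contour point:
    counts of the remaining points in num_r radial x num_c angular
    bins (num_r = 5, num_c = 12 as in this embodiment)."""
    p = points[i]
    others = np.delete(points, i, axis=0)
    d = others - p
    r = np.linalg.norm(d, axis=1)
    r = r / r.mean()                               # scale invariance
    ang = np.mod(np.arctan2(d[:, 1], d[:, 0]), 2 * np.pi)
    r_edges = np.logspace(np.log10(0.125), np.log10(2.0), num_r + 1)
    r_bin = np.clip(np.searchsorted(r_edges, r) - 1, 0, num_r - 1)
    c_bin = np.minimum((ang / (2 * np.pi) * num_c).astype(int), num_c - 1)
    hist = np.zeros((num_r, num_c), dtype=int)
    for rb, cb in zip(r_bin, c_bin):
        hist[rb, cb] += 1
    return hist.ravel()                            # h_i = [N_11, ..., N_rc]

t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
contour = np.column_stack([np.cos(t), np.sin(t)])
h0 = shape_context(contour, 0)
```

Every one of the other 39 contour points falls into exactly one of the 60 bins, so the histogram is a 60-dimensional count vector suitable for the chi-square cost of equation (7).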
The matching cost C_ij of points p_i and q_j satisfies

C_{ij} = \frac{1}{2} \sum_{k=1}^{num} \frac{\left[h_i(k) - h_j(k)\right]^2}{h_i(k) + h_j(k)}    (7)

A permutation \pi(\cdot) is sought such that the total cost H(\pi) of the point set,

H(\pi) = \sum_{i} C\left(p_i, q_{\pi(i)}\right)    (8)

is minimized. Through equations (7)-(8), a matching cost function between the contour points on the nuclear magnetic and ultrasound images is established; the cost matching problem is solved with the KM algorithm to obtain the contour point pair matching relation of the prostate region on the ultrasound and nuclear magnetic images, as shown in FIG. 8.
Step 5: according to the matching relation of the contour point pairs obtained in step 4, thin plate splines (TPS) are applied to interpolate the feature points having a matching relation, and image registration is finally performed based on the TPS function fitted by the interpolation (see Bookstein F L. Principal warps: thin-plate splines and the decomposition of deformations [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1989, 11(6): 567-585); the result is shown in FIG. 9.
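The step-5 thin-plate-spline fit can be sketched with SciPy's RBFInterpolator, whose thin_plate_spline kernel implements the TPS radial basis. The matched point pairs below are synthetic:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Matched contour point pairs: MR contour points and their TRUS
# counterparts (synthetic: a stretched, shifted, slightly noisy copy).
rng = np.random.default_rng(3)
t = np.linspace(0, 2 * np.pi, 20, endpoint=False)
mr_pts = np.column_stack([np.cos(t), np.sin(t)]) * 50.0 + 100.0
trus_pts = (mr_pts * np.array([1.1, 0.9])
            + rng.normal(0, 1.5, mr_pts.shape)
            + np.array([8.0, -5.0]))

# Thin-plate-spline warp fitted on the matched pairs.  `smoothing` plays
# the role of the regularization factor mentioned in the text: 0 forces
# exact interpolation through every matched point, larger values trade
# exactness for smoothness (e.g. to absorb feature-localization error).
tps = RBFInterpolator(mr_pts, trus_pts,
                      kernel="thin_plate_spline", smoothing=0.0)
warped = tps(mr_pts)          # maps any MR coordinates into TRUS space
```

Evaluating the fitted interpolator on a dense grid of MR coordinates warps the whole preoperative image (and its marked puncture region) into the TRUS frame.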
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: modifications of the technical solutions described in the embodiments or equivalent replacements of some or all technical features may be made without departing from the scope of the technical solutions of the embodiments of the present invention.

Claims (3)

1. A registration and fusion method for prostate nuclear magnetic and ultrasonic images, characterized by comprising the following steps:
step 1: manually labeling the precise contour of the prostate region on each prostate ultrasonic image in a training set, the precise contour comprising a plurality of label points, and training an active appearance model of the prostate based on the label points;
step 2: performing channel alignment on the prostate ultrasonic image training set of step 1 with the ECC algorithm to eliminate scale error; extracting training features F = (X, V) of the prostate region, where X = (x_pixel, y_pixel) denotes the image coordinates of a pixel point and V = (mean, std) denotes the mean value and standard deviation of a neighborhood centered on X; training a binary classifier for the prostate image region with a random forest machine learning model to obtain a trained random forest model;
and step 3: completing automatic segmentation of a new ultrasound image, comprising:
step 3.1: classifying a new pair of ultrasonic images by using the random forest model trained in the step 2 to realize binary segmentation and obtain a pre-segmented binary image;
step 3.2: establishing a mathematical model of a boundary driving function of the pre-segmented binary image region obtained in the step 3.1, and performing parameter optimization on the mathematical model of the boundary driving function by using a genetic algorithm to obtain segmentation attitude parameters of the prostate region on the ultrasonic image;
step 3.3: performing contour segmentation in the active appearance model trained in the step 1 based on the attitude parameters calculated in the step 3.2 to obtain an automatic segmentation result of the prostate contour of the ultrasonic image;
and 4, step 4: extracting the automatic segmentation result of the prostate contour of the ultrasonic image obtained in the step 3 and the shape context characteristics of the contour boundary of the preoperative manual segmentation nuclear magnetic image, and solving matched contour points by adopting a KM algorithm to obtain the matching relation between the nuclear magnetic image and the contour point pairs on the ultrasonic image;
and 5: and (4) according to the matching relation of the contour point pairs obtained in the step (4), interpolating the contour points with the matching relation by using a thin plate spline, and finally carrying out image registration based on a TPS (thermal transfer printing) function fitted by interpolation.
2. The prostate nuclear magnetic ultrasonic image registration fusion method according to claim 1, characterized in that in step 3.2, the boundary contour of the prostate region of the pre-segmented binary image obtained in step 3.1 is taken as the target attitude T of the mean shape s_0, and a boundary driving function f is established:

f = (1/n) · Σ_{i=1}^{n} G_i  (1)

G_i = outside_i − inside_i  (2)

in the formulas: n denotes the number of label points of the mean shape s_0; G_i denotes the boundary gradient value in the normal direction at the i-th label point (i = 1, 2, …, n); outside_i and inside_i denote the sums of pixel values sampled along the normal, away from the centroid and towards the centroid, respectively;

to establish the relation between the boundary driving function f and the mean shape s_0, the shape s_T of the mean shape s_0 at the target attitude T is defined, c denoting the number of contour label points, and the conversion relation is calculated by the following formula:

s_T = M(scale, θ) · s_0 + (x_c, y_c)ᵀ  (3)

in the formula: M(scale, θ) denotes an affine transformation matrix; (x_c, y_c) denotes the centroid coordinates of the target attitude T; s_t denotes the point set under attitude t; θ denotes the rotation angle, positive when rotating clockwise from the x-axis; scale denotes the scaling factor; s_k denotes the point set in a random state k;

at any label point (x_i, y_i) under the target attitude T, a line segment of length 2D is taken along the normal, centered on the point; p_ij denotes the gray value of the pixel at distance j along the normal from the point, whose image coordinates are denoted (x_ij, y_ij); then:

outside_i = Σ_{j=1}^{D} p_ij  (4)

inside_i = Σ_{j=−D}^{−1} p_ij  (5)

in the formulas: k_i denotes the slope of the normal direction; 2D is taken as the length of the normal, with j = −D, …, −1, 0, 1, …, D denoting the different sampling distances; according to this mathematical model, when the target attitude T changes, the boundary driving function f fluctuates within the value range [−255, 0], and only when f → −255 does the contour iterate to the optimal solution, at which point the contour s_T overlaps the contour of the prostate pre-segmented binary image to the maximum extent; taking the attitude parameters of the contour in a random state k as the optimization target parameters, a mathematical model with the boundary driving function f as the optimization objective is established:

min f(x_k, y_k, θ_k, scale_k)
s.t. the attitude parameters are constrained to a neighborhood of the initial estimate (x_p, y_p, α, s)  (6)

in the formula: the attitude parameters (x_k, y_k, θ_k, scale_k) denote the translation, rotation and scale factors in the random state k; x_p, y_p, α and s are calculated from s_0 and the minimum circumscribed rectangle parameters of the pre-segmented binary image; (x_p, y_p) denotes the image coordinates of the centroid of the contour of the pre-segmented binary image; w_1, h_1, w_2, h_2 denote the widths and lengths of the minimum circumscribed rectangles; s denotes the initial scale factor estimated from them; α denotes the rotation angle of the circumscribed rectangle.
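The boundary driving function of equations (1)-(5) can be sketched as follows. This is a reconstruction under stated assumptions: the normal is sampled at integer distances j = 1…D on each side of the label point, and `outside_i`/`inside_i` are averaged over the D samples so that f has the value range [−255, 0] stated in the claim for 8-bit images (the patent's own formula images are not reproduced on this page).

```python
import numpy as np

def boundary_drive(img, pts, normals, D=5):
    """G_i = outside_i - inside_i at each label point, where outside_i and
    inside_i average the gray values sampled at distances j = 1..D
    (resp. -D..-1) along the outward unit normal; f is taken as the mean
    of the G_i (an assumption, chosen so that f -> -255 when the contour
    sits exactly on a dark-outside / bright-inside boundary)."""
    h, w = img.shape
    G = np.empty(len(pts))
    for i, ((x, y), (nx, ny)) in enumerate(zip(pts, normals)):
        out = ins = 0.0
        for j in range(1, D + 1):
            xo = np.clip(int(round(x + j * nx)), 0, w - 1)   # away from centroid
            yo = np.clip(int(round(y + j * ny)), 0, h - 1)
            xi = np.clip(int(round(x - j * nx)), 0, w - 1)   # towards centroid
            yi = np.clip(int(round(y - j * ny)), 0, h - 1)
            out += img[yo, xo]
            ins += img[yi, xi]
        G[i] = (out - ins) / D            # mean outside minus mean inside
    return G.mean(), G

# contour lying exactly on the edge of a bright disc: f should reach -255
yy, xx = np.mgrid[0:64, 0:64]
img = np.where((xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2, 255.0, 0.0)
t = np.linspace(0, 2 * np.pi, 36, endpoint=False)
pts = np.c_[32 + 20 * np.cos(t), 32 + 20 * np.sin(t)]
normals = np.c_[np.cos(t), np.sin(t)]     # outward unit normals
f, G = boundary_drive(img, pts, normals)
```

A genetic algorithm, as in step 3.2, would perturb the attitude parameters (translation, rotation, scale) of the contour and keep the candidates that drive f towards −255.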
3. The prostate nuclear magnetic ultrasonic image registration fusion method according to claim 1 or 2, characterized in that in step 4, points p_i and q_j (i, j = 1, 2, …, n_pt) denote, respectively, the contour points on the manually segmented preoperative nuclear magnetic image and the contour points on the transrectal ultrasonic image obtained by the automatic segmentation of step 3, their corresponding shape context vectors being h_i and h_j;
the feature descriptor of the shape context operator is divided into num = num_r · num_c sub-intervals, denoted k_rc (1 ≤ r ≤ num_r, 1 ≤ c ≤ num_c); the number of contour points falling in sub-interval k_rc is counted and denoted N_rc; then h_j = [N_{1,1}, N_{1,2}, …, N_{r,c}];
the matching cost C_ij between point p_i and point q_j satisfies:
C_ij = (1/2) · Σ_{k=1}^{num} [h_i(k) − h_j(k)]² / [h_i(k) + h_j(k)]  (7)
a permutation π(·) over the points q_i is defined so that the total cost H(π) of the point set satisfies:
H(π) = Σ_{i=1}^{n_pt} C(p_i, q_{π(i)})  (8)
and the cost matching problem is solved with the KM algorithm to obtain the contour point pair matching relation of the prostate regions on the ultrasonic and nuclear magnetic images.
CN202010079085.2A 2020-02-03 2020-02-03 Prostate nuclear magnetic ultrasonic image registration fusion method Active CN111340861B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010079085.2A CN111340861B (en) 2020-02-03 2020-02-03 Prostate nuclear magnetic ultrasonic image registration fusion method


Publications (2)

Publication Number Publication Date
CN111340861A (en) 2020-06-26
CN111340861B (en) 2022-09-27

Family

ID=71183376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010079085.2A Active CN111340861B (en) 2020-02-03 2020-02-03 Prostate nuclear magnetic ultrasonic image registration fusion method

Country Status (1)

Country Link
CN (1) CN111340861B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102178526A (en) * 2011-05-12 2011-09-14 上海交通大学医学院附属新华医院 Ultrasonic and nuclear magnetic resonance image fusion transluminal registration device and method
CN110363802A (en) * 2018-10-26 2019-10-22 西安电子科技大学 Prostate figure registration system and method based on automatic segmentation and pelvis alignment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
NI, Dong et al.: "MRI-ultrasound fusion based targeted prostate biopsy system", Journal of Shenzhen University Science and Engineering *
HUANG, Jianbo et al.: "Research on prostate ultrasound image segmentation based on a feature learning framework", Biomedical Engineering and Clinical Medicine *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116580820A (en) * 2023-07-13 2023-08-11 卡本(深圳)医疗器械有限公司 Intelligent trans-perineal prostate puncture anesthesia system based on multi-mode medical image
CN116580820B (en) * 2023-07-13 2023-12-22 卡本(深圳)医疗器械有限公司 Intelligent trans-perineal prostate puncture anesthesia system based on multi-mode medical image

Also Published As

Publication number Publication date
CN111340861B (en) 2022-09-27

Similar Documents

Publication Publication Date Title
US8425418B2 (en) Method of ultrasonic imaging and biopsy of the prostate
CN110060774B (en) Thyroid nodule identification method based on generative confrontation network
US11937973B2 (en) Systems and media for automatically diagnosing thyroid nodules
CN110338840B (en) Three-dimensional imaging data display processing method and three-dimensional ultrasonic imaging method and system
CN106890031B (en) Marker identification and marking point positioning method and operation navigation system
CN111179227B (en) Mammary gland ultrasonic image quality evaluation method based on auxiliary diagnosis and subjective aesthetics
CN110464459A (en) Intervention plan navigation system and its air navigation aid based on CT-MRI fusion
WO2022141882A1 (en) Lesion recognition model construction apparatus and system based on historical pathological information
Shen et al. Optimized prostate biopsy via a statistical atlas of cancer spatial distribution
Li et al. Learning image context for segmentation of prostate in CT-guided radiotherapy
CN112215844A (en) MRI (magnetic resonance imaging) multi-mode image segmentation method and system based on ACU-Net
JP2007061607A (en) Method for processing image including one object and one or more other objects, and system for processing image from image data
CN111340861B (en) Prostate nuclear magnetic ultrasonic image registration fusion method
Pan et al. Application of Three-Dimensional Coding Network in Screening and Diagnosis of Cervical Precancerous Lesions
CN110782434A (en) Intelligent marking and positioning device for brain tuberculosis MRI image focus
CN109801276A (en) A kind of method and device calculating ambition ratio
Hughes et al. Robust alignment of prostate histology slices with quantified accuracy
Xu et al. ROI-based intraoperative MR-CT registration for image-guided multimode tumor ablation therapy in hepatic malignant tumors
Ou et al. Non-rigid registration between histological and MR images of the prostate: A joint segmentation and registration framework
Liu et al. 3-D prostate MR and TRUS images detection and segmentation for puncture biopsy
CN110580697B (en) Video image processing method and system for measuring thickness of fetal nape transparency from ultrasonic video image
CN111127404B (en) Medical image contour rapid extraction method
CN105354842A (en) Contour key point registration and identification method based on stable area
Kadoury et al. A model-based registration approach of preoperative MRI with 3D ultrasound of the liver for Interventional guidance procedures
Kadoury et al. Realtime TRUS/MRI fusion targeted-biopsy for prostate cancer: a clinical demonstration of increased positive biopsy rates

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant