CN113592925B - Intraoperative ultrasonic image and contour real-time registration method and system thereof - Google Patents


Info

Publication number
CN113592925B
CN113592925B (application number CN202110806507.6A)
Authority
CN
China
Prior art keywords
image
real-time
objective function
slice
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110806507.6A
Other languages
Chinese (zh)
Other versions
CN113592925A (en)
Inventor
陶波
陈特儒
赵兴炜
丁汉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN202110806507.6A priority Critical patent/CN113592925B/en
Publication of CN113592925A publication Critical patent/CN113592925A/en
Application granted granted Critical
Publication of CN113592925B publication Critical patent/CN113592925B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/344Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30081Prostate

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention belongs to the technical field of image registration and discloses a method and a system for real-time registration of an intraoperative ultrasound image and its contour. The method comprises the following steps: acquiring a volume image of the region to be operated on; during the operation, imaging the target region with an ultrasound probe to obtain a real-time image slice, and obtaining the image at the corresponding position from the volume image as a reference image; performing analytic gradient calculation on the objective function to achieve rigid registration and obtain preliminary deformation-field parameters; performing analytic gradient calculation on the objective function within the FFD model to carry out non-rigid registration and obtain local deformation-field parameters; and applying the preliminary deformation-field parameters and the local deformation-field parameters to the contour of the reference image to obtain the contour of the real-time image slice, then overlaying this contour on the real-time image slice to obtain the real-time image and its contour. The method markedly shortens the image registration time, improves image registration efficiency by nearly 100 times, and has great application value.

Description

Intraoperative ultrasonic image and contour real-time registration method and system thereof
Technical Field
The invention belongs to the technical field of image registration, and particularly relates to a method and a system for real-time registration of an intraoperative ultrasound image and its contour.
Background
Prostate disease is the most common disease in middle-aged and elderly men, and prostate cancer is the most common non-skin cancer as well as the second leading cause of cancer-related death in men. Early-stage prostate cancer can be effectively diagnosed and controlled, and survival with non-metastatic prostate cancer is long, whereas prostate cancer cannot be cured once it has metastasized. With economic development and an accelerating pace of life, prostate disease increasingly affects younger men and its prevalence rises year by year.
Transrectal ultrasound (TRUS) guidance is currently an effective real-time guidance modality for treating prostate disease, including ultrasound examination of the prostate, prostate puncture biopsy, ultrasound-guided radio-frequency ablation of the prostate, and the like. Because of interference from the external environment, deformation of the prostate during guidance is unavoidable, so compensating for this deformation is particularly important. Transrectally guided robotic surgery (e.g., radical prostatectomy, prostate puncture biopsy) is maturing, but medical imaging techniques for real-time tracking of prostate deformation are still lacking, and the ultrasound images obtained by the probe must be registered with the contour of the reference image to track the surgical site. In the prior art, numerical optimization algorithms are used for image registration; the registration time is generally several seconds, so the doctor must wait during the operation, which markedly prolongs the operation time.
Disclosure of Invention
In view of the above defects or improvement needs of the prior art, the invention provides a method for real-time registration of an intraoperative ultrasound image and its contour. By optimizing the analytic computation of the objective function in image registration, the image registration time can reach below 100 ms and even 50 ms; the registration time is thus markedly shortened, registration efficiency is improved by nearly 100 times, and the method has great application value.
To achieve the above object, according to one aspect of the present invention, there is provided a method for real-time registration of an intraoperative ultrasound image and its contour, the method comprising: S1: acquiring a volume image $V_R$ of the region to be operated on; S2: during the operation, imaging a target region within the operation area with an ultrasound probe to obtain a real-time image slice $I_t$, and at the same time obtaining the image at the corresponding position from the volume image $V_R$ as a reference image $I_R$; S3: performing analytic gradient calculation on the objective function characterizing the similarity measure between the real-time image slice $I_t$ and the reference image $I_R$ to achieve rigid registration and obtain preliminary deformation-field parameters; S4: performing analytic gradient calculation, within the FFD model, on the objective function for the image processed in step S3 so as to carry out non-rigid registration and obtain local deformation-field parameters; S5: applying the preliminary deformation-field parameters and the local deformation-field parameters to the contour of the reference image $I_R$ to obtain the contour of the real-time image slice $I_t$, and overlaying the contour of the real-time image slice $I_t$ on the real-time image slice $I_t$ to obtain the real-time image and its contour.
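A minimal, self-contained sketch of how steps S1–S5 chain together is given below. The function names, the slice-index stand-in for the probe pose, and the zero-returning registration stubs are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def extract_reference_slice(volume, slice_index):
    """S2 (simplified): pull the reference slice I_R out of the volume V_R.
    The probe pose is reduced to a single slice index for illustration."""
    return volume[:, :, slice_index].astype(np.float64)

def rigid_register(reference, live):
    """S3 stub: would return the six rigid parameters (x, y, z, w, theta, phi)
    found by analytic-gradient iteration; this placeholder returns zeros."""
    return np.zeros(6)

def ffd_register(reference, live, rigid_params):
    """S4 stub: would return the B-spline control-point offsets of the FFD
    model; this placeholder returns a zero 4 x 4 in-plane control grid."""
    return np.zeros((4, 4, 2))

def register_realtime_slice(volume, slice_index, live_slice, ref_contour_xy):
    """Hypothetical wiring of steps S1-S5 (not the patented implementation)."""
    reference = extract_reference_slice(volume, slice_index)        # S2
    rigid = rigid_register(reference, live_slice)                   # S3
    local = ffd_register(reference, live_slice, rigid)              # S4
    # S5 (simplified): shift the reference contour by the in-plane part of
    # the rigid motion; the full method also adds the local FFD displacement.
    return ref_contour_xy + rigid[:2]

if __name__ == "__main__":
    volume = np.random.rand(64, 64, 32)
    live = np.random.rand(64, 64)
    contour = np.array([[10.0, 12.0], [20.0, 15.0], [25.0, 30.0]])
    print(register_realtime_slice(volume, 16, live, contour))
```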
Preferably, step S3 specifically includes: obtaining the first-order and second-order gradients of the objective function in each deformation direction, and iterating with an optimization algorithm to obtain the preliminary deformation-field parameters $(x, y, z, w, \theta, \varphi)$, where $x$, $y$ and $z$ are the three translational degrees of freedom and $w$, $\theta$ and $\varphi$ are the three rotational degrees of freedom; the deformation directions are the six directions $x, y, z, w, \theta, \varphi$.
Preferably, the first-order gradient of the objective function in each deformation direction is:

$$\frac{\partial D}{\partial k} = \sum_{j=1}^{3} G_j\, J_{k,j},$$

where $k$ is one of $x, y, z, w, \theta, \varphi$; $\partial D/\partial k$ denotes the first-order gradient of the objective function in the $k$ direction; $j = 1, 2, 3$ denote the $x$, $y$ and $z$ directions; $G_j$ is the gradient of the objective function $D$ in the $j$ direction; and $J_{k,j}$ is the value at row $k$, column $j$ of the Jacobian matrix $J = [J_{k,j}] \in \mathbb{R}^{3\times 6}$ of the rigid deformation $T$.
Preferably, the second-order gradient of the objective function in each deformation direction, $\partial^2 D/(\partial k\,\partial m)$, is obtained analytically from the following quantities: $m$ is one of $x, y, z, w, \theta, \varphi$; $i = 1, 2, 3$ denote the $x$, $y$ and $z$ directions; $G_i$ is the gradient of the objective function $D$ in the $i$ direction; $J_{k,i}$ is the value at row $k$, column $i$ of the Jacobian matrix $J = [J_{k,i}] \in \mathbb{R}^{3\times 6}$ of the rigid deformation $T$; and $J_{m,j}$ is the value at row $m$, column $j$ of the Jacobian matrix $J = [J_{m,j}] \in \mathbb{R}^{3\times 6}$.
Preferably, the step S4 specifically includes:
carrying out a first-order analytic gradient calculation, in the FFD model, for each pixel point in the image in each direction, with the analytic formula expressed in terms of the following quantities: $X = [x', y', z']$ is the position of the pixel point in the image; $k$ is the direction, taken from the set $v = \{x, y, z, w, \theta, \varphi\}$; $\mu_k$ is the component of the control point in the $k$ direction; $\forall X$ denotes every pixel of the image; $\partial D/\partial k$ is the gradient of the objective function $D$ in the $k$ direction; the partial derivative of the FFD model with respect to the control-point coordinate is taken along the $k$ direction; $\lambda_j$ is the position of the nearby control point; $\Delta\rho$ is the interval between control points; and $\beta^{(3)} = \beta^{(3)}(x')\,\beta^{(3)}(y')\,\beta^{(3)}(z')$ is the B-spline basis function in the three directions $x'$, $y'$ and $z'$.
Preferably, step S4 further includes performing a second-order analytic gradient calculation for each pixel point in the image in each direction in the FFD model, in terms of the following quantities: $\partial D/\partial m$ is the gradient of the objective function $D$ in the $m$ direction; the partial derivative of the cubic B-spline model in the FFD model is taken with respect to the control-point coordinate along the $m$ direction; and $\mu_m$ is the component of the control point in the $m$ direction.
Preferably, in steps S3 and/or S4 the iterative analytic-gradient computation is performed with a quasi-Newton method or a trust-region method until the objective function converges or the variation of the objective function is smaller than a preset value.
Preferably, step S1 further comprises segmenting the volume image $V_R$, and the target region in step S2 is one of the regions obtained by the segmentation in step S1.
Preferably, step S2 further comprises adjusting the resolution of the ultrasound reference image $I_R$ taken from the volume image $V_R$ to be the same as that of the real-time image slice, and performing gray-scale normalization on the images. Another aspect of the present application provides a system for real-time registration of an intraoperative ultrasound image and its contour, the system comprising: an acquisition module for acquiring a volume image $V_R$ of the region to be operated on; an imaging module for imaging, during the operation, a target region in the region to be operated on with the ultrasound probe to obtain a real-time image slice $I_t$, and at the same time obtaining the image at the corresponding position from the volume image $V_R$ as the reference image $I_R$; a rigid registration module for performing analytic gradient calculation on the objective function characterizing the similarity measure between the real-time image slice $I_t$ and the reference image $I_R$ to achieve rigid registration and obtain preliminary deformation-field parameters; a non-rigid registration module for performing analytic gradient calculation, within the FFD model, on the objective function for the image processed by the rigid registration module so as to carry out non-rigid registration and obtain local deformation-field parameters; and a superposition module for applying the preliminary deformation-field parameters and the local deformation-field parameters to the contour of the reference image $I_R$ to obtain the contour of the real-time image slice $I_t$, and overlaying the contour of the real-time image slice $I_t$ on the real-time image slice $I_t$ to obtain the real-time image and its contour.
In general, compared with the prior art, the intraoperative ultrasound image and contour real-time registration method and system of the invention have the following beneficial effects:
1. The method and system use an analytic gradient algorithm for the iterative computation of the objective function to achieve both rigid registration and non-rigid registration; a single iteration can compute the variation in multiple directions, which markedly shortens the registration time and improves registration efficiency by nearly 100 times.
2. Both the rigid registration and the non-rigid registration use second-order gradient iterative computation, which markedly reduces the amount of iterative computation, while the gradient computation characterizes the trend of the objective function well, yielding better computational accuracy.
3. The method performs rigid registration first and then non-rigid registration; this coarse-to-fine registration strategy improves the accuracy and robustness of the registration.
Drawings
FIG. 1 is a step diagram of a method for real-time registration of an intraoperative ultrasound image and its profile of the present embodiment;
FIG. 2 is a flow chart of a method of real-time registration of an intraoperative ultrasound image and its profile of the present embodiment;
FIGS. 3A-3D are schematic views of four frames of real-time images and contours on a horizontal plane on a prostate phantom according to this embodiment;
fig. 4A to 4D are schematic views of four frames of real-time images and contours on the sagittal plane on the prostate phantom according to this embodiment.
Detailed Description
The present invention will be described in further detail below with reference to the drawings and embodiments, in order to make the objects, technical solutions and advantages of the present invention clearer. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. In addition, the technical features of the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The application provides a method for real-time registration of an intraoperative ultrasound image and its contour, which can be divided into an image acquisition stage, an image registration stage and a deformation prediction stage. The image acquisition stage acquires the volume image $V_R$. The image registration stage comprises a rigid registration stage and a non-rigid registration stage, both of which use an analytic gradient algorithm to evaluate the objective function, which greatly improves computational efficiency. The deformation prediction stage applies the deformation parameters from the rigid and non-rigid registration stages to the contour of the reference image in the volume image $V_R$; the resulting contour is the contour of the real-time image. The method of the present application is described in detail below with reference to FIG. 1 and FIG. 2, taking transrectal ultrasound (TRUS) treatment of the prostate as an example; the specific steps S1 to S5 are as follows.
S1: acquiring a volume image $V_R$ of the region to be operated on.
Before the operation, the volume image $V_R$ of the prostate region is obtained by transrectal ultrasound (TRUS) guided scanning together with the accompanying encoder information. If the obtained volume image $V_R$ is too large, it can be further segmented into a plurality of regions; the segmentation can be performed manually or with a deep-learning-based method.
In concrete terms, the TRUS probe is driven by a guide frame fitted with an encoder; slice images of the ultrasound probe at each position are obtained from the encoder information and the ultrasound video stream, and the slice images are interpolated, with resolution adjustment, into the three-dimensional image $V_R$, whose pixel pitch is 0.5 × 0.5 × 0.5 mm. At the same time, the doctor segments the prostate shape on the slices, and the contour information of the prostate at each slice is saved as an image template corresponding to that slice.
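As an illustration of this resampling step, the sketch below interpolates an encoder-indexed slice stack to the 0.5 × 0.5 × 0.5 mm pixel pitch mentioned above; the input spacings are assumed values, and scipy's `zoom` is just one possible interpolator:

```python
import numpy as np
from scipy.ndimage import zoom

def resample_to_isotropic(volume, spacing_mm, target_mm=0.5):
    """Resample an H x W x D ultrasound volume to an isotropic pixel pitch.

    volume     : ndarray assembled from the encoder-indexed TRUS slices
    spacing_mm : (row, col, slice) spacing of the raw volume in mm (assumed)
    target_mm  : desired isotropic spacing, 0.5 mm as in the embodiment
    """
    factors = [s / target_mm for s in spacing_mm]
    return zoom(volume.astype(np.float32), factors, order=1)  # linear interpolation

# Example with made-up spacings: 0.3 mm in-plane, 1.0 mm between slices.
raw = np.random.rand(200, 150, 60).astype(np.float32)
iso = resample_to_isotropic(raw, spacing_mm=(0.3, 0.3, 1.0))
print(iso.shape)   # roughly (120, 90, 120)
```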
S2: during the operation, a target region in the operation area is imaged with the ultrasound probe to obtain a real-time image slice $I_t$; at the same time, the image at the corresponding position is obtained from the volume image $V_R$ as the reference image $I_R$.
During the operation, the target region in the prostate is imaged with the TRUS ultrasound probe to obtain a real-time image slice $I_t$; at the same time, the ultrasound reference image $I_R$ in the volume image $V_R$ is obtained using the guide-frame encoder information. The resolution of the ultrasound reference image $I_R$ is adjusted to be the same as that of the real-time image slice, and the images are gray-scale normalized.
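A minimal sketch of the resolution matching and gray-scale normalization described here is shown below; the spacing arguments and the 0–255 target range follow the embodiment, while the function names are assumptions:

```python
import numpy as np
from scipy.ndimage import zoom

def match_and_normalize(reference, realtime, ref_spacing, rt_spacing):
    """Bring the reference slice I_R to the pixel spacing of the real-time
    slice I_t and normalize both to the 0-255 gray range used in the
    embodiment. Spacings are (row, col) in mm and are assumed inputs."""
    factors = (ref_spacing[0] / rt_spacing[0], ref_spacing[1] / rt_spacing[1])
    reference = zoom(reference.astype(np.float32), factors, order=1)

    def to_gray_0_255(img):
        img = img.astype(np.float32)
        lo, hi = img.min(), img.max()
        # flat images map to zero; otherwise stretch linearly to [0, 255]
        return np.zeros_like(img) if hi == lo else (img - lo) / (hi - lo) * 255.0

    return to_gray_0_255(reference), to_gray_0_255(realtime)
```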
In this embodiment, the ultrasound probe may be a biplane or a monoplane probe, and the volume image of the prostate is interpolated from the ultrasound planar slices with a resolution of no less than 0.5 × 0.5 × 0.5 mm.
S3: performing analytic gradient calculation on the objective function characterizing the similarity measure between the real-time image slice $I_t$ and the reference image $I_R$ to achieve rigid registration and obtain the preliminary deformation-field parameters.
Specifically, the first-order and second-order gradients of the objective function in each deformation direction are obtained, and an optimization algorithm is used to iterate to obtain the preliminary deformation-field parameters $(x, y, z, w, \theta, \varphi)$, where $x$, $y$ and $z$ are the three translational degrees of freedom and $w$, $\theta$ and $\varphi$ are the three rotational degrees of freedom; the deformation directions are the six directions $x, y, z, w, \theta, \varphi$.
The first-order gradient of the objective function in each deformation direction is:

$$\frac{\partial D}{\partial k} = \sum_{j=1}^{3} G_j\, J_{k,j},$$

where $k$ is one of $x, y, z, w, \theta, \varphi$; $\partial D/\partial k$ denotes the first-order gradient of the objective function in the $k$ direction; $j = 1, 2, 3$ denote the $x$, $y$ and $z$ directions; $G_j$ is the gradient of the objective function $D$ in the $j$ direction; and $J_{k,j}$ is the value at row $k$, column $j$ of the Jacobian matrix $J = [J_{k,j}] \in \mathbb{R}^{3\times 6}$ of the rigid deformation $T$.
The second-order gradient of the objective function in each deformation direction, $\partial^2 D/(\partial k\,\partial m)$, is obtained analytically from the following quantities: $m$ is one of $x, y, z, w, \theta, \varphi$; $i = 1, 2, 3$ denote the $x$, $y$ and $z$ directions; $G_i$ is the gradient of the objective function $D$ in the $i$ direction; $J_{k,i}$ is the value at row $k$, column $i$ of the Jacobian matrix $J = [J_{k,i}] \in \mathbb{R}^{3\times 6}$ of the rigid deformation $T$; and $J_{m,j}$ is the value at row $m$, column $j$ of the Jacobian matrix $J = [J_{m,j}] \in \mathbb{R}^{3\times 6}$.
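For illustration, the sketch below evaluates an SSD objective together with its analytic first-order gradient and a Gauss-Newton-style second-order term for a planar rigid transform. It follows the same chain-rule construction (image gradient times the Jacobian of the rigid transform), but it is reduced to 3 degrees of freedom in 2-D and is an assumption, not the patent's exact 6-DOF formulation:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def rigid_ssd_grad_hess(fixed, moving, params):
    """Evaluate an SSD cost, its analytic gradient and a Gauss-Newton
    second-order term for a planar rigid transform (tx, ty, theta)."""
    fixed = np.asarray(fixed, dtype=np.float64)
    moving = np.asarray(moving, dtype=np.float64)
    tx, ty, th = params
    h, w = fixed.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)

    # rigid map T(x): rotation about the image centre followed by translation
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    xr, yr = xx - cx, yy - cy
    xs = np.cos(th) * xr - np.sin(th) * yr + cx + tx   # sampling x in moving
    ys = np.sin(th) * xr + np.cos(th) * yr + cy + ty   # sampling y in moving

    warped = map_coordinates(moving, [ys, xs], order=1, mode='nearest')
    r = warped - fixed                                  # residual image

    # gradient of the moving image, sampled at the warped positions
    gy, gx = np.gradient(moving)
    gxs = map_coordinates(gx, [ys, xs], order=1, mode='nearest')
    gys = map_coordinates(gy, [ys, xs], order=1, mode='nearest')

    # per-pixel product of image gradient and Jacobian of T w.r.t. (tx, ty, th)
    dx_dth = -np.sin(th) * xr - np.cos(th) * yr
    dy_dth = np.cos(th) * xr - np.sin(th) * yr
    J = np.stack([gxs, gys, gxs * dx_dth + gys * dy_dth], axis=-1)

    cost = np.sum(r ** 2)
    grad = 2.0 * np.einsum('ij,ijk->k', r, J)           # first-order gradient
    hess = 2.0 * np.einsum('ijk,ijl->kl', J, J)         # Gauss-Newton Hessian
    return cost, grad, hess
```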
In this embodiment, the objective function may be the sum of squared differences (SSD), the sum of absolute differences (SAD), normalized cross-correlation (NCC), mutual information (MI), or the like, computed on the image pair.
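These similarity measures are standard; a minimal reference implementation of three of them is sketched below (SSD and SAD are dissimilarity measures to be minimized, NCC a similarity to be maximized):

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences (smaller means more similar)."""
    d = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sum(d * d))

def sad(a, b):
    """Sum of absolute differences."""
    return float(np.sum(np.abs(a.astype(np.float64) - b.astype(np.float64))))

def ncc(a, b):
    """Normalized cross-correlation in [-1, 1] (larger means more similar)."""
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt(np.sum(a * a) * np.sum(b * b))
    return 0.0 if denom == 0 else float(np.sum(a * b) / denom)
```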
S4: and (3) carrying out gradient analysis calculation on the objective function in the FFD model on the image processed in the step (S3) so as to carry out non-rigid registration and further obtain local deformation field parameters.
The FFD model is also called a cubic B-spline model, and its expression is as follows:
where $X = [x', y', z']$ is the position of the pixel point in the image, $c_i$ are the interpolation coefficients, $X_i$ is the position of the control point in pixels, and $\beta^{(3)} = \beta^{(3)}(x')\,\beta^{(3)}(y')\,\beta^{(3)}(z')$ is the B-spline basis function in the three directions $x'$, $y'$ and $z'$; here $\beta^{(3)}(x')$, $\beta^{(3)}(y')$ and $\beta^{(3)}(z')$ can be written in the unified form $\beta^{(3)}(p)$, where $p$ is $x'$, $y'$ or $z'$.
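The patent's own piecewise expression for $\beta^{(3)}$ is not reproduced above, so the sketch below uses the conventional cubic B-spline basis (an assumption consistent with the "cubic B-spline" naming) together with its separable tensor-product form:

```python
import numpy as np

def beta3(p):
    """Conventional cubic B-spline basis beta^(3)(p): support on |p| < 2."""
    p = np.abs(np.asarray(p, dtype=np.float64))
    out = np.zeros_like(p)
    inner = p < 1.0
    outer = (p >= 1.0) & (p < 2.0)
    out[inner] = 2.0 / 3.0 - p[inner] ** 2 + 0.5 * p[inner] ** 3
    out[outer] = (2.0 - p[outer]) ** 3 / 6.0
    return out

def beta3_3d(xp, yp, zp):
    """Separable 3-D tensor product beta3(x') * beta3(y') * beta3(z')."""
    return beta3(xp) * beta3(yp) * beta3(zp)

print(beta3([0.0, 0.5, 1.0, 1.5, 2.0]))  # approx [0.667, 0.479, 0.167, 0.021, 0.0]
```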
The step S4 specifically includes:
carrying out a first-order analytic gradient calculation, in the FFD model, for each pixel point in the image in each direction, with the analytic formula expressed in terms of the following quantities: $X = [x', y', z']$ is the position of the pixel point in the image; $k$ is the direction, taken from the set $v = \{x, y, z, w, \theta, \varphi\}$; $\mu_k$ is the component of the control point in the $k$ direction; $\forall X$ denotes every pixel of the image; $\partial D/\partial k$ is the gradient of the objective function $D$ in the $k$ direction; the partial derivative of the FFD model with respect to the control-point coordinate is taken along the $k$ direction; $\lambda_j$ is the position of the nearby control point; $\Delta\rho$ is the interval between control points; and $\beta^{(3)} = \beta^{(3)}(x')\,\beta^{(3)}(y')\,\beta^{(3)}(z')$ is the B-spline basis function in the three directions $x'$, $y'$ and $z'$.
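A simplified 2-D sketch of this per-control-point gradient accumulation is given below: for an SSD objective it sums, over all pixels, the residual times the image gradient times the tensor-product B-spline weight of each control point. The regular control grid, the 2-D reduction and the function signature are assumptions:

```python
import numpy as np

def beta3(p):
    """Conventional cubic B-spline basis (same assumed form as above)."""
    p = np.abs(np.asarray(p, dtype=np.float64))
    out = np.zeros_like(p)
    inner = p < 1.0
    outer = (p >= 1.0) & (p < 2.0)
    out[inner] = 2.0 / 3.0 - p[inner] ** 2 + 0.5 * p[inner] ** 3
    out[outer] = (2.0 - p[outer]) ** 3 / 6.0
    return out

def ffd_ssd_gradient(residual, grad_x, grad_y, grid_shape, spacing):
    """Analytic SSD gradient w.r.t. the FFD control-point offsets (2-D sketch).

    residual   : warped-minus-fixed image, shape (H, W)
    grad_x/y   : gradient of the moving image sampled at the warped positions
    grid_shape : (n_cy, n_cx) number of control points (assumed regular layout)
    spacing    : control-point interval in pixels (the Delta-rho of the text)
    Returns an (n_cy, n_cx, 2) array of d(SSD)/d(mu).
    """
    H, W = residual.shape
    yy, xx = np.mgrid[0:H, 0:W].astype(np.float64)
    n_cy, n_cx = grid_shape
    grad = np.zeros((n_cy, n_cx, 2))
    for cy in range(n_cy):
        for cx in range(n_cx):
            # tensor-product basis weight of this control point at every pixel
            w = beta3((yy - cy * spacing) / spacing) * \
                beta3((xx - cx * spacing) / spacing)
            grad[cy, cx, 0] = np.sum(2.0 * residual * grad_x * w)  # x-offset
            grad[cy, cx, 1] = np.sum(2.0 * residual * grad_y * w)  # y-offset
    return grad
```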
Step S4 further includes performing a second-order analytic gradient calculation for each pixel point in the image in each direction in the FFD model, in terms of the following quantities: $\partial D/\partial m$ is the gradient of the objective function $D$ in the $m$ direction; the partial derivative of the cubic B-spline model in the FFD model is taken with respect to the control-point coordinate along the $m$ direction; and $\mu_m$ is the component of the control point in the $m$ direction.
In steps S3 and/or S4, the iterative analytic-gradient computation is performed with a quasi-Newton method or a trust-region method until the objective function converges or the variation of the objective function is smaller than a preset value. The convergence rate of the two stages ensures that the objective function converges in about 6 iterations; the stopping condition can initially be set as the number of iterations reaching 10 or more, or the change of the objective function being less than $10^{-5}$, at which point the preliminary deformation-field parameters and the local deformation-field parameters are obtained.
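The sketch below shows a generic second-order iteration with exactly this stopping rule (at most 10 iterations or an objective change below $10^{-5}$); it uses a damped Newton step as a stand-in for the quasi-Newton or trust-region update named in the text:

```python
import numpy as np

def newton_iterate(eval_fn, params0, max_iters=10, tol=1e-5, damping=1e-6):
    """Second-order iteration with the stopping rule described above: stop
    after max_iters iterations or when the objective changes by less than
    tol. eval_fn(params) must return (cost, gradient, Hessian), for example
    lambda p: rigid_ssd_grad_hess(fixed, moving, p) from the earlier sketch."""
    params = np.asarray(params0, dtype=np.float64)
    prev_cost = np.inf
    for _ in range(max_iters):
        cost, grad, hess = eval_fn(params)
        if abs(prev_cost - cost) < tol:        # objective change below tol
            break
        # Levenberg-style damping keeps the step defined if hess is singular
        step = np.linalg.solve(hess + damping * np.eye(len(params)), grad)
        params = params - step
        prev_cost = cost
    return params
```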
S5: applying the preliminary deformation-field parameters and the local deformation-field parameters to the contour of the reference image $I_R$ to obtain the contour of the real-time image slice $I_t$, and overlaying the contour of the real-time image slice $I_t$ on the real-time image slice $I_t$ to obtain the real-time image and its contour.
The preliminary deformation-field parameters and the local deformation-field parameters are obtained by registering the reference image with the real-time image. Applying the two sets of parameters to the contour of the reference image yields the contour of the real-time image slice, and overlaying this contour on the real-time image slice yields the real-time image with its contour.
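A minimal sketch of this contour transfer is shown below: the reference contour points are moved by the rigid parameters and then by the local displacement field sampled at each point. The 2-D reduction, the dense-field representation of the FFD result and the reference-to-live direction are assumptions:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_contour(contour_xy, rigid_params, disp_x, disp_y):
    """Push reference-contour points through the recovered deformation:
    first the rigid part (tx, ty, theta about the origin for simplicity),
    then the local B-spline displacement sampled at each point."""
    tx, ty, th = rigid_params
    x, y = contour_xy[:, 0], contour_xy[:, 1]
    xr = np.cos(th) * x - np.sin(th) * y + tx
    yr = np.sin(th) * x + np.cos(th) * y + ty
    # sample the dense local displacement field at the rigidly-moved points
    dx = map_coordinates(disp_x, [yr, xr], order=1, mode='nearest')
    dy = map_coordinates(disp_y, [yr, xr], order=1, mode='nearest')
    return np.stack([xr + dx, yr + dy], axis=1)

# toy usage: a zero local field leaves only the rigid motion
contour = np.array([[10.0, 20.0], [15.0, 25.0], [30.0, 18.0]])
zeros = np.zeros((64, 64))
print(warp_contour(contour, (2.0, -1.0, 0.0), zeros, zeros))
```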
This embodiment can be implemented on a Matlab platform using mixed Matlab and C++ programming: computation-heavy parts such as the deformed-image calculation are implemented in C++, while the optimization algorithm is implemented in Matlab. An image-reading function is called to read the image data and the encoder parameters of the ultrasound probe; the three-dimensional ultrasound volume image $V_R$ of the prostate is stored as an $H \times W \times D$ matrix $X$, where $H$ is the length of the ultrasound volume image, $W$ its width and $D$ the number of slices. The embodiment normalizes the image gray scale to between 0 and 255 for standardization.
The verification process is as follows:
the experiment is completed by adopting an ultrasonic probe and an ultrasonic phantom, the ultrasonic probe images the ultrasonic phantom in real time, and pressure is applied to the ultrasonic phantom from the outside, so that the ultrasonic phantom is deformed, the algorithm deformation is predicted, and the accuracy and the instantaneity of the algorithm are verified. The experimental process is shown in figures 3A-3D and figures 4A-4D, and the black outline is the real-time outline of the prostate. The body model experiment proves that the registration time of the existing algorithm is 100ms or less and even 50ms (5 s before optimization), the solving speed is increased by nearly 100 times, the real-time requirement is met, and meanwhile, the average contour registration precision is 2mm or less, so that a good precision effect is achieved.
Another aspect of the present application provides a system for real-time registration of an intraoperative ultrasound image and its contour, the system comprising:
the acquisition module, for example, may perform step S1 in fig. 1 for acquiring a volumetric image V of the region to be operated on R
An imaging module, for example, may perform step S2 of fig. 1 for use in a surgical procedure using an ultrasound probe pairImaging a target area in the area to be operated to obtain a real-time image slice I t At the same time from the volume image V R Wherein the image of the corresponding position is obtained as reference image I R
A rigid registration module, for example, may perform step S3 in fig. 1 for characterizing the real-time image slice I t And the reference image I R Gradient analysis calculation is carried out on the objective function of the similarity measurement to realize rigid registration so as to obtain primary deformation field parameters;
the non-rigid registration module may, for example, execute step S4 in fig. 1, and is configured to perform gradient analysis calculation on the objective function in the FFD model on the image processed by the rigid registration module so as to perform non-rigid registration, thereby obtaining local deformation field parameters;
a superposition module, for example, may perform step S5 in fig. 1 for superposing the preliminary deformation field parameters and the local deformation field parameters on the reference image I R Is to obtain the real-time image slice I t Is to slice the real-time image into slices I t Contour of (1) and real-time image slice I t And superposing to obtain a real-time image and the outline thereof.
In summary, the invention provides a method for real-time registration of an intraoperative ultrasound image and its contour. By optimizing the analytic computation of the objective function in image registration, the image registration time can reach below 100 ms and even 50 ms; the registration time is thus markedly shortened, registration efficiency is improved by nearly 100 times, and the method has great application value.
It will be readily appreciated by those skilled in the art that the foregoing description is merely a preferred embodiment of the invention and is not intended to limit the invention, but any modifications, equivalents, improvements or alternatives falling within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (9)

1. A method for real-time registration of an intraoperative ultrasound image and its contour, wherein the method comprises the following steps:
S1: acquiring a volume image $V_R$ of the region to be operated on;
S2: during the operation, imaging a target region in the operation area with an ultrasound probe to obtain a real-time image slice $I_t$, and at the same time obtaining the image at the corresponding position from the volume image $V_R$ as the reference image $I_R$;
S3: performing analytic gradient calculation on the objective function characterizing the similarity measure between the real-time image slice $I_t$ and the reference image $I_R$ to achieve rigid registration and obtain preliminary deformation-field parameters;
S4: performing analytic gradient calculation, within the FFD model, on the objective function for the image processed in step S3 so as to carry out non-rigid registration and obtain local deformation-field parameters;
the step S4 specifically comprises the following steps:
carrying out a first-order analytic gradient calculation, in the FFD model, for each pixel point in the image in each direction, with the analytic formula expressed in terms of the following quantities: $X = [x', y', z']$ is the position of the pixel point in the image; $k$ is the direction, taken from the set $v = \{x, y, z, w, \theta, \varphi\}$; $\mu_k$ is the component of the control point in the $k$ direction; $\forall X$ denotes every pixel of the image; $\partial D/\partial k$ is the gradient of the objective function $D$ in the $k$ direction; the partial derivative of the FFD model with respect to the control-point coordinate is taken along the $k$ direction; $\lambda_j$ is the position of the nearby control point; $\Delta\rho$ is the interval between control points; and $\beta^{(3)} = \beta^{(3)}(x')\,\beta^{(3)}(y')\,\beta^{(3)}(z')$ is the B-spline basis function in the three directions $x'$, $y'$ and $z'$;
S5: applying the preliminary deformation-field parameters and the local deformation-field parameters to the contour of the reference image $I_R$ to obtain the contour of the real-time image slice $I_t$, and overlaying the contour of the real-time image slice $I_t$ on the real-time image slice $I_t$ to obtain the real-time image and its contour.
2. The method according to claim 1, wherein the step S3 specifically includes:
obtaining the first-order and second-order gradients of the objective function in each deformation direction, and iterating with an optimization algorithm to obtain the preliminary deformation-field parameters $(x, y, z, w, \theta, \varphi)$, where $x$, $y$ and $z$ are the three translational degrees of freedom and $w$, $\theta$ and $\varphi$ are the three rotational degrees of freedom; the deformation directions are the six directions $x, y, z, w, \theta, \varphi$.
3. The method according to claim 2, wherein the first-order gradient of the objective function in each deformation direction is:

$$\frac{\partial D}{\partial k} = \sum_{j=1}^{3} G_j\, J_{k,j},$$

where $k$ is one of $x, y, z, w, \theta, \varphi$; $\partial D/\partial k$ denotes the first-order gradient of the objective function in the $k$ direction; $j = 1, 2, 3$ denote the $x$, $y$ and $z$ directions; $G_j$ is the gradient of the objective function $D$ in the $j$ direction; and $J_{k,j}$ is the value at row $k$, column $j$ of the Jacobian matrix $J = [J_{k,j}] \in \mathbb{R}^{3\times 6}$ of the rigid deformation $T$.
4. The method according to claim 3, wherein the second-order gradient of the objective function in each deformation direction, $\partial^2 D/(\partial k\,\partial m)$, is obtained analytically from the following quantities: $m$ is one of $x, y, z, w, \theta, \varphi$; $i = 1, 2, 3$ denote the $x$, $y$ and $z$ directions; $G_i$ is the gradient of the objective function $D$ in the $i$ direction; $J_{k,i}$ is the value at row $k$, column $i$ of the Jacobian matrix $J = [J_{k,i}] \in \mathbb{R}^{3\times 6}$ of the rigid deformation $T$; and $J_{m,j}$ is the value at row $m$, column $j$ of the Jacobian matrix $J = [J_{m,j}] \in \mathbb{R}^{3\times 6}$.
5. The method according to claim 1, wherein step S4 further includes performing a second-order analytic gradient calculation for each pixel point in the image in each direction in the FFD model, in terms of the following quantities: $\partial D/\partial m$ is the gradient of the objective function $D$ in the $m$ direction; the partial derivative of the cubic B-spline model in the FFD model is taken with respect to the control-point coordinate along the $m$ direction; and $\mu_m$ is the component of the control point in the $m$ direction.
6. The method according to claim 1, wherein in steps S3 and/or S4 the iterative analytic-gradient computation is performed with a quasi-Newton method or a trust-region method until the objective function converges or the variation of the objective function is smaller than a preset value.
7. The method according to claim 1, wherein step S1 further comprises segmenting the volume image $V_R$, and the target region in step S2 is one of the regions obtained by the segmentation in step S1.
8. The method according to claim 1, wherein step S2 further comprises adjusting the resolution of the ultrasound reference image $I_R$ taken from the volume image $V_R$ to be the same as that of the real-time image slice, and performing gray-scale normalization on the images.
9. A system for implementing the method for real-time registration of an intraoperative ultrasound image and its contour according to any one of claims 1 to 8, the system comprising:
an acquisition module for acquiring a volume image $V_R$ of the region to be operated on;
an imaging module for imaging, during the operation, a target region in the region to be operated on with the ultrasound probe to obtain a real-time image slice $I_t$, and at the same time obtaining the image at the corresponding position from the volume image $V_R$ as the reference image $I_R$;
a rigid registration module for performing analytic gradient calculation on the objective function characterizing the similarity measure between the real-time image slice $I_t$ and the reference image $I_R$ to achieve rigid registration and obtain preliminary deformation-field parameters;
a non-rigid registration module for performing analytic gradient calculation, within the FFD model, on the objective function for the image processed by the rigid registration module so as to carry out non-rigid registration and obtain local deformation-field parameters; and
a superposition module for applying the preliminary deformation-field parameters and the local deformation-field parameters to the contour of the reference image $I_R$ to obtain the contour of the real-time image slice $I_t$, and overlaying the contour of the real-time image slice $I_t$ on the real-time image slice $I_t$ to obtain the real-time image and its contour.
CN202110806507.6A 2021-07-16 2021-07-16 Intraoperative ultrasonic image and contour real-time registration method and system thereof Active CN113592925B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110806507.6A CN113592925B (en) 2021-07-16 2021-07-16 Intraoperative ultrasonic image and contour real-time registration method and system thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110806507.6A CN113592925B (en) 2021-07-16 2021-07-16 Intraoperative ultrasonic image and contour real-time registration method and system thereof

Publications (2)

Publication Number Publication Date
CN113592925A CN113592925A (en) 2021-11-02
CN113592925B (en) 2024-02-06

Family

ID=78248091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110806507.6A Active CN113592925B (en) 2021-07-16 2021-07-16 Intraoperative ultrasonic image and contour real-time registration method and system thereof

Country Status (1)

Country Link
CN (1) CN113592925B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113796952B (en) * 2021-11-18 2022-03-18 北京智愈医疗科技有限公司 Tissue resection system and cutting parameter determination method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106558073A (en) * 2016-11-23 2017-04-05 Shandong University Non-rigid image registration method based on image features and TV-L1
CN107871325A (en) * 2017-11-14 2018-04-03 South China University of Technology Image non-rigid registration method based on Log-Euclidean covariance matrix descriptors
CN108038848A (en) * 2017-12-07 2018-05-15 Shanghai Jiao Tong University Quick calculation method and system for plaque stability index based on a medical image sequence
CN110782428A (en) * 2019-09-20 2020-02-11 Zhejiang Future Technology Institute (Jiaxing) Method and system for constructing clinical brain CT image ROI template

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8532402B2 (en) * 2011-06-22 2013-09-10 The Boeing Company Image registration


Also Published As

Publication number Publication date
CN113592925A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
JP5335280B2 (en) Alignment processing apparatus, alignment method, program, and storage medium
CN104584074B (en) Coupled segmentation in 3D conventional and contrast-enhanced ultrasound images
Xiao et al. Nonrigid registration of 3-D free-hand ultrasound images of the breast
Ling et al. Hierarchical, learning-based automatic liver segmentation
CN110738701B (en) Tumor three-dimensional positioning system
Lee et al. Non-rigid registration between 3D ultrasound and CT images of the liver based on intensity and gradient information
US20050251029A1 (en) Radiation therapy treatment plan
CN102402788A (en) Method for segmenting three-dimensional ultrasonic image
CN114187338B (en) Organ deformation registration method based on estimated 2d displacement field
CN113592925B (en) Intraoperative ultrasonic image and contour real-time registration method and system thereof
Schalk et al. 3D surface-based registration of ultrasound and histology in prostate cancer imaging
CN112348794A (en) Ultrasonic breast tumor automatic segmentation method based on attention-enhanced U-shaped network
CN116363181A (en) Feature-based CT image and ultrasonic image liver registration method
François et al. Robust statistical registration of 3D ultrasound images using texture information
JP7378694B2 (en) Lung lobe segmentation method based on digital human technology
Erdt et al. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images
Mao et al. Direct 3d ultrasound fusion for transesophageal echocardiography
CN111166373B (en) Positioning registration method, device and system
Bhattacharjee et al. Non-rigid registration (computed tomography-ultrasound) of liver using B-splines and free form deformation
CN112884765A (en) 2D image and 3D image registration method based on contour features
CN112581428A (en) Multi-modal medical image auxiliary diagnosis method
Koenig et al. Automatic cropping of breast regions for registration in MR mammography
CN114300096B (en) TRUS image-based prostate activity apparent model building method
CN114757951B (en) Sign data fusion method, data fusion equipment and readable storage medium
Mitra et al. A non-linear diffeomorphic framework for prostate multimodal registration

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant