CN115984270A - Brain perfusion data analysis method, device and storage medium

Brain perfusion data analysis method, device and storage medium

Info

Publication number
CN115984270A
CN115984270A (application CN202310265682.8A)
Authority
CN
China
Prior art keywords
perfusion
contrast agent
image
agent concentration
perfusion data
Prior art date
Legal status
Pending
Application number
CN202310265682.8A
Other languages
Chinese (zh)
Inventor
刘伟奇
马学升
陈金钢
赵友源
庞盼
Current Assignee
Tongxin Zhiyi Technology Beijing Co ltd
Original Assignee
Tongxin Zhiyi Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Tongxin Zhiyi Technology Beijing Co ltd filed Critical Tongxin Zhiyi Technology Beijing Co ltd
Priority to CN202310265682.8A priority Critical patent/CN115984270A/en
Publication of CN115984270A publication Critical patent/CN115984270A/en
Pending legal-status Critical Current

Landscapes

  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The embodiment of the application discloses a method, a device and a storage medium for analyzing cerebral perfusion data, wherein the method for analyzing the cerebral perfusion data comprises the following steps: acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode; carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image; acquiring perfusion parameters based on the coded image and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters; based on the micro-tissue model, obtaining perfusion data corresponding to perfusion parameters, processing and analyzing the perfusion data to improve image quality, correct artifacts, optimize analysis of a perfusion value map and/or optimize a workflow, wherein the processing comprises motion correction, noise reduction, segmentation, conversion of contrast agent concentration, a local AIF algorithm and/or correction of partial volume effects in AIF.

Description

Brain perfusion data analysis method, device and storage medium
Technical Field
The application relates to the technical field of image processing, in particular to a method and a device for analyzing cerebral perfusion data and a storage medium.
Background
In clinical medicine, cerebrovascular diseases have many causative factors, and their classification and typing directly influence clinical diagnosis, treatment and prevention; correct classification and typing can therefore directly improve the treatment effect and the prognosis of patients.
Clinical diagnosis of cerebrovascular diseases relies mainly on imaging, and whole-brain angiography has long been the gold standard. However, because angiography is based on 2D planar imaging and depends on the imaging plane, it can only show lesions such as cerebral artery stenosis, aneurysms and vascular malformations; it cannot simultaneously show cerebrovascular compensation and adaptation, which directly determine whether the blood supply and perfusion of brain tissue are sufficient, nor the brain parenchymal structures, and the technique carries a risk of thromboembolism. In addition, the examination requires contrast media, which may aggravate renal impairment in some patients with concomitant kidney disease, and its reproducibility is limited. To determine the extent of the affected tissue and delineate ischemic tissue that can still be reperfused, perfusion imaging with CT and MRI has become a routine means of clinically examining a patient's cerebral blood flow perfusion. CT perfusion imaging (CTP) reflects hemodynamic changes, is simple to perform and requires a short examination time; magnetic resonance perfusion imaging (MRP) better reflects the perfusion of ischemic brain tissue and has higher sensitivity.
CTP and MRP can obtain a whole-brain perfusion image but cannot separately provide a perfusion image of an individual brain region. CT- or MRI-compatible angiography technology has emerged to fill this gap: the angiography equipment is integrated with the CT or MRI equipment, and CT or MRI perfusion scanning is performed while angiography is carried out, so that selective angiographic imaging can be provided and a CT or MRI perfusion image of a local region can also be obtained.
However, the data processing, identification and analysis required by such technologies (in which the angiography device is integrated with the CT or MRI device and the CT or MRI perfusion scan is performed during angiography) are complicated, and the size of the output after features are extracted by a convolutional neural network is often reduced, which is unfavorable for subsequent calculation. The existing CT and MRI brain perfusion data analysis techniques therefore suffer from a large workload, low working efficiency, and poor repeatability and consistency of the analysis results.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for analyzing brain perfusion data, and a storage medium, so as to solve the problems of large workload, low working efficiency, and poor repeatability and consistency of analysis results in the CT and MRI brain perfusion data analysis techniques in the prior art.
In order to achieve the above object, an embodiment of the present application provides a method for analyzing brain perfusion data, including the steps of: acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode;
carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image;
acquiring perfusion parameters based on the coded image and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters, wherein the micro-tissue model covers the organ-specific parenchyma, the interstitial space and the capillary bed;
and acquiring perfusion data corresponding to the perfusion parameters based on the micro-tissue model, and processing and analyzing the perfusion data to improve the image quality, correct artifacts, optimize the analysis of a perfusion value map and/or optimize the workflow, wherein the processing comprises motion correction, noise reduction, segmentation, contrast agent concentration conversion, a local AIF algorithm and/or partial volume effect correction in AIF.
Optionally, the perfusion parameters include: the local contrast agent concentration, the average contrast agent concentration, the transit time of blood cells through the capillary bed, the cerebral blood flow, the rotation amount, and/or the mean density of the parenchyma, interstitial space and capillary bed.
Optionally, the method for obtaining the perfusion parameter based on the encoded image and the deconvolution algorithm includes:
obtaining the local contrast agent concentration and the average contrast agent concentration from the encoded image by measurement;
by the formulas

c_voi(t) = ρ_voi · CBF · ∫_0^t c_art(τ) · r(t - τ) dτ

and

k(t) = ρ_voi · CBF · r(t)

obtaining the cerebral blood flow and the mean density of parenchyma, interstitial space and capillary beds, wherein ρ_voi refers to the mean density of parenchyma, interstitial space and capillary beds, CBF refers to the cerebral blood flow, t refers to the transit time of blood cells through the capillary bed, c_voi(t) refers to the average contrast agent concentration, c_art(t) refers to the local contrast agent concentration, r(t) refers to the residual function, and k(t) refers to the flow-scaled residual function determined from c_art(t) and c_voi(t) by deconvolution.
Optionally, the method of motion correction comprises:
all time frames of the reconstructed data set using 3D registration are registered onto the first time frame, and time frames exhibiting severe reconstruction artifacts are removed from the data set.
Optionally, the noise reduction method includes:
the CTP volumes were rigidly registered and the CTP scans were filtered using TIPS bilateral filter and 4D bilateral filter for noise reduction.
Optionally, the method of segmenting comprises:
automatic image segmentation is performed to identify voxels with attenuation greater than or equal to 100 HU as bone.
Optionally, the method of converting the concentration of the contrast agent comprises:
by the formulas

c(tj) = k_CT · (μ(tj) - μ_0)

and

μ_0 = (1/B) · Σ_{j=1}^{B} μ(tj)

converting the contrast agent signal into the contrast agent concentration, wherein k_CT = 1 g/mL/HU, the baseline value μ_0 is calculated as the average of μ(tj) over the acquisition times of the first B samples of the arterial input function acquired before the contrast agent arrives at the artery, and the measured contrast agent signal is thereby converted into contrast agent concentration values, i.e. from μ(tj) to the corresponding contrast agent concentration value c(tj).
Optionally, after obtaining the perfusion data corresponding to the perfusion parameter, and processing and analyzing the perfusion data, the method further includes:
storing the perfusion data.
To achieve the above object, the present application also provides a cerebral perfusion data analyzing apparatus, including: a memory; and
a processor coupled to the memory, the processor configured to:
acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode;
carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image;
acquiring perfusion parameters based on the coded images and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters, wherein the micro-tissue model covers the organ-specific parenchyma, the interstitial space and the capillary bed;
and acquiring perfusion data corresponding to the perfusion parameters based on the micro-tissue model, and processing and analyzing the perfusion data to improve image quality, correct artifacts, optimize analysis of a perfusion value map and/or optimize a workflow, wherein the processing comprises motion correction, noise reduction, segmentation, contrast agent concentration conversion, a local AIF algorithm and/or partial volume effect correction in AIF.
To achieve the above object, the present application also provides a computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a machine, implements the steps of the method as described above.
The embodiment of the application has the following advantages:
the embodiment of the application provides a brain perfusion data analysis method, which comprises the following steps: acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode; carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image; acquiring perfusion parameters based on the coded images and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters, wherein the micro-tissue model comprises covering organ specific parenchyma, interstitial space and capillary beds; and acquiring perfusion data corresponding to the perfusion parameters based on the micro-tissue model, and processing and analyzing the perfusion data to improve the image quality, correct artifacts, optimize the analysis of a perfusion value map and/or optimize the workflow, wherein the processing comprises motion correction, noise reduction, segmentation, contrast agent concentration conversion, a local AIF algorithm and/or partial volume effect correction in AIF.
By the method, different from the traditional calculation method, all analysis processes can be completed before a user accesses the images, so that the user can directly see the analysis results when opening the cerebral perfusion image sequence on the image workstation, the artificial deviation and the waiting time which are possibly introduced by manual intervention and manual designation are avoided, the workload of the user is reduced, the working efficiency of the user is improved, and the repeatability and the consistency of the analysis results are ensured.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It should be apparent that the drawings in the following description are merely exemplary, and that other embodiments can be derived from the drawings provided by those of ordinary skill in the art without inventive effort.
Fig. 1 is a flowchart of a method for analyzing cerebral perfusion data according to an embodiment of the present disclosure;
fig. 2 is a diagram of a labeling geometry and a pulse sequence of a method for analyzing brain perfusion data according to an embodiment of the present application;
fig. 3 is a vessel-encoded scan image of a cerebral perfusion data analysis method according to an embodiment of the present application;
fig. 4 is a three-vessel coded scan image of a cerebral perfusion data analysis method according to an embodiment of the present application;
fig. 5 is a schematic diagram of a tissue perfusion physiological model of a brain perfusion data analysis method according to an embodiment of the present application;
fig. 6 is a schematic view illustrating a noise reduction effect of a brain perfusion data analysis method according to an embodiment of the present application;
fig. 7 is a block diagram of a cerebral perfusion data analysis apparatus according to an embodiment of the present disclosure.
Detailed Description
The present disclosure is not intended to be limited to the particular embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In addition, the technical features mentioned in the different embodiments of the present application described below may be combined with each other as long as they do not conflict with each other.
An embodiment of the present application provides a method for analyzing brain perfusion data, and referring to fig. 1, fig. 1 is a flowchart of a method for analyzing brain perfusion data provided in an embodiment of the present application, it should be understood that the method may further include additional blocks not shown and/or may omit the illustrated blocks, and the scope of the present application is not limited in this respect.
This embodiment proposes a new robust deconvolution algorithm that improves the quantification of perfusion parameter estimates through tensor total variation (TTV) regularization, and refines the model with a noise-reduction algorithm to further reduce noise and improve image quality. Based on the deconvolution method, the spatial correlation of the vascular structure and the temporal continuity of the signal flow are exploited simultaneously, and a perfusion measurement model is constructed for the analysis of CT and MRI cerebral perfusion data. The method mainly comprises the following steps:
at step 101, an initial image is acquired that is a combination of two types of images obtained by brain perfusion imaging from both CT and MRI scan modes.
At step 102, the initial image is subjected to blood vessel coding and blood vessel marking, and then the initial image is subjected to imaging scanning through subtraction calculation to obtain a coded image.
Specifically, vessel coding:
This embodiment first encodes the blood vessels. The scanning mode used for the initial image consists of two image types (brain perfusion imaging in both the CT and MRI scanning modes); both image types contain the same static tissue signal, but the sign of the inflowing arterial magnetization differs. This encoding process is described by equation 1, where x holds the contributions of the inflowing blood and static tissue components to the signal, A is the encoding matrix, and y is the resulting signal intensity:

y = A · x,  with x = [V, S]^T, A = [[-1, 1], [1, 1]], y = [y1, y2]^T    (equation 1)

where V is the signal of the inflowing blood and S is the signal of the static tissue. The rows of A are the encoding steps required to generate y1 and y2. When A has a pseudo-inverse A+, x can be recovered by inversion as

x = A+ · y    (equation 2)

i.e., V is proportional to y2 - y1.
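As an illustration of this encode/decode relation, the following sketch simulates the two acquired signal types and recovers the inflowing-blood and static-tissue components with the pseudo-inverse of equation 2; the specific matrix values and signal amplitudes are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def decode_two_step(y, A):
    """Recover x = [V, S] from measured signals y via the pseudo-inverse of A (equation 2)."""
    A_pinv = np.linalg.pinv(A)          # A+ in the text
    return A_pinv @ y                    # x = A+ y

# Encoding matrix of equation 1: row 1 = tag (inflowing blood inverted),
# row 2 = control (no inversion); static tissue contributes with +1 in both rows.
A = np.array([[-1.0, 1.0],
              [ 1.0, 1.0]])

V_true, S_true = 0.02, 1.00              # inflowing blood and static tissue signals (arbitrary units)
y = A @ np.array([V_true, S_true])       # simulated signals y1 (tag) and y2 (control)

V_est, S_est = decode_two_step(y, A)
print("y1, y2 =", y)                                         # y1 = S - V, y2 = S + V
print("V estimate:", V_est, "proportional to (y2 - y1)/2 =", (y[1] - y[0]) / 2)
print("S estimate:", S_est)
```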
In order to encode the contributions of more than one vessel to the CT and MR signals, respectively, the present invention designs more than two encoding steps, as shown in equation 3:

[y1, y2, ..., yN]^T = A · [R, L, B, S]^T    (equation 3)

where R, L and B are the contributions of the labeled blood signals from the right, left and basilar arteries, respectively, and A is the corresponding N × 4 encoding matrix. In general, the vessel geometry and labeling method may not achieve optimal encoding; the present invention therefore calculates the expected SNR efficiency from the decoding matrix A+ and directly inverts the resulting unit signal in the decoding process.
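A minimal sketch of this multi-vessel decoding, assuming a simple ±1 four-step encoding over the right (R), left (L) and basilar (B) contributions plus static tissue; the SNR-efficiency score used below (inverse of the decoding-row norm scaled by the number of encoding steps) is one common way to grade an encoding and is an assumption, not necessarily the exact metric of the patent.

```python
import numpy as np

# Hypothetical 4-step encoding over [R, L, B, S]: rows are encoding cycles
# (all vessels tagged, none tagged, only R untagged, only L untagged); S always +1.
A = np.array([[-1, -1, -1, 1],
              [ 1,  1,  1, 1],
              [-1,  1,  1, 1],
              [ 1, -1,  1, 1]], dtype=float)

A_pinv = np.linalg.pinv(A)               # decoding matrix A+

# Assumed SNR-efficiency score per decoded component, normalized so that the
# conventional two-step tag/control subtraction would score 1.0.
N = A.shape[0]
efficiency = 1.0 / (np.linalg.norm(A_pinv, axis=1) * np.sqrt(N))

for name, eff in zip(["R", "L", "B", "static"], efficiency):
    print(f"{name}: relative SNR efficiency ~ {eff:.3f}")
```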
Marking blood vessels:
since there is a large gradient when applying radio frequency, the radio frequency irradiation is further away from resonance in the target tissue and the magnetization transfer effect is greatly reduced, the present application has the advantage that additional transverse gradient pulses are applied to modulate the relative position of spins in different vessels in the labeling plane to allow differential encoding of the vessels in the inversion plane, with the specific steps as follows:
1) Using a single marker gradient waveform in the flow direction;
2) Applying an additional gradient perpendicular to the marker gradient to create a phase shift between the target vessels;
3) The target vessel is placed under tagging and control conditions according to an encoding schedule using radio frequency phase modulation across pulses.
The labeling geometry and pulse sequence are shown in fig. 2, with four encoding periods:
1) All blood vessels are inverted;
2) No blood vessel is inverted;
3) Only vessel A is inverted;
4) Only vessel B is inverted.
Imaging scanning:
first, this embodiment is based on imaging scans on an eight-channel head radio frequency coil array and a body coil for radio frequency transmission, with FOV set at 24 cm × 8 mm, slice gap at 2 mm, readings using a single two-dimensional spiral, total length of the marker pulse sequence of 1574 ms, consisting of 1640 RF pulses spaced 960 μ s apart.
Second, the present embodiment suppresses the background by applying two non-selective adiabatic inversion pulses at both time points 950 ms and 300 ms before image acquisition. Post-labelling delays of 1000 ms and a TR of 3000 ms are possible.
Finally, a signal intensity threshold is set to half the corresponding intensity at the 99 th percentile in the image, voxels above this threshold are used as a coarse gray mask, and the high-quality, sharp encoded images scanned by subtracting the conventional image according to the above-mentioned parameters are shown in fig. 3 and 4.
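The coarse mask described above can be expressed compactly; the sketch below is illustrative only (array names and the synthetic volume are assumptions) and simply thresholds at half of the 99th-percentile intensity.

```python
import numpy as np

def coarse_mask(image):
    """Return a boolean mask of voxels above half of the 99th-percentile intensity."""
    threshold = 0.5 * np.percentile(image, 99)
    return image > threshold

# Example on a synthetic volume standing in for an encoded image
rng = np.random.default_rng(0)
volume = rng.gamma(shape=2.0, scale=50.0, size=(16, 64, 64))
mask = coarse_mask(volume)
print("masked voxels:", int(mask.sum()), "of", mask.size)
```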
At step 103, perfusion parameters are obtained based on the encoded image and a deconvolution algorithm, and a micro-tissue model is constructed based on the perfusion parameters, the micro-tissue model covering the organ-specific parenchyma, the interstitial space and the capillary bed.
In some embodiments, the perfusion parameters include: the local contrast agent concentration, the average contrast agent concentration, the transit time of blood cells through the capillary bed, the cerebral blood flow, the rotation amount, and/or the mean density of the parenchyma, interstitial space and capillary bed.
In some embodiments, the method of obtaining the perfusion parameters based on the encoded image and a deconvolution algorithm comprises:
acquiring the local contrast agent concentration and the average contrast agent concentration from the encoded image by measurement;
by the formulas

c_voi(t) = ρ_voi · CBF · ∫_0^t c_art(τ) · r(t - τ) dτ

and

k(t) = ρ_voi · CBF · r(t)

obtaining the cerebral blood flow and the mean density of parenchyma, interstitial space and capillary beds, wherein ρ_voi refers to the mean density of parenchyma, interstitial space and capillary beds, CBF refers to the cerebral blood flow, t refers to the transit time of blood cells through the capillary bed, c_voi(t) refers to the average contrast agent concentration, c_art(t) refers to the local contrast agent concentration, r(t) refers to the residual function, and k(t) refers to the flow-scaled residual function determined from c_art(t) and c_voi(t) by deconvolution.
Specifically, a deconvolution algorithm is used to calculate perfusion parameters, construct a micro-tissue model:
the present application constructs a physiological model of tissue blood supply for the purpose of calculating tissue perfusion (micro-tissue model) consisting of covering organ-specific parenchyma, interstitial space and capillary beds as shown in fig. 5. Volumetric use of parenchymal and interstitial spaces
Figure SMS_16
Shows, however, the volume V of the capillary bed cap And (4) showing. Total volume->
Figure SMS_17
Blood should be supplied through a single arterial inlet and correspondingly drained through a single venous outlet. Blood cells may traverse the capillary bed through various pathways, the transit time t of which through the capillary bed depends primarily on the chosen pathway. This embodiment makes it possible to achieve a probability density distribution h (t) of transit time that is smooth and that will enter the total volume V through the arterial inlet after injection of contrast agent voi Then diluted into the capillary bed. Local contrast agent concentration c art (t) and c ven (t) is measured directly in the vicinity of the capillary beds on the arterial and venous sides, respectively. Furthermore, the average contrast agent concentration c in the volume may also be measured voi (t) of (d). Variable c based on equation 4 art (t) and c voi (t) can be measured, while CBF, r (t) and->
Figure SMS_18
The value of (a) is unknown. For the calculation of CBF and other diagnostically relevant tissue perfusion parameters, a flow scaling residual function k (t) is first introduced, formula (5), in 1/s, which can be directly derived from the measurement data c art (t) and c voi (t) determining.
Figure SMS_19
Equation 4
Figure SMS_20
Equation 5
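To make the relation between c_art(t), c_voi(t), CBF and the flow-scaled residual function k(t) concrete, the sketch below simulates equation 4 on a discrete time grid; the gamma-variate arterial input, the exponential residual function and all parameter values are illustrative assumptions.

```python
import numpy as np

dt = 1.0                                   # sampling interval [s]
t = np.arange(0, 60, dt)

# Assumed arterial input function (gamma-variate) and residual function r(t) = exp(-t/MTT)
c_art = (t / 6.0) ** 3 * np.exp(-t / 2.0)
MTT = 4.0                                  # mean transit time [s]
r = np.exp(-t / MTT)

rho_voi = 1.04                             # mean tissue density [g/mL]
CBF = 60.0 / (100.0 * 60.0)                # 60 mL/100g/min expressed in mL/(g*s)

k = rho_voi * CBF * r                      # flow-scaled residual function, equation 5 [1/s]
c_voi = np.convolve(k, c_art)[: len(t)] * dt   # tissue concentration, equation 4 (discrete convolution)

# With k(t) known, rho_voi*CBF follows from its peak because r(0) = 1
print("recovered rho_voi*CBF:", k.max(), " expected:", rho_voi * CBF)
print("peak tissue concentration:", c_voi.max())
```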
At step 104, based on the micro-tissue model, obtaining perfusion data corresponding to the perfusion parameters, processing and analyzing the perfusion data to improve image quality, correct artifacts, optimize analysis of a perfusion value map, and/or optimize workflow, wherein the processing includes motion correction, noise reduction, segmentation, converting contrast agent concentration, local AIF algorithm, and/or correcting partial volume effect in AIF.
In some embodiments, the method of motion correction comprises:
all time frames of the reconstructed data set using 3D registration are registered onto the first time frame, and time frames exhibiting severe reconstruction artifacts are removed from the data set.
In some embodiments, the noise reduction method comprises:
the CTP volumes were rigidly registered and the CTP scans were filtered using TIPS bilateral filter and 4D bilateral filter for noise reduction.
In some embodiments, the method of segmenting comprises:
automatic image segmentation is performed and any voxel that attenuates to 100 HU or above will be identified as bone.
In some embodiments, the method of converting the concentration of the contrast agent comprises:
by the formulas

c(tj) = k_CT · (μ(tj) - μ_0)

and

μ_0 = (1/B) · Σ_{j=1}^{B} μ(tj)

converting the contrast agent signal into the contrast agent concentration, wherein k_CT = 1 g/mL/HU, the baseline value μ_0 is calculated as the average of μ(tj) over the acquisition times of the first B samples of the arterial input function acquired before the contrast agent arrives at the artery, and the measured contrast agent signal is thereby converted into contrast agent concentration values, i.e. from μ(tj) to the corresponding contrast agent concentration value c(tj).
Specifically, perfusion data processing and analysis:
further, the processing steps are mainly intended to improve the image quality (e.g. noise reduction) and to correct artifacts (e.g. motion correction, partial volume correction) and specific properties of the blood (e.g. correct for hematocrit differences). Still other pre-processing steps may optimize the analysis of the perfusion value map (e.g. segmentation of certain anatomical structures) and the application workflow (e.g. automatic AIF estimation). The data processing steps of the application are mainly as follows:
1. motion correction
Patient motion (e.g., due to head movement or breathing) may cause sudden changes of the attenuation values at a fixed (stationary) position. Because such changes are caused by motion rather than by the contrast agent flow, the computed perfusion values may be severely biased. In this design, all time frames of the reconstructed data set are registered onto the first time frame using 3D registration, so that motion perpendicular to the direction of the reconstructed slices can also be corrected; rigid registration is sufficient for brain perfusion scanning and preserves the accuracy of the reconstructed fused image with respect to streak artifacts. Time frames exhibiting severe reconstruction artifacts can simply be deleted from the data set, which removes invalid sampling points from the time-concentration curve. The specific operation is as follows: first, a damping factor λ is added to introduce a gradient term that buffers the quadratic convergence (equation 6):
(H + λ · diag(H)) · δP = -g    (equation 6)

where H is the approximate Hessian and g the gradient of the registration cost with respect to the motion parameters P, and δP is the parameter update. λ must balance the convergence speed against the risk of overshoot: the larger λ, the smaller the convergence step and the smaller the possibility of overshoot. In the method of this embodiment, assuming λ_i is the damping factor selected for iteration i, the damping factor for iteration i+1 is chosen from λ_i and λ_i / c0, where c0 ≤ 10 is a predefined non-negative integer constant. Assuming that δP(λ) is the solution of equation 6 using λ in iteration i+1, the δP value for iteration i+1 is the candidate that yields the smallest registration cost (equation 7):

δP_{i+1} = δP(λ*), with λ* = arg min over λ in {λ_i, λ_i / c0} of E(P_i + δP(λ))    (equation 7)

where E(·) is the registration cost.
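The damped update of equations 6 and 7 can be illustrated on a toy least-squares registration problem; the sketch below assumes a Levenberg-Marquardt-style damping, and the toy cost function, model and c0 handling are assumptions made purely for illustration.

```python
import numpy as np

def damped_step(J, residual, lam):
    """Solve (J^T J + lam * diag(J^T J)) dP = J^T residual (equation 6 form) for the update dP."""
    JTJ = J.T @ J
    lhs = JTJ + lam * np.diag(np.diag(JTJ))
    return np.linalg.solve(lhs, J.T @ residual)

# Toy 1D "registration": find the shift p so that model(p) = a * (x - p) matches the data.
x = np.linspace(0.0, 1.0, 50)
a = 2.0
p_true = 0.3
data = a * (x - p_true)

def model(p):
    return a * (x - p)

p, lam, c0 = 0.0, 1.0, 10.0
for i in range(20):
    res = data - model(p)
    J = np.full((x.size, 1), -a)            # d(model)/dp
    # Try the current damping factor and a smaller one (lam / c0), keep the better step (equation 7).
    candidates = []
    for lam_try in (lam, lam / c0):
        dP = damped_step(J, res, lam_try)[0]
        err = np.sum((data - model(p + dP)) ** 2)
        candidates.append((err, dP, lam_try))
    err, dP, lam = min(candidates)           # candidate with the smallest registration cost
    p += dP

print("estimated shift:", round(p, 4), " true shift:", p_true)
```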
2. Noise reduction
Noise reduction can be implemented as spatial smoothing of the data. This embodiment replaces the intensity-difference similarity function of the bilateral filter with a time-intensity profile similarity (TIPS) function and constructs a 3D weighting kernel, where the weights are based on 3D proximity and on the fourth (time) dimension of the brain computed tomography perfusion (CTP) scan. The TIPS function p(ξ, x) is defined as equation 8:

p(ξ, x) = exp( -Ω(ξ, x) / (2 σ_ζ²) )    (equation 8)

where Ω(ξ, x) is the sum of squared differences (SSD) between the time-intensity profile at voxel x and that at the neighboring voxel ξ, and the standard deviation σ_ζ determines how strongly the SSD measure weights the neighbors. Ω(ξ, x) is defined as equation 9:

Ω(ξ, x) = (1/T) Σ_{t=1}^{T} [ I(x(x, y, z, t)) - I(ξ(x, y, z, t)) ]²    (equation 9)

where T is the size of the time dimension, I(x(x, y, z, t)) is the intensity of voxel x(x, y, z) at time point t, and I(ξ(x, y, z, t)) is the intensity of the neighboring voxel ξ(x, y, z) at time point t. Prior to filtering, the CTP volumes are rigidly registered; the CTP scans are then filtered with the TIPS bilateral filter and with a 4D bilateral filter. For the TIPS bilateral filter, a 3x3x3 kernel is used with σ_d = 1.0. The kernel size of the 4D bilateral filter is set to 3x3x3x5, with σ = 3.0 in the time direction and σ_d = 1.0 in the spatial direction. Both σ_ζ of the TIPS filter and σ_r of the 4D bilateral filter depend on the noise level in the scan; σ_ζ is set as described for equation 9, and σ_r of the 4D bilateral filter is set to the average intensity difference. The noise-reduction effect is compared in fig. 6.
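A compact sketch of the TIPS weighting of equations 8 and 9, written as a naive voxel loop for clarity; the kernel radius and σ values follow the text, while the synthetic data, σ_ζ value and array layout are illustrative assumptions.

```python
import numpy as np

def tips_filter(ctp, sigma_d=1.0, sigma_zeta=50.0, radius=1):
    """TIPS bilateral filtering of a 4D CTP array (t, z, y, x) with a (2*radius+1)^3 spatial kernel."""
    T, Z, Y, X = ctp.shape
    out = np.zeros_like(ctp)
    offsets = [(dz, dy, dx)
               for dz in range(-radius, radius + 1)
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)]
    for z in range(radius, Z - radius):
        for y in range(radius, Y - radius):
            for x in range(radius, X - radius):
                num = np.zeros(T)
                den = 0.0
                for dz, dy, dx in offsets:
                    nb = ctp[:, z + dz, y + dy, x + dx]
                    ssd = np.mean((ctp[:, z, y, x] - nb) ** 2)            # equation 9
                    w = (np.exp(-(dz * dz + dy * dy + dx * dx) / (2 * sigma_d ** 2))
                         * np.exp(-ssd / (2 * sigma_zeta ** 2)))          # closeness x TIPS, equation 8
                    num += w * nb
                    den += w
                out[:, z, y, x] = num / den
    return out

# Tiny synthetic example
rng = np.random.default_rng(1)
ctp = rng.normal(40.0, 5.0, size=(10, 4, 8, 8))
filtered = tips_filter(ctp)
print("noise std before/after:", ctp.std().round(2), filtered[:, 1:-1, 1:-1, 1:-1].std().round(2))
```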
3. Segmentation
The present application developed an in-house program running on Microsoft Windows to calculate CBF from perfusion CT/MRI and PET. The perfusion CBF is calculated using the central volume principle, which relates CBF, cerebral blood volume (CBV) and mean transit time (MTT) as follows:

CBF = CBV / MTT    (equation 10)

When the contrast agent reaches its destination, the relationship between the contrast agent concentration measured by the scanner in brain tissue (the tissue residue curve, C[t]) and in the artery (the arterial input function, Ca[t]) is given by equation 11:

C(t) = CBF · (Ca ⊗ R)(t)    (equation 11)

where ⊗ denotes the convolution operation and R(t) is the impulse residual function. R(t) represents the fraction of injected tracer remaining in the vasculature at time t and is calculated by deconvolution. After R(t) is determined, the MTT is calculated according to equation 12:

MTT = ( ∫ R(t) dt ) / R_max    (equation 12)

where R_max is the maximum value of R(t). The absolute CBV in the capillary network can be calculated by equation 13:

CBV = (k / ρ) · ( ∫ C(t) dt ) / ( ∫ Ca(t) dt )    (equation 13)

where k = (1 - HCT_LV) / (1 - HCT_SV) corrects for the hematocrit difference between small vessels (HCT_SV = 0.25) and large vessels (HCT_LV = 0.45), and ρ is the brain tissue density (1.04 g/mL). All calculations are performed on a voxel-by-voxel basis. The arterial input function (AIF) is determined by measuring the time-density curve in the supraclinoid segment of the right or left internal carotid artery. Singular value decomposition is used in the deconvolution to determine R(t). MTT is calculated by equation 12, CBV is calculated by equation 13 using the time-density curves C(t) and Ca(t), and CBF is calculated by equation 10.
To limit the data comparison to brain parenchyma only, automatic image segmentation is performed, and any voxel with attenuation of 100 HU or above is identified as bone. Bone and the regions outside the bone are excluded, and the mean CBF as well as the CBF of gray matter and white matter are calculated. To distinguish gray from white matter, thresholding is performed: gray matter is defined as voxel values ≥ 32 HU and white matter as voxel values < 32 HU. The CT-CBF and PET-CBF mean values of the whole slice, the gray matter and the white matter are analyzed, and the matrix size of both CT-CBF and PET-CBF is reduced to 32 × 32 to reduce the noise of the spatial distribution.
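The central-volume computation of equations 10 to 13 combined with a truncated-SVD deconvolution can be sketched as follows; the synthetic curves, the truncation threshold and the illustrative use of the hematocrit and density constants are assumptions, not the exact clinical pipeline.

```python
import numpy as np

dt = 1.0
t = np.arange(0, 60, dt)
ca = (t / 6.0) ** 3 * np.exp(-t / 2.0)             # arterial input function Ca(t)

# Simulate a tissue curve C(t) = CBF * (Ca ⊗ R)(t) with an exponential residue function
MTT_true = 5.0
R_true = np.exp(-t / MTT_true)
CBF_true = 0.008                                   # arbitrary scaling for the demo [1/s]
C = CBF_true * np.convolve(ca, R_true)[: len(t)] * dt

# Deconvolution by truncated SVD of the lower-triangular convolution matrix (equation 11)
A = np.array([[ca[i - j] if i >= j else 0.0 for j in range(len(t))] for i in range(len(t))]) * dt
U, S, Vt = np.linalg.svd(A)
S_inv = np.zeros_like(S)
keep = S > 0.01 * S.max()                          # truncate small singular values (illustrative threshold)
S_inv[keep] = 1.0 / S[keep]
k = Vt.T @ (S_inv * (U.T @ C))                     # k(t) = CBF * R(t)

CBF = k.max()                                      # R_max corresponds to the peak of k
R = k / CBF
MTT = np.sum(R) * dt / R.max()                     # equation 12
k_h, rho = (1 - 0.45) / (1 - 0.25), 1.04           # hematocrit correction and tissue density
CBV = (k_h / rho) * np.sum(C) / np.sum(ca)         # equation 13 (dt cancels)
print(f"CBF ~ {CBF:.4f} (true {CBF_true}), MTT ~ {MTT:.1f} s, CBV ~ {CBV:.4f}, CBV/MTT ~ {CBV / MTT:.4f}")
```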
4. Conversion of contrast agent concentration
Neither CT nor MR imaging allows direct measurement of the time-concentration curves c_art(t) and c_voi(t); what is measured is a superposition of the signal from the tissue itself and the signal induced by the contrast agent. Because the deconvolution algorithm relates c_art(tj) and c_voi(tj) only to the contrast-agent-induced signal, the tissue signal has to be subtracted. Furthermore, the contrast agent concentration is assumed to be proportional to the measured x-ray attenuation value. Since deconvolution is a linear operation, the proportionality constant does not affect the calculated flow-scaled residual function, and the perfusion parameters are independent of it; it is therefore usually set to k_CT = 1 g/mL/HU. The baseline value μ_0 is calculated as the average of μ(tj) over the acquisition times of the first B samples of the arterial input function acquired before the contrast agent arrives in the artery. The measured signal μ(tj) is then converted into the corresponding contrast agent concentration value c(tj) according to equations 14 and 15:

c(tj) = k_CT · (μ(tj) - μ_0)    (equation 14)

μ_0 = (1/B) · Σ_{j=1}^{B} μ(tj)    (equation 15)
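Equations 14 and 15 amount to a baseline subtraction; a minimal sketch follows, where the array layout, the example values and the choice B = 5 are assumptions.

```python
import numpy as np

def hu_to_concentration(mu, B=5, k_ct=1.0):
    """Convert a measured time-attenuation curve mu(t_j) [HU] into contrast agent concentration c(t_j).

    mu: 1D array of attenuation values over time
    B:  number of pre-contrast samples used for the baseline (equation 15)
    k_ct: proportionality constant [g/mL/HU], set to 1 as in the text
    """
    mu = np.asarray(mu, dtype=float)
    mu0 = mu[:B].mean()                  # equation 15: baseline from the first B samples
    return k_ct * (mu - mu0)             # equation 14

mu = np.array([34.9, 35.1, 35.0, 34.8, 35.2, 40.0, 55.0, 70.0, 62.0, 48.0])
print(hu_to_concentration(mu))
```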
5. Local AIF algorithm
The perfusion map is created by assigning a score to each voxel in the brain based on the attributes that determine its quality as an AIF. For each voxel in the brain, the AIF of that voxel is calculated by searching the surrounding (27 mm) volume for the three best-scoring voxels; the result is then interpolated and smoothed using a three-dimensional Gaussian kernel. The resulting local AIFs are then deconvolved on a voxel-by-voxel basis using an SVD method similar to that used for the global measurements.
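The local-AIF selection can be sketched as a neighborhood search over a per-voxel quality score followed by Gaussian smoothing; the scoring function below (high, early peak scores well), the voxel-size conversion of the 27 mm search volume and the synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def aif_scores(ctp):
    """Score each voxel's curve as an AIF candidate: a high, early peak scores well (assumed criterion)."""
    peak = ctp.max(axis=0)
    t_peak = ctp.argmax(axis=0)
    return peak - 0.5 * t_peak

def local_aif(ctp, voxel_size_mm=3.0, search_mm=27.0):
    T, Z, Y, X = ctp.shape
    scores = aif_scores(ctp)
    r = int(round(search_mm / voxel_size_mm)) // 2           # half-width of the search volume in voxels
    local = np.zeros_like(ctp)
    for z in range(Z):
        for y in range(Y):
            for x in range(X):
                zs, ys, xs = (slice(max(z - r, 0), z + r + 1),
                              slice(max(y - r, 0), y + r + 1),
                              slice(max(x - r, 0), x + r + 1))
                sub = scores[zs, ys, xs]
                best = np.argsort(sub, axis=None)[-3:]        # three best-scoring voxels in the neighborhood
                curves = ctp[:, zs, ys, xs].reshape(T, -1)[:, best]
                local[:, z, y, x] = curves.mean(axis=1)       # average of the three best AIF candidates
    # Smooth the local AIF field spatially with a 3D Gaussian kernel
    return gaussian_filter(local, sigma=(0, 1.0, 1.0, 1.0))

rng = np.random.default_rng(2)
ctp = rng.normal(40.0, 3.0, size=(12, 4, 6, 6))
print(local_aif(ctp).shape)
```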
6. Correcting partial volume effects in AIF
To generate additional AIFs with an increasing degree of PVE, this embodiment uses sub-regions (ROIs) of 3, 5, 7, 9 and 11 voxels in width centered on the voxel used for the reference AIF. The magnitude signal is used to calculate the estimated tissue curve s_t,est required for the TSF; it is measured in the difference area between the two largest ROIs, which is always located outside the vessel. The VOF is measured from the voxel with the largest signal during the bolus passage in the sagittal sinus, and all signals are normalized by the number of voxels in the ROI. Finally, tissue ROIs are selected in normal-appearing gray matter to illustrate the CBF quantification, which is obtained by model-free deconvolution of the tracer kinetic equation (equation 11); the concentration curves c_t(t) and c_a(t) are determined from the measured signal according to the signal equation of the center phase-encoded saturation-recovery sequence.
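Partial-volume correction of an AIF is often implemented by rescaling it so that its time integral matches that of a venous output function measured in a large vein; the sketch below shows that generic rescaling as an assumed stand-in, not the patent's exact procedure, and all curve shapes and the 0.6 attenuation factor are illustrative.

```python
import numpy as np

def pve_correct_aif(aif, vof):
    """Rescale an AIF so that its time integral matches that of the VOF (generic PVE correction)."""
    scale = vof.sum() / aif.sum()
    return aif * scale, scale

t = np.arange(0, 60, 1.0)
true_aif = (t / 6.0) ** 3 * np.exp(-t / 2.0)        # undistorted arterial curve
measured_aif = 0.6 * true_aif                        # AIF attenuated by partial volume effects
kernel = np.exp(-t / 8.0)
kernel /= kernel.sum()                               # unit-area dispersion kernel
vof = np.convolve(true_aif, kernel)[: t.size]        # venous curve: delayed and dispersed, area-preserving

corrected_aif, scale = pve_correct_aif(measured_aif, vof)
print("estimated PVE scale factor:", round(float(scale), 2), "(expected ~1/0.6 = 1.67)")
```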
In some embodiments, after obtaining the perfusion data corresponding to the perfusion parameter, and processing and analyzing the perfusion data, the method further includes:
storing the perfusion data.
By the method, different from the traditional calculation method, all analysis processes can be completed before a user accesses images, so that the user can directly see the analysis results when opening the cerebral perfusion image sequence on the image workstation, the artificial deviation and the waiting time which are possibly introduced by manual intervention and manual designation are avoided, the workload of the user is reduced, the working efficiency of the user is improved, and the repeatability and the consistency of the analysis results are ensured.
The application solves the problems of the prior art:
1. the problems of low manual extraction efficiency and unstable automatic extraction performance are solved;
2. the number of clusters does not need to be specified, the number of uniformly distributed curves is not needed, and the robustness is improved;
3. simple formula calculation aiming at curve characteristics is solved;
4. the problem that the signal change of a patient is regarded as the prominent change characteristic of a contrast agent due to the movement of the patient in the scanning process, so that data are selected by mistake is solved;
5. the method solves the problem that the injection rate of the contrast agent in the conventional algorithm is instantly unrealistic, considers the actual situation, optimizes the algorithm, and truly reflects the change of the quantity of the contrast agent in the tissues along with the lapse of time.
The method applies a transverse gradient field to carry out phase coding on the basis of CASL, realizes different marks for flow rate and space selection, and can simultaneously display the far-end branches or perfusion of different blood supply vessels;
the method is based on spiral three-dimensional volume scanning; the size, position and angle of the pseudo-continuous elliptical labeling pulse can be adjusted freely according to the target vessel, which solves the problems of labeling failure and contamination by non-target vessels caused by individual anatomical variation;
in order to ensure that the image is restored to the original size for subsequent calculation, the invention adopts a deconvolution algorithm to construct a perfusion measurement model for CT and MRI brain perfusion data analysis;
after the scanning is finished, the images are subjected to post-processing and superposition to obtain perfusion maps with different colors, so that perfusion areas for supplying cerebral blood vessels are visually displayed, the perfusion areas for supplying cerebral blood vessels can be independently observed, perfusion areas of internal carotid artery and basilar artery can be displayed, intracerebral blood supply areas of unilateral vertebral artery and blood supply areas of branches of Willis ring can be independently displayed, and intracerebral small blood vessels can be subjected to super-selective perfusion area display;
according to the method, the corresponding quantitative index of the blood supply territory, cerebral blood flow, is obtained after the scan is finished; a plurality of physiological parameters can be measured in one examination, and all common perfusion parameters can be calculated, including but not limited to cerebral blood flow (CBF), cerebral blood volume (CBV), mean transit time (MTT) and time to peak (TTP); the scanning time is generally less than 3 min, and a variety of labeling delay times can be selected;
the method is different from the traditional calculation method, all analysis processes can be completed before a user accesses images, so that the user can directly see the analysis result when opening the brain perfusion image sequence on the image workstation, the artificial deviation and the waiting time which are possibly introduced by manual intervention and manual designation are avoided, the workload of the user is reduced, the working efficiency of the user is improved, and the repeatability and the consistency of the analysis result are ensured.
Fig. 7 is a block diagram of a cerebral perfusion data analyzing apparatus according to an embodiment of the present disclosure. The device includes:
a memory 201; and a processor 202 coupled to the memory 201, the processor 202 configured to: acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode;
carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image;
acquiring perfusion parameters based on the coded images and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters, wherein the micro-tissue model covers the organ-specific parenchyma, the interstitial space and the capillary bed;
and acquiring perfusion data corresponding to the perfusion parameters based on the micro-tissue model, and processing and analyzing the perfusion data to improve the image quality, correct artifacts, optimize the analysis of a perfusion value map and/or optimize the workflow, wherein the processing comprises motion correction, noise reduction, segmentation, contrast agent concentration conversion, a local AIF algorithm and/or partial volume effect correction in AIF.
In some embodiments, the processor 202 is further configured to: the perfusion parameters include: the local contrast agent concentration, the average contrast agent concentration, the transit time of blood cells through the capillary bed, the cerebral blood flow, the rotation amount, and/or the mean density of the parenchyma, interstitial space and capillary bed.
In some embodiments, the processor 202 is further configured to: the method for obtaining the perfusion parameters based on the encoded image and the deconvolution algorithm comprises the following steps:
acquiring the local contrast agent concentration and the average contrast agent concentration from the encoded image by measurement;
by the formulas

c_voi(t) = ρ_voi · CBF · ∫_0^t c_art(τ) · r(t - τ) dτ

and

k(t) = ρ_voi · CBF · r(t)

obtaining the cerebral blood flow and the mean density of parenchyma, interstitial space and capillary beds, wherein ρ_voi refers to the mean density of parenchyma, interstitial space and capillary beds, CBF refers to the cerebral blood flow, t refers to the transit time of blood cells through the capillary bed, c_voi(t) refers to the average contrast agent concentration, c_art(t) refers to the local contrast agent concentration, r(t) refers to the residual function, and k(t) refers to the flow-scaled residual function determined from c_art(t) and c_voi(t) by deconvolution.
In some embodiments, the processor 202 is further configured to: the method for correcting the movement comprises the following steps:
all time frames of the reconstructed data set using 3D registration are registered onto the first time frame, and time frames exhibiting severe reconstruction artifacts are removed from the data set.
In some embodiments, the processor 202 is further configured to: the noise reduction method comprises the following steps:
the CTP volumes were rigidly registered and the CTP scans were filtered using TIPS bilateral filter and 4D bilateral filter for noise reduction.
In some embodiments, the processor 202 is further configured to: the segmentation method comprises the following steps:
automatic image segmentation is performed and any voxel that will attenuate to 100 HU or above will be identified as bone.
In some embodiments, the processor 202 is further configured to: the method for converting the concentration of the contrast agent comprises the following steps:
by the formulas

c(tj) = k_CT · (μ(tj) - μ_0)

and

μ_0 = (1/B) · Σ_{j=1}^{B} μ(tj)

converting the contrast agent signal into the contrast agent concentration, wherein k_CT = 1 g/mL/HU, the baseline value μ_0 is calculated as the average of μ(tj) over the acquisition times of the first B samples of the arterial input function acquired before the contrast agent arrives at the artery, and the measured contrast agent signal is thereby converted into contrast agent concentration values, i.e. from μ(tj) to the corresponding contrast agent concentration value c(tj).
In some embodiments, the processor 202 is further configured to: after acquiring the perfusion data corresponding to the perfusion parameters, and processing and analyzing the perfusion data, the method further comprises the following steps:
storing the perfusion data.
For the specific implementation method, reference is made to the foregoing method embodiments, which are not described herein again.
The present application may be methods, apparatus, systems and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for carrying out various aspects of the present application.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present application may be assembler instructions, instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, the electronic circuitry can execute computer-readable program instructions to implement aspects of the present application by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA).
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is noted that, unless expressly stated otherwise, all features disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features. Where the terms "further", "preferably", "still further" or "more preferably" are used, the description that follows builds on the preceding embodiment, and the content after such a term is combined with the preceding embodiment to form a complete further embodiment. Several further, preferred, still further or more preferred features following the same embodiment may be combined in any combination to form additional embodiments.
Although the present application has been described in detail with respect to the general description and the specific examples, it will be apparent to those skilled in the art that certain changes and modifications may be made based on the present application. Accordingly, such modifications and improvements are intended to be within the scope of this invention as claimed.

Claims (10)

1. A method of brain perfusion data analysis, comprising the steps of:
acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode;
carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image;
acquiring perfusion parameters based on the coded image and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters, wherein the micro-tissue model covers the organ-specific parenchyma, the interstitial space and the capillary bed;
and acquiring perfusion data corresponding to the perfusion parameters based on the micro-tissue model, and processing and analyzing the perfusion data to improve the image quality, correct artifacts, optimize the analysis of a perfusion value map and/or optimize the workflow, wherein the processing comprises motion correction, noise reduction, segmentation, contrast agent concentration conversion, a local AIF algorithm and/or partial volume effect correction in AIF.
2. The method for brain perfusion data analysis according to claim 1,
the perfusion parameters include: local contrast agent concentration, average contrast agent concentration, transit time of blood cells through the capillary bed, cerebral blood flow, rotation amount and/or mean density of parenchymal, interstitial space and capillary bed of the capillary bed.
3. The brain perfusion data analyzing method of claim 2, wherein the method of obtaining the perfusion parameters based on the encoded image and a deconvolution algorithm comprises:
acquiring the local contrast agent concentration and the average contrast agent concentration from the encoded image by measurement;
by the formulas

c_voi(t) = ρ_voi · CBF · ∫_0^t c_art(τ) · r(t - τ) dτ

and

k(t) = ρ_voi · CBF · r(t)

obtaining the cerebral blood flow and the mean density of parenchyma, interstitial space and capillary beds, wherein ρ_voi refers to the mean density of parenchyma, interstitial space and capillary beds, CBF refers to the cerebral blood flow, t refers to the transit time of blood cells through the capillary bed, c_voi(t) refers to the average contrast agent concentration, c_art(t) refers to the local contrast agent concentration, r(t) refers to the residual function, and k(t) refers to the flow-scaled residual function determined from c_art(t) and c_voi(t) by deconvolution.
4. The brain perfusion data analysis method of claim 1, wherein the method of motion correction comprises:
all time frames of the reconstructed data set using 3D registration are registered onto the first time frame, and time frames exhibiting severe reconstruction artifacts are removed from the data set.
5. The brain perfusion data analyzing method of claim 1, wherein the noise reduction method comprises:
the CTP volumes were rigidly registered and the CTP scans were filtered using TIPS bilateral filter and 4D bilateral filter for noise reduction.
6. The method for brain perfusion data analysis according to claim 1, wherein the method of segmenting comprises:
automatic image segmentation is performed to identify voxels with attenuation greater than or equal to 100 HU as bone.
7. The method for brain perfusion data analysis according to claim 1, wherein the method of converting contrast agent concentration includes:
by the formulas

c(tj) = k_CT · (μ(tj) - μ_0)

and

μ_0 = (1/B) · Σ_{j=1}^{B} μ(tj)

converting the contrast agent signal into the contrast agent concentration, wherein k_CT = 1 g/mL/HU, the baseline value μ_0 is calculated as the average of μ(tj) over the acquisition times of the first B samples of the arterial input function acquired before the contrast agent arrives at the artery, and the measured contrast agent signal is thereby converted into contrast agent concentration values, i.e. from μ(tj) to the corresponding contrast agent concentration value c(tj).
8. The method for analyzing brain perfusion data according to claim 1, further comprising, after acquiring the perfusion data corresponding to the perfusion parameters, and processing and analyzing the perfusion data:
storing the perfusion data.
9. A cerebral perfusion data analysis apparatus, comprising:
a memory; and
a processor coupled to the memory, the processor configured to:
acquiring an initial image, wherein the initial image is formed by combining two types of images obtained by brain perfusion imaging in a CT scanning mode and an MRI scanning mode;
carrying out blood vessel coding and blood vessel marking on the initial image, and then carrying out imaging scanning on the initial image through subtraction calculation to obtain a coded image;
acquiring perfusion parameters based on the coded images and a deconvolution algorithm, and constructing a micro-tissue model based on the perfusion parameters, wherein the micro-tissue model covers the organ-specific parenchyma, the interstitial space and the capillary bed;
and acquiring perfusion data corresponding to the perfusion parameters based on the micro-tissue model, and processing and analyzing the perfusion data to improve the image quality, correct artifacts, optimize the analysis of a perfusion value map and/or optimize the workflow, wherein the processing comprises motion correction, noise reduction, segmentation, contrast agent concentration conversion, a local AIF algorithm and/or partial volume effect correction in AIF.
10. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a machine, implements the steps of the method of any of claims 1 to 8.
CN202310265682.8A 2023-03-20 2023-03-20 Brain perfusion data analysis method, device and storage medium Pending CN115984270A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310265682.8A CN115984270A (en) 2023-03-20 2023-03-20 Brain perfusion data analysis method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310265682.8A CN115984270A (en) 2023-03-20 2023-03-20 Brain perfusion data analysis method, device and storage medium

Publications (1)

Publication Number Publication Date
CN115984270A true CN115984270A (en) 2023-04-18

Family

ID=85958210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310265682.8A Pending CN115984270A (en) 2023-03-20 2023-03-20 Brain perfusion data analysis method, device and storage medium

Country Status (1)

Country Link
CN (1) CN115984270A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116798612A (en) * 2023-08-23 2023-09-22 北京超数时代科技有限公司 Digital diagnosis and treatment system for neurological diseases



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230418