CN103793711A - Multi-dimensional texture extraction method based on brain magnetic resonance images - Google Patents

Multi-dimensional texture extraction method based on brain magnetic resonance images

Info

Publication number
CN103793711A
CN103793711A · Application CN201410023305.4A
Authority
CN
China
Prior art keywords
entropy
energy
region
difference
population
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410023305.4A
Other languages
Chinese (zh)
Inventor
郭秀花
高妮
王晶晶
陶丽新
刘相佟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Capital Medical University
Original Assignee
Capital Medical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Capital Medical University
Priority to CN201410023305.4A
Publication of CN103793711A
Priority to PCT/CN2014/001130 (WO2015106374A1)
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/52 Scale-space analysis, e.g. wavelet analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The invention discloses a multi-dimensional texture extraction method based on brain magnetic resonance (MRI) images of a population. A region-growing method is used to segment the regions of interest (ROIs) from the brain MRI images, and the Curvelet transform and the Contourlet transform are used to extract texture feature parameters of the ROIs. The population comprises an Alzheimer's disease (AD) patient group, a mild cognitive impairment (MCI) patient group and a normal elderly group. The texture feature parameters of the ROIs include entropy, gray-level mean, correlation, energy, homogeneity, variance, maximum probability, inverse difference moment, cluster tendency, contrast, sum mean, difference mean, sum entropy and difference entropy. The ROIs comprise two regions, the entorhinal cortex and the hippocampus.

Description

A multi-dimensional texture extraction method based on brain MRI images

Technical field:

The invention belongs to the field of medical technology, and in particular relates to a multi-dimensional texture extraction method based on brain magnetic resonance imaging (MRI) images.

Background:

Identifying the nature of ROIs (including the entorhinal cortex and the hippocampus) in MRI images is of great significance for the auxiliary diagnosis of early Alzheimer's disease (AD). However, MRI imaging can only use hippocampal atrophy as one of the indicators distinguishing patients from normal people; physicians' interpretation of MRI images is subject to individual subjectivity, lacks consistency, and cannot easily give an accurate evaluation of the severity of a dementia patient's symptoms.

1. Five existing image-processing techniques:

1) Region-growing method:

This method uses the local spatial information of the image and can effectively overcome the spatial discontinuity of the segmentation produced by other methods, but it has not previously been applied to the processing of brain MRI images.
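For illustration only (the patent's own segmentation program was written in Matlab, per the embodiments described later), a minimal region-growing sketch in Python could look as follows; the 4-connected neighbourhood, the seed point and the intensity criterion are assumptions for demonstration, with the threshold value 35 borrowed from the embodiment below.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, threshold=35):
    """Grow a region from `seed`, adding 4-connected neighbours whose
    intensity differs from the seed intensity by less than `threshold`."""
    h, w = image.shape
    seed_value = float(image[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - seed_value) < threshold:
                    mask[ny, nx] = True
                    queue.append((ny, nx))
    return mask

# toy usage: a bright square on a dark background stands in for an ROI
img = np.zeros((64, 64), dtype=np.uint8)
img[20:40, 20:40] = 120
roi_mask = region_grow(img, seed=(30, 30), threshold=35)
print(roi_mask.sum())  # number of pixels assigned to the grown region
```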

2) Gray-level co-occurrence matrix (GLCM):

Only a small number of previous studies have used texture feature parameters extracted with the gray-level co-occurrence matrix method, which is far from sufficient for diagnosing early AD and MCI from the texture features of brain MRI images.

3) Wavelet transform:

Although the feature vectors formed by the wavelet transform can describe an image fairly accurately, extracting the texture features of ROIs with the wavelet transform suffers from low retrieval precision.
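For illustration, a brief sketch of wavelet-based texture description using the PyWavelets package is shown below; the choice of the db2 wavelet, the two decomposition levels and the per-subband energy/entropy statistics are assumptions for demonstration, not the parameters used in the patent.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_texture_features(roi, wavelet="db2", level=2):
    """Return per-subband energy and entropy of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(roi.astype(float), wavelet=wavelet, level=level)
    # coeffs[0] is the approximation; the rest are (horizontal, vertical, diagonal) tuples
    subbands = [("approx", coeffs[0])]
    for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        subbands += [(f"H{lvl}", cH), (f"V{lvl}", cV), (f"D{lvl}", cD)]
    features = {}
    for name, band in subbands:
        p = np.abs(band).ravel()
        p = p / (p.sum() + 1e-12)              # normalise coefficient magnitudes
        features[f"energy_{name}"] = float(np.sum(band ** 2))
        features[f"entropy_{name}"] = float(-np.sum(p * np.log2(p + 1e-12)))
    return features

roi = np.random.rand(64, 64)                   # stand-in for a segmented ROI
print(list(wavelet_texture_features(roi))[:4])
```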

4) Second-generation wavelet transform (Curvelet transform):

Following the wavelet transform developed in the late 1980s, Sweldens proposed the second-generation wavelet transform in 1996, and the basis-function algorithms continued to improve: E. J. Candès proposed the Ridgelet transform in 1998, and in 1999 E. J. Candès and D. L. Donoho devised the new Curvelet transform algorithm:

$$\varphi_{j,l,k}(x) = \varphi_j\!\left(R_{\theta_l}\bigl(x - x_k^{(j,l)}\bigr)\right)$$

where $2^{-j}$ is the scale, $\theta_l$ is the direction angle, $x_k^{(j,l)}$ is the position, and $R_{\theta_l}$ is the rotation by $\theta_l$ radians. A fast discrete Curvelet transform followed in 2006. The second-generation wavelet transform not only retains the multi-scale advantages of the wavelet transform but is also anisotropic, so it approximates curve singularities well and is better suited than the wavelet transform to analysing curves and edges in two-dimensional images; it offers higher approximation accuracy and better sparse representation, providing a more precise multiresolution analysis of an image than the wavelet transform.

5) Contourlet transform

The Contourlet transform inherits the anisotropic scaling relation of the Curvelet transform and is, in a certain sense, another implementation of it. Its basic idea is first to capture edge singular points with a wavelet-like multi-scale decomposition, and then to gather nearby singular points into contour segments according to their directional information.

The Contourlet transform consists of two parts: the Laplacian pyramid (LP) filter structure and the two-dimensional directional filter bank (DFB). The LP decomposition first produces a low-pass, downsampled approximation of the original signal and a difference image between the original image and the low-pass prediction; the resulting low-pass image is decomposed again to obtain the next level's low-pass and difference images, and this stepwise filtering yields a multi-resolution decomposition of the image. The DFB uses fan-shaped conjugate mirror filter banks to avoid modulating the input signal, and turns an l-level binary-tree-structured directional filter into a structure of 2^l parallel channels.
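To make the LP stage concrete, the following is a minimal Laplacian-pyramid decomposition sketch using only NumPy and SciPy; the Gaussian smoothing width, the number of levels and the bilinear upsampling are illustrative assumptions, and the directional filter bank stage is omitted.

```python
import numpy as np
from scipy import ndimage

def laplacian_pyramid(image, levels=3, sigma=1.0):
    """Return (difference_images, low_pass_residual) of a simple LP decomposition."""
    current = image.astype(float)
    diffs = []
    for _ in range(levels):
        smoothed = ndimage.gaussian_filter(current, sigma=sigma)
        low = smoothed[::2, ::2]                       # low-pass, downsampled approximation
        # upsample the approximation back to the current size to form the prediction
        up = ndimage.zoom(low, (current.shape[0] / low.shape[0],
                                current.shape[1] / low.shape[1]), order=1)
        diffs.append(current - up)                     # band-pass (difference) image
        current = low
    return diffs, current

img = np.random.rand(128, 128)                         # stand-in for a brain MRI slice
band_passes, residual = laplacian_pyramid(img, levels=3)
print([b.shape for b in band_passes], residual.shape)
```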

The Contourlet transform is a new two-dimensional image representation with multi-resolution, local positioning, multi-directionality, near-critical sampling and anisotropy; its basis functions are distributed over multiple scales and directions, so a small number of coefficients can effectively capture the edge contours that constitute the main features of an image.

However, when these newer methods are applied to MRI images of different body parts, new algorithms must be constructed from the basis functions and suitable parameters must be chosen, so many theoretical questions remain open. The Contourlet transform has been used successfully for practical problems such as image fusion, but reports of its use for brain-image texture feature extraction are rare. In the literature reviewed so far, only GLCM and the wavelet transform have been used to extract texture features from brain MRI images of an AD group and a normal group and to build prediction models; comparing the two methods by diagnostic accuracy showed that models built on GLCM textures predicted better than those built on the wavelet transform. No one has yet used the second-generation wavelet transform or the Contourlet transform to extract textures from AD brain MRI images. It is therefore necessary to combine the advantages of the above prior art, overcome its shortcomings, and improve the image-processing method so as to raise the diagnosis rate of early AD and MCI.

2. Commonly used prediction model:

Support vector machine (SVM):

The support vector machine is a machine-learning method based on the VC-dimension theory of statistical learning and the principle of structural risk minimisation. Its mechanism is to find an optimal separating hyperplane that satisfies the classification requirements, such that the hyperplane guarantees classification accuracy while maximising the margin on both sides of the hyperplane.

In theory, the support vector machine achieves the optimal classification of linearly separable data. Taking two-class classification as an example, given a training set $(x_i, y_i)$, $i = 1, 2, \ldots, l$, $x_i \in \mathbb{R}^n$, $y_i \in \{\pm 1\}$, write the hyperplane as $(w \cdot x) + b = 0$. For the separating plane to classify all samples correctly with a margin, it must satisfy the constraints $y_i[(w \cdot x_i) + b] \ge 1$, $i = 1, 2, \ldots, l$; the margin is then $2/\|w\|$, so constructing the optimal hyperplane reduces to solving, under these constraints:

$$\min\ \phi(w) = \frac{1}{2}\|w\|^2 = \frac{1}{2}(w \cdot w)$$

To solve this constrained optimisation problem, the Lagrange function is introduced:

$$L(w, b, \alpha) = \frac{1}{2}\|w\|^2 - \sum_{i=1}^{l}\alpha_i\bigl(y_i[(w \cdot x_i) + b] - 1\bigr)$$

where $\alpha_i > 0$ are the Lagrange multipliers. The solution of the constrained optimisation problem is determined by the saddle point of the Lagrange function, at which the partial derivatives with respect to $w$ and $b$ vanish; the QP problem is thus transformed into its dual:

$$\max\ Q(\alpha) = \sum_{j=1}^{l}\alpha_j - \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l}\alpha_i\alpha_j y_i y_j (x_i \cdot x_j)$$

$$\text{s.t.}\quad \sum_{j=1}^{l}\alpha_j y_j = 0, \quad \alpha_j \ge 0, \quad j = 1, 2, \ldots, l$$

Solving gives the optimal solution $\alpha^* = (\alpha_1^*, \alpha_2^*, \ldots, \alpha_l^*)^{T}$.

The optimal weight vector $w^*$ and the optimal bias $b^*$ are then computed as:

$$w^* = \sum_{j=1}^{l}\alpha_j^* y_j x_j$$

$$b^* = y_i - \sum_{j=1}^{l} y_j \alpha_j^* (x_j \cdot x_i)$$

where the subscript $i$ refers to a support vector, i.e. a sample with $\alpha_i^* > 0$. The optimal separating hyperplane is therefore $(w^* \cdot x) + b^* = 0$, and the optimal classification function is:

$$f(x) = \operatorname{sgn}\{(w^* \cdot x) + b^*\} = \operatorname{sgn}\Bigl\{\sum_{j=1}^{l}\alpha_j^* y_j (x_j \cdot x) + b^*\Bigr\}, \quad x \in \mathbb{R}^n$$

For linearly non-separable data, the main idea of the SVM is to map the input vectors into a high-dimensional feature space and to construct the optimal separating surface in that feature space.

Applying a transformation $\varphi$ from the input space $\mathbb{R}^n$ to the feature space $H$ gives:

$$x \rightarrow \varphi(x) = (\varphi_1(x), \varphi_2(x), \ldots, \varphi_l(x))^{T}$$

Replacing the input vector $x$ by the feature vector $\varphi(x)$, the optimal classification function becomes:

$$f(x) = \operatorname{sgn}\bigl(w \cdot \varphi(x) + b\bigr) = \operatorname{sgn}\Bigl(\sum_{j=1}^{l}\alpha_j y_j\, \varphi(x_j) \cdot \varphi(x) + b\Bigr)$$
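As a sketch of how such a classifier can be fitted to texture-feature vectors in practice, the following uses scikit-learn's SVC; the synthetic data, the RBF kernel and the parameter values are illustrative assumptions, not the patent's trained model.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# stand-in data: 60 subjects x 14 texture features, label 1 = diseased, 0 = normal
X = rng.normal(size=(60, 14))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=60) > 0).astype(int)

# 80% training / 20% validation split, as in the technical route described below
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
print("validation accuracy:", model.score(X_te, y_te))
```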

Summary of the invention:

The purpose of the present invention is to provide a method for segmenting MRI images that contain the lesions of MCI and early-AD patients as well as the ROIs of normal elderly people, and a method for extracting texture features from them, and to build several prediction models, so that MCI patients can be found more effectively, early AD can be diagnosed, and structural brain changes in the normal elderly population can be observed.

The present invention combines and improves upon the advantages of the region-growing method, the gray-level co-occurrence matrix, the wavelet transform and other prior-art methods, and uses the Curvelet transform and the Contourlet transform to extract the edge texture features of the ROIs; the texture extraction method is comprehensive and novel and can achieve the purpose of the invention.

The present invention uses the following main technical route to establish a method for segmenting MRI images containing the lesions of MCI and early-AD patients and the ROIs of normal elderly people and for extracting texture features (see Figure 4 for the specific procedure), and on this basis builds prediction models for judging the nature of the relevant ROIs:

1) Build a library of ROI images;

2) Use the region-growing method to segment the relevant ROIs from the images;

3) Process the images with the Curvelet transform and the Contourlet transform and extract the following variables: entropy, gray-level mean, correlation, energy (angular second moment), homogeneity, variance, maximum probability, inverse difference moment, cluster tendency, contrast, sum mean, difference mean, sum entropy and difference entropy;

4) Build a database of image feature parameters from the variable data obtained in steps 2) and 3);

5) Construct prediction models based on the Curvelet transform and the Contourlet transform from the database of step 4); the methods for building the prediction models include the support vector machine;

6) Repeatedly validate the parameter data obtained in step 5) against the samples so as to correct the prediction models and obtain an ideal model with fairly accurate results;

7) Compare the prediction performance of the models built in step 5) on the Curvelet transform and on the Contourlet transform.

There is as yet no unified definition of image texture; it is generally held that the texture features of an image describe the variation of gray level or colour over an object's surface, that this variation is related to the object's own attributes, and that it is the repetition of some texture primitive.

The following texture feature parameters can be obtained with the Curvelet transform and the Contourlet transform:

Entropy: reflects the randomness of the image texture;

Gray-level mean: reflects the central tendency of all pixel gray values;

Correlation: measures the correlation of pixel gray levels;

Energy (angular second moment): reflects the uniformity of the gray-level distribution and the coarseness of the texture;

Homogeneity: reflects how similar the gray levels are;

Variance: reflects the spread of the gray-level distribution;

Maximum probability: the occurrence rate of the most prominent pixel pair;

Inverse difference moment: reflects the smoothness of the image;

Cluster tendency: measures the grouping of pixels with similar gray-level values;

Contrast: reflects the sharpness of the image;

Sum mean and difference mean: provide the mean of the gray levels in the image;

Sum entropy and difference entropy, the latter given by $f_9 = -\sum_{k=0}^{K-1} c_{x-y}(k)\log\{c_{x-y}(k)\}$.
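As a compact illustration of how several of these parameters can be computed, the sketch below derives them from a normalised gray-level co-occurrence matrix; the single pixel offset, the 16 gray levels and the Haralick-style formulas are assumptions for demonstration (the patent applies analogous statistics to Curvelet and Contourlet subbands).

```python
import numpy as np

def glcm(image, levels=16, offset=(0, 1)):
    """Normalised gray-level co-occurrence matrix for one pixel offset."""
    q = np.floor(image.astype(float) / image.max() * (levels - 1)).astype(int)
    dy, dx = offset
    a = q[max(0, -dy):q.shape[0] - max(0, dy), max(0, -dx):q.shape[1] - max(0, dx)]
    b = q[max(0, dy):, max(0, dx):][:a.shape[0], :a.shape[1]]
    p = np.zeros((levels, levels))
    np.add.at(p, (a.ravel(), b.ravel()), 1)   # count co-occurring gray-level pairs
    return p / p.sum()

def texture_features(p):
    """Entropy, energy, homogeneity, contrast, inverse difference moment, max probability."""
    i, j = np.indices(p.shape)
    eps = 1e-12
    return {
        "entropy": float(-np.sum(p * np.log(p + eps))),
        "energy": float(np.sum(p ** 2)),                       # angular second moment
        "homogeneity": float(np.sum(p / (1.0 + np.abs(i - j)))),
        "contrast": float(np.sum(p * (i - j) ** 2)),
        "inverse_difference_moment": float(np.sum(p / (1.0 + (i - j) ** 2))),
        "maximum_probability": float(p.max()),
    }

roi = np.random.rand(32, 32)   # stand-in for a segmented ROI or a transform subband
print(texture_features(glcm(roi)))
```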

The present invention used the above method to build an early-AD prediction model whose judgment accuracy reached 100%.

The following is an example of extracting the internal texture of the ROIs; the steps are as follows:

1. Collect the original MRI images of AD patients, MCI patients and normal elderly people (the description of the drawings takes the MRI image of a normal elderly person as an example), see Figure 1;

2. Segment the above images with the region-growing method to obtain the images shown in Figures 2 and 3; the segmentation uses a pre-written program that can be run directly.

3. Extract the texture feature parameters with the Curvelet transform and the Contourlet transform; each method has its own program that can be run directly. The extracted texture feature parameters are listed in Tables 2 to 29.

The experimental results show:

Table 1. Analysis of the differences between the Curvelet transform and the Contourlet transform in different orientations

Table 1 shows that the Contourlet transform reflects the overall differences in texture values among the AD group, the MCI group and the normal group better than the Curvelet transform does.

The ROIs were segmented with the region-growing method, 80% of the data were taken as training samples and the remaining 20% as validation samples, in order to test the consistency between the model's judgment of whether the ROIs are diseased and the pathological diagnosis; this demonstrated that a prediction model built from the texture of brain MRI ROIs extracted with the Contourlet transform predicts early AD with a sensitivity and a specificity of 100%.

From the above data it can be concluded that segmenting brain MRI images with the region-growing method and building a prediction model from texture feature parameters extracted with the Contourlet transform has a very good effect on the auxiliary diagnosis of early AD.

Beneficial effects:

Applying the region-growing method to the segmentation of brain MRI is an innovation of the present invention. Experiments show that texture modelling based on region-growing segmentation outperforms texture modelling based on whole-image segmentation and better preserves the edge information of the segmented regions.

The Contourlet transform is an effective method for extracting the internal texture features of brain MRI images; it can extract 14 texture feature parameters and thus reflects the texture characteristics of the images fairly comprehensively.

Brief description of the drawings:

Figure 1 is an original MRI image of a normal elderly person, in which the ROIs are marked with black boxes;

Figure 2 is an enlarged image of the left-brain ROIs obtained after segmenting Figure 1 with the region-growing method;

Figure 3 is an enlarged image of the right-brain ROIs obtained after segmenting Figure 1 with the region-growing method;

Figure 4 is the technical route for multi-dimensional texture extraction from brain MRI images with the method of the present invention;

in the figures, 1 denotes the left hippocampus and entorhinal cortex region and 2 denotes the right hippocampus and entorhinal cortex region.

Detailed description of the embodiments

The following example describes how a model for predicting AD from brain MRI images containing the relevant ROIs is built with the method of the present invention. It is only a further illustration of the method; the example does not limit the scope of application of the invention. In fact, the method can also be used to judge the nature of other types of medical images.

Image source: brain MRI images of AD patients, MCI patients and normal elderly people shared on the ADNI website, in .Nii format, read with the MRIcro software;

Methods: programs were written in Matlab; the ROIs in the above MRI images were segmented with the region-growing method, and the texture feature parameters of the relevant ROIs were extracted with the Contourlet transform.

The following is an example of extracting the texture feature parameters of brain MRI image ROIs; the steps are as follows:

1. A total of 250 original brain MRI images were collected from 20 cases in the AD group, 20 cases in the MCI group and 20 cases in the normal group; the cases were 40 to 89 years old, with a mean age of 64 years and a median age of 60 years.

2. The images were segmented with the region-growing method in the conventional way, using the pre-written program with the threshold set to 35 and run directly. Figure 1 is an example of an image to be segmented; the resulting images are shown in Figures 2 and 3.

3. The texture feature parameters were extracted with the Curvelet transform and the Contourlet transform respectively; the extracted parameters are listed in Tables 2 to 29.

4. The Curvelet transform and the Contourlet transform were each used to extract 14 texture feature parameters from the brain MRI image ROIs, and prediction models were built from them.

5. The prediction performance of the models built on the two texture extraction methods, the Curvelet transform and the Contourlet transform, was compared.

The 200 brain MRI images of 12 cases were used as the training sample set, and the remaining 50 images of 3 cases were used as test samples for classification prediction with the support vector machine. The prediction results were: sensitivity = 100%; specificity = 100%; coincidence rate = 100%.
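A short sketch of how sensitivity, specificity and the coincidence rate (accuracy) can be computed on such a validation set is given below; the convention that label 1 denotes a diseased case and the toy prediction vector are assumptions for illustration.

```python
import numpy as np

def diagnostic_metrics(y_true, y_pred):
    """Sensitivity, specificity and accuracy, with label 1 = diseased, 0 = normal."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

# toy example: 50 validation images, 25 diseased and 25 normal, all predicted correctly
y_true = [1] * 25 + [0] * 25
y_pred = list(y_true)
print(diagnostic_metrics(y_true, y_pred))  # all three metrics equal 1.0
```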

The experimental results show that, with the ROIs segmented by the region-growing method, 80% of the data taken as training samples and the remaining 20% as validation samples to test the consistency between the model's judgment of whether the ROIs are diseased and the pathological diagnosis, the sensitivity of predicting early AD from texture feature parameters extracted from this small sample of brain MRI image ROIs is 100%.

From the above data it can be concluded that segmenting brain MRI image ROIs with the region-growing method and building a prediction model from texture feature parameters extracted with the Contourlet transform has a very good effect on the auxiliary diagnosis of early AD.

The variables extracted by the Curvelet transform are listed in Tables 2 to 15, and those extracted by the Contourlet transform in Tables 16 to 29 (taking the texture values of case 136_s_0194 in the AD group as an example).

Table 2. Gray-level mean

Figure BDA0000458431300000101

Figure BDA0000458431300000111

Table 3. Standard deviation

标准差1standard deviation 1 标准差2standard deviation 2 标准差3standard deviation 3 标准差4standard deviation 4 标准差5standard deviation 5 标准差6standard deviation 6 4.38477114.3847711 1.424984911.42498491 1.368189051.36818905 1.412283971.41228397 1.4624267991.462426799 1.5912837491.591283749 标准差7standard deviation 7 标准差8standard deviation 8 标准差9standard deviation 9 标准差10standard deviation 10 标准差11standard deviation 11 标准差12standard deviation 12 1.328857861.32885786 1.91483511.9148351 1.410655291.41065529 1.550301541.55030154 1.3678889241.367888924 1.4054326691.405432669 标准差13standard deviation 13 标准差14standard deviation 14 标准差15standard deviation 15 标准差16standard deviation 16 标准差17standard deviation 17 标准差18standard deviation 18 1.300440571.30044057 1.5797072871.579707287 1.335763281.33576328 1.184535011.18453501 1.4277093831.427709383 1.2770032121.277003212 标准差19standard deviation 19 标准差20standard deviation 20 标准差21standard deviation 21 标准差22standard deviation 22 标准差23standard deviation 23 标准差24standard deviation 24 1.70111951.7011195 1.4438542621.443854262 1.382095431.38209543 1.577566751.57756675 1.3922414191.392241419 1.6471250471.647125047 标准差25standard deviation 25 标准差26standard deviation 26 标准差27standard deviation 27 标准差28standard deviation 28 标准差29standard deviation 29 标准差30standard deviation 30 1.422591271.42259127 1.4304694251.430469425 1.510287011.51028701 1.272158531.27215853 1.3112994651.311299465 1.5272514741.527251474 标准差31Standard Deviation 31 标准差32Standard Deviation 32 标准差33Standard Deviation 33 标准差34Standard Deviation 34 1.043935551.04393555 1.520058991.52005899 1.76471351.7647135 0.860308650.86030865

Table 4. Cluster tendency

聚类趋势1Cluster Trend 1 聚类趋势2Cluster Trend 2 聚类趋势3Cluster Trend 3 聚类趋势4Cluster Trend 4 聚类趋势5Cluster Trend 5 聚类趋势6Cluster Trend 6 16.360630116.3606301 16.1243928516.12439285 18.369515318.3695153 15.923846615.9238466 13.8549904213.85499042 21.256951221.2569512 聚类趋势7Cluster Trend 7 聚类趋势8Cluster Trend 8 聚类趋势9Cluster Trend 9 聚类趋势10Cluster Trend 10 聚类趋势11Cluster Trend 11 聚类趋势12Cluster Trend 12 18.66145118.661451 18.3206942918.32069429 16.924314116.9243141 17.318778817.3187788 14.4478402214.44784022 20.699079520.6990795 聚类趋势13Cluster Trend 13 聚类趋势14Cluster Trend 14 聚类趋势15Cluster Trend 15 聚类趋势16Cluster Trend 16 聚类趋势17Cluster Trend 17 聚类趋势18Cluster Trend 18

11.95918911.959189 14.1341580414.13415804 17.395542717.3955427 17.532577217.5325772 16.9857628616.98576286 14.21720114.217201 聚类趋势19Cluster Trend 19 聚类趋势20Cluster Trend 20 聚类趋势21Cluster Trend 21 聚类趋势22Cluster Trend 22 聚类趋势23Cluster Trend 23 聚类趋势24Cluster Trend 24 16.811440916.8114409 15.2453624615.24536246 15.391486115.3914861 15.504062815.5040628 15.4325897115.43258971 13.9891683913.98916839 聚类趋势25Cluster Trend 25 聚类趋势26Cluster Trend 26 聚类趋势27Cluster Trend 27 聚类趋势28Cluster Trend 28 聚类趋势29Cluster Trend 29 聚类趋势30Cluster Trend 30 16.33187616.331876 19.9683216219.96832162 14.705938614.7059386 16.065566216.0655662 16.2137915716.21379157 14.0423115114.04231151 聚类趋势31Cluster Trend 31 聚类趋势32Cluster Trend 32 聚类趋势33Cluster Trend 33 聚类趋势34Cluster Trend 34 18.294258718.2942587 16.8686638616.86866386 16.587192816.5871928 16.255571516.2555715

Table 5. Homogeneity

Figure BDA0000458431300000121

Table 6. Maximum probability

Figure BDA0000458431300000131

Table 7. Energy

能量1energy 1 能量2energy 2 能量3energy 3 能量4energy 4 能量5Energy 5 能量6Energy 6 0.04989050.0498905 0.11600320.1160032 0.1945330.194533 0.1739570.173957 0.21825370.2182537 0.121018590.12101859 能量7Energy 7 能量8Energy 8 能量9Energy 9 能量10Energy 10 能量11energy 11 能量12Energy 12 0.14891910.1489191 0.08178740.0817874 0.0919020.091902 0.10120480.1012048 0.18589770.1858977 0.107481280.10748128 能量13Energy 13 能量14Energy 14 能量15Energy 15 能量16Energy 16 能量17Energy 17 能量18Energy 18 0.3454140.345414 0.24644710.2464471 0.1540170.154017 0.19918890.1991889 0.11647630.1164763 0.124921030.12492103

能量19energy 19 能量20Energy 20 能量21Energy 21 能量22Energy 22 能量23Energy 23 能量24Energy 24 0.07499740.0749974 0.11072580.1107258 0.1648320.164832 0.17278380.1727838 0.15570920.1557092 0.145396560.14539656 能量25Energy 25 能量26Energy 26 能量27Energy 27 能量28Energy 28 能量29Energy 29 能量30Energy 30 0.11464610.1146461 0.13716980.1371698 0.1103860.110386 0.16712020.1671202 0.46453030.4645303 0.296532360.29653236 能量31Energy 31 能量32Energy 32 能量33Energy 33 能量34Energy 34 0.30571560.3057156 0.09395110.0939511 0.0845320.084532 0.51902620.5190262

Table 8. Moment of inertia

Figure BDA0000458431300000141

Table 9. Inverse difference moment

Figure BDA0000458431300000142

Figure BDA0000458431300000151

Table 10. Entropy

熵1Entropy 1 熵2Entropy 2 熵3Entropy 3 熵4Entropy 4 熵5Entropy 5 熵6Entropy 6 1.220351511.22035151 1.0293866591.029386659 0.823596020.82359602 0.996923850.99692385 0.8747464610.874746461 1.077543841.07754384 熵7Entropy 7 熵8Entropy 8 熵9Entropy 9 熵10Entropy 10 熵11Entropy 11 熵12Entropy 12 0.930650330.93065033 1.2781115471.278111547 1.129476791.12947679 1.120730571.12073057 0.8879064470.887906447 0.9322293610.932229361 熵13Entropy 13 熵14Entropy 14 熵15Entropy 15 熵16Entropy 16 熵17Entropy 17 熵18Entropy 18 0.825475920.82547592 0.9488374010.948837401 0.853675730.85367573 0.838209760.83820976 0.8949504280.894950428 1.0014052531.001405253 熵19Entropy 19 熵20Entropy 20 熵21Entropy 21 熵22Entropy 22 熵23Entropy 23 熵24Entropy 24 1.096760621.09676062 1.0082006471.008200647 0.817001940.81700194 0.906755330.90675533 0.9785922810.978592281 1.2114091081.211409108 熵25Entropy 25 熵26Entropy 26 熵27Entropy 27 熵28Entropy 28 熵29Entropy 29 熵30Entropy 30 1.14444411.1444441 0.982367920.98236792 1.072743021.07274302 0.896606640.89660664 0.6309610170.630961017 0.8005160140.800516014

熵31Entropy 31 熵32Entropy 32 熵33Entropy 33 熵34Entropy 34 0.582984690.58298469 1.0299473191.029947319 1.18374091.1837409 0.55068910.5506891

Table 11. Correlation

相关性1Correlation 1 相关性2Correlation 2 相关性3Correlation 3 相关性4Correlation 4 相关性5Correlation 5 相关性6Correlation 6 0.91553250.9155325 0.09072110.0907211 0.0254490.025449 -0.199867-0.199867 -0.036875-0.036875 -0.0596785-0.0596785 相关性7Correlation 7 相关性8Correlation 8 相关性9Correlation 9 相关性10Correlation 10 相关性11Correlation 11 相关性12Correlation 12 -0.201381-0.201381 -0.014917-0.014917 -0.0417-0.0417 0.06053410.0605341 0.03574460.0357446 -0.1837574-0.1837574 相关性13Correlation 13 相关性14Correlation 14 相关性15Correlation 15 相关性16Correlation 16 相关性17Correlation 17 相关性18Correlation 18 0.00622530.0062253 -0.031564-0.031564 -0.19318-0.19318 0.01788520.0178852 0.07178810.0717881 0.029109430.02910943 相关性19Correlation 19 相关性20Correlation 20 相关性21Correlation 21 相关性22Correlation 22 相关性23Correlation 23 相关性24Correlation 24 0.01337190.0133719 -0.211466-0.211466 0.0192580.019258 0.0169190.016919 -0.191599-0.191599 -0.00425-0.00425 相关性25Correlation 25 相关性26Correlation 26 相关性27Correlation 27 相关性28Correlation 28 相关性29Correlation 29 相关性30Correlation 30 -0.035369-0.035369 0.06564790.0656479 0.0103180.010318 -0.184086-0.184086 0.09107570.0910757 0.043940310.04394031 相关性31Correlation 31 相关性32Correlation 32 相关性33Correlation 33 相关性34Correlation 34 -0.222713-0.222713 -0.009445-0.009445 0.0845920.084592 -0.006333-0.006333

Table 12. Sum mean

Figure BDA0000458431300000161

Figure BDA0000458431300000171

Table 13. Difference mean

Figure BDA0000458431300000172

Table 14. Sum entropy

Figure BDA0000458431300000181

Table 15. Difference entropy

差的熵1poor entropy 1 差的熵2poor entropy 2 差的熵3poor entropy 3 差的熵4poor entropy 4 差的熵5Poor Entropy 5 差的熵6Poor Entropy 6 2.77178222.7717822 2.73345862.7334586 2.6467822.646782 2.8522172.852217 2.78063912.7806391 2.905639062.90563906 差的熵7poor entropy7 差的熵8poor entropy 8 差的熵9Poor Entropy 9 差的熵10poor entropy 10 差的熵11Poor Entropy 11 差的熵12Poor Entropy 12 2.8522172.852217 3.253.25 2.6467822.646782 2.9772172.977217 33 2.774397472.77439747 差的熵13Poor Entropy 13 差的熵14Poor Entropy 14 差的熵15Poor Entropy 15 差的熵16Poor Entropy 16 差的熵17Poor Entropy 17 差的熵18Poor Entropy 18 2.3752.375 2.78063912.7806391 2.7743972.774397 2.47460182.4746018 2.90563912.9056391 2.646782222.64678222 差的熵19Poor Entropy 19 差的熵20Poor Entropy 20 差的熵21Poor Entropy 21 差的熵22Poor Entropy 22 差的熵23Poor Entropy 23 差的熵24Poor Entropy 24

3.07781953.0778195 2.64678222.6467822 2.6084592.608459 2.64678222.6467822 2.77439752.7743975 2.899397472.89939747 差的熵25Poor Entropy 25 差的熵26Poor Entropy 26 差的熵27Poor Entropy 27 差的熵28Poor Entropy 28 差的熵29Poor Entropy 29 差的熵30Poor Entropy 30 2.64678222.6467822 2.78063912.7806391 2.7743972.774397 2.77439752.7743975 2.43627812.4362781 2.649397472.64939747 差的熵31Poor Entropy 31 差的熵32Poor Entropy 32 差的熵33Poor Entropy 33 差的熵34Poor Entropy 34 2.53063912.5306391 2.73345862.7334586 3.0243973.024397 2.21691722.2169172

Table 16. Gray-level mean

Figure BDA0000458431300000191

Figure BDA0000458431300000201

Figure BDA0000458431300000211

Table 17. Standard deviation

Figure BDA0000458431300000212

Table 18. Cluster tendency

Figure BDA0000458431300000232

Figure BDA0000458431300000241

Table 19. Homogeneity

A-1_1_HA-1_1_H A-1_2_HA-1_2_H A-1_3_HA-1_3_H A-1_4_HA-1_4_H A-1_5_HA-1_5_H A-1_6_HA-1_6_H 0.7293530.729353 0.858960.85896 0.7425910.742591 0.8119830.811983 0.7074360.707436 0.8204060.820406 A-2_1_HA-2_1_H A-2_2_HA-2_2_H A-2_3_HA-2_3_H A-2_4_HA-2_4_H A-2_5_HA-2_5_H A-2_6_HA-2_6_H 0.6730750.673075 0.8323210.832321 0.8046190.804619 0.8187360.818736 0.7651960.765196 0.8191040.819104 A-3_1_HA-3_1_H A-3_2_HA-3_2_H A-3_3_HA-3_3_H A-3_4_HA-3_4_H A-3_5_HA-3_5_H A-3_6_HA-3_6_H 0.7028890.702889 0.7703290.770329 0.7316640.731664 0.8123250.812325 0.7501130.750113 0.8233540.823354 A-4_1_HA-4_1_H A-4_2_HA-4_2_H A-4_3_HA-4_3_H A-4_4_HA-4_4_H A-4_5_HA-4_5_H A-4_6_HA-4_6_H 0.6300940.630094 0.8098260.809826 0.7650240.765024 0.7508870.750887 0.7221460.722146 0.8333790.833379 A-5_1_HA-5_1_H A-5_2_HA-5_2_H A-5_3_HA-5_3_H A-5_4_HA-5_4_H A-5_5_HA-5_5_H A-5_6_HA-5_6_H 0.7842260.784226 0.8027530.802753 0.8288980.828898 0.8249060.824906 0.8554820.855482 0.8327830.832783 A-6_1_HA-6_1_H A-6_2_HA-6_2_H A-6_3_HA-6_3_H A-6_4_HA-6_4_H A-6_5_HA-6_5_H A-6_6_HA-6_6_H 0.8604910.860491 0.8822920.882292 0.9021360.902136 0.8882630.888263 0.8999280.899928 0.8923650.892365 A-7_1_HA-7_1_H A-7_2_HA-7_2_H A-7_3_HA-7_3_H A-7_4_HA-7_4_H A-7_5_HA-7_5_H A-7_6_HA-7_6_H 0.8604910.860491 0.8724110.872411 0.8975290.897529 0.8973290.897329 0.9099250.909925 0.9131090.913109 A-8_1_HA-8_1_H A-8_2_HA-8_2_H A-8_3_HA-8_3_H A-8_4_HA-8_4_H A-8_5_HA-8_5_H A-8_6_HA-8_6_H 0.7200640.720064 0.8468910.846891 0.8453860.845386 0.7867990.786799 0.8192070.819207 0.7766030.776603 B-1_1_HB-1_1_H B-1_2_HB-1_2_H B-1_3_HB-1_3_H B-1_4_HB-1_4_H B-1_5_HB-1_5_H B-1_6_HB-1_6_H 0.4210030.421003 0.634880.63488 0.6782650.678265 0.7586530.758653 0.8535340.853534 0.9029550.902955 B-2_1_HB-2_1_H B-2_2_HB-2_2_H B-2_3_HB-2_3_H B-2_4_HB-2_4_H B-2_5_HB-2_5_H B-2_6_HB-2_6_H 0.4134490.413449 0.7237710.723771 0.7379130.737913 0.8478280.847828 0.9003180.900318 0.9195160.919516 B-3_1_HB-3_1_H B-3_2_HB-3_2_H B-3_3_HB-3_3_H B-3_4_HB-3_4_H B-3_5_HB-3_5_H B-3_6_HB-3_6_H 0.4049890.404989 0.678850.67885 0.737140.73714 0.7348930.734893 0.9012680.901268 0.885820.88582 B-4_1_HB-4_1_H B-4_2_HB-4_2_H B-4_3_HB-4_3_H B-4_4_HB-4_4_H B-4_5_HB-4_5_H B-4_6_HB-4_6_H 0.3791050.379105 0.7592410.759241 0.7060570.706057 0.7647120.764712 0.8625150.862515 0.8429740.842974 B-5_1_HB-5_1_H B-5_2_HB-5_2_H B-5_3_HB-5_3_H B-5_4_HB-5_4_H B-5_5_HB-5_5_H B-5_6_HB-5_6_H 0.3862280.386228 0.8032880.803288 0.7416320.741632 0.7773760.777376 0.9341540.934154 0.9004760.900476 B-6_1_HB-6_1_H B-6_2_HB-6_2_H B-6_3_HB-6_3_H B-6_4_HB-6_4_H B-6_5_HB-6_5_H B-6_6_HB-6_6_H 0.4928370.492837 0.648380.64838 0.840930.84093 0.8044510.804451 0.9345740.934574 0.9249240.924924

B-7_1_HB-7_1_H B-7_2_HB-7_2_H B-7_3_HB-7_3_H B-7_4_HB-7_4_H B-7_5_HB-7_5_H B-7_6_HB-7_6_H 0.5461760.546176 0.8258090.825809 0.8485210.848521 0.8750170.875017 0.9239750.923975 0.9250810.925081 B-8_1_HB-8_1_H B-8_2_HB-8_2_H B-8_3_HB-8_3_H B-8_4_HB-8_4_H B-8_5_HB-8_5_H B-8_6_HB-8_6_H 0.3415060.341506 0.6485990.648599 0.8227980.822798 0.6374630.637463 0.898220.89822 0.8918330.891833

Table 20. Maximum probability

Figure BDA0000458431300000261

Figure BDA0000458431300000271

Table 21. Energy

A-1_1_能量A-1_1_Energy A-1_2_能量A-1_2_Energy A-1_3_能量A-1_3_Energy A-1_4_能量A-1_4_Energy A-1_5_能量A-1_5_Energy A-1_6_能量A-1_6_Energy 0.3298170.329817 0.4296580.429658 0.2691690.269169 0.3570820.357082 0.2342910.234291 0.3606170.360617 A-2_1_能量A-2_1_Energy A-2_2_能量A-2_2_Energy A-2_3_能量A-2_3_Energy A-2_4_能量A-2_4_Energy A-2_5_能量A-2_5_Energy A-2_6_能量A-2_6_Energy 0.2524310.252431 0.3466330.346633 0.3190420.319042 0.3474120.347412 0.2836230.283623 0.3570810.357081 A-3_1_能量A-3_1_Energy A-3_2_能量A-3_2_Energy A-3_3_能量A-3_3_Energy A-3_4_能量A-3_4_Energy A-3_5_能量A-3_5_Energy A-3_6_能量A-3_6_Energy 0.2533930.253393 0.279210.27921 0.2609220.260922 0.3286120.328612 0.2795070.279507 0.3298210.329821 A-4_1_能量A-4_1_Energy A-4_2_能量A-4_2_Energy A-4_3_能量A-4_3_Energy A-4_4_能量A-4_4_Energy A-4_5_能量A-4_5_Energy A-4_6_能量A-4_6_Energy 0.2624320.262432 0.3265710.326571 0.2811190.281119 0.2688730.268873 0.2523950.252395 0.3298040.329804 A-5_1_能量A-5_1_Energy A-5_2_能量A-5_2_Energy A-5_3_能量A-5_3_Energy A-5_4_能量A-5_4_Energy A-5_5_能量A-5_5_Energy A-5_6_能量A-5_6_Energy 0.3012370.301237 0.3051920.305192 0.3476850.347685 0.3348010.334801 0.3784290.378429 0.3288880.328888 A-6_1_能量A-6_1_Energy A-6_2_能量A-6_2_Energy A-6_3_能量A-6_3_Energy A-6_4_能量A-6_4_Energy A-6_5_能量A-6_5_Energy A-6_6_能量A-6_6_Energy 0.3805320.380532 0.3977090.397709 0.4225090.422509 0.3904280.390428 0.4310730.431073 0.3993580.399358 A-7_1_能量A-7_1_Energy A-7_2_能量A-7_2_Energy A-7_3_能量A-7_3_Energy A-7_4_能量A-7_4_Energy A-7_5_能量A-7_5_Energy A-7_6_能量A-7_6_Energy 0.3805320.380532 0.3850160.385016 0.4132170.413217 0.4215720.421572 0.44220.4422 0.4371270.437127 A-8_1_能量A-8_1_Energy A-8_2_能量A-8_2_Energy A-8_3_能量A-8_3_Energy A-8_4_能量A-8_4_Energy A-8_5_能量A-8_5_Energy A-8_6_能量A-8_6_Energy 0.2356150.235615 0.3581530.358153 0.3653010.365301 0.2856750.285675 0.3859350.385935 0.2859430.285943 B-1_1_能量B-1_1_Energy B-1_2_能量B-1_2_Energy B-1_3_能量B-1_3_Energy B-1_4_能量B-1_4_Energy B-1_5_能量B-1_5_Energy B-1_6_能量B-1_6_Energy 0.0405590.040559 0.1523710.152371 0.1459380.145938 0.3221720.322172 0.5629410.562941 0.7030120.703012 B-2_1_能量B-2_1_Energy B-2_2_能量B-2_2_Energy B-2_3_能量B-2_3_Energy B-2_4_能量B-2_4_Energy B-2_5_能量B-2_5_Energy B-2_6_能量B-2_6_Energy 0.0477430.047743 0.2247510.224751 0.3270960.327096 0.5872060.587206 0.6942830.694283 0.7462950.746295 B-3_1_能量B-3_1_Energy B-3_2_能量B-3_2_Energy B-3_3_能量B-3_3_Energy B-3_4_能量B-3_4_Energy B-3_5_能量B-3_5_Energy B-3_6_能量B-3_6_Energy 0.0365290.036529 0.1994460.199446 0.2712570.271257 0.3261230.326123 0.6738290.673829 0.6156740.615674 B-4_1_能量B-4_1_Energy B-4_2_能量B-4_2_Energy B-4_3_能量B-4_3_Energy B-4_4_能量B-4_4_Energy B-4_5_能量B-4_5_Energy B-4_6_能量B-4_6_Energy 0.029040.02904 0.373420.37342 0.2375870.237587 0.3710340.371034 0.5909660.590966 0.5091260.509126

B-5_1_能量B-5_1_Energy B-5_2_能量B-5_2_Energy B-5_3_能量B-5_3_Energy B-5_4_能量B-5_4_Energy B-5_5_能量B-5_5_Energy B-5_6_能量B-5_6_Energy 0.0280610.028061 0.469620.46962 0.2847350.284735 0.3439050.343905 0.793420.79342 0.6967140.696714 B-6_1_能量B-6_1_Energy B-6_2_能量B-6_2_Energy B-6_3_能量B-6_3_Energy B-6_4_能量B-6_4_Energy B-6_5_能量B-6_5_Energy B-6_6_能量B-6_6_Energy 0.0655380.065538 0.1597440.159744 0.5635710.563571 0.3828430.382843 0.7946070.794607 0.7544980.754498 B-7_1_能量B-7_1_Energy B-7_2_能量B-7_2_Energy B-7_3_能量B-7_3_Energy B-7_4_能量B-7_4_Energy B-7_5_能量B-7_5_Energy B-7_6_能量B-7_6_Energy 0.1136490.113649 0.5623780.562378 0.4579910.457991 0.6119390.611939 0.7325340.732534 0.7578450.757845 B-8_1_能量B-8_1_Energy B-8_2_能量B-8_2_Energy B-8_3_能量B-8_3_Energy B-8_4_能量B-8_4_Energy B-8_5_能量B-8_5_Energy B-8_6_能量B-8_6_Energy 0.0272730.027273 0.1330060.133006 0.5012180.501218 0.1268510.126851 0.5857130.585713 0.6424670.642467

Table 22. Moment of inertia

Figure BDA0000458431300000291

Figure BDA0000458431300000301

Figure BDA0000458431300000311

Table 23. Inverse difference moment

A-1_1_I_D_MA-1_1_I_D_M A-1_2_I_D_MA-1_2_I_D_M A-1_3_I_D_MA-1_3_I_D_M A-1_4_I_D_MA-1_4_I_D_M A-1_5_I_D_MA-1_5_I_D_M A-1_6_I_D_MA-1_6_I_D_M 0.7125870.712587 0.8494130.849413 0.7208550.720855 0.7946390.794639 0.6797480.679748 0.8044190.804419 A-2_1_I_D_MA-2_1_I_D_M A-2_2_I_D_MA-2_2_I_D_M A-2_3_I_D_MA-2_3_I_D_M A-2_4_I_D_MA-2_4_I_D_M A-2_5_I_D_MA-2_5_I_D_M A-2_6_I_D_MA-2_6_I_D_M 0.6426870.642687 0.8182830.818283 0.7857180.785718 0.8034430.803443 0.7435360.743536 0.8030960.803096 A-3_1_I_D_MA-3_1_I_D_M A-3_2_I_D_MA-3_2_I_D_M A-3_3_I_D_MA-3_3_I_D_M A-3_4_I_D_MA-3_4_I_D_M A-3_5_I_D_MA-3_5_I_D_M A-3_6_I_D_MA-3_6_I_D_M 0.6879180.687918 0.7524370.752437 0.7084390.708439 0.7979820.797982 0.7278940.727894 0.8091250.809125 A-4_1_I_D_MA-4_1_I_D_M A-4_2_I_D_MA-4_2_I_D_M A-4_3_I_D_MA-4_3_I_D_M A-4_4_I_D_MA-4_4_I_D_M A-4_5_I_D_MA-4_5_I_D_M A-4_6_I_D_MA-4_6_I_D_M 0.6038720.603872 0.7963560.796356 0.745490.74549 0.7298650.729865 0.6970740.697074 0.8193740.819374 A-5_1_I_D_MA-5_1_I_D_M A-5_2_I_D_MA-5_2_I_D_M A-5_3_I_D_MA-5_3_I_D_M A-5_4_I_D_MA-5_4_I_D_M A-5_5_I_D_MA-5_5_I_D_M A-5_6_I_D_MA-5_6_I_D_M 0.7724730.772473 0.7857520.785752 0.8135670.813567 0.8116720.811672 0.8440240.844024 0.8194570.819457 A-6_1_I_D_MA-6_1_I_D_M A-6_2_I_D_MA-6_2_I_D_M A-6_3_I_D_MA-6_3_I_D_M A-6_4_I_D_MA-6_4_I_D_M A-6_5_I_D_MA-6_5_I_D_M A-6_6_I_D_MA-6_6_I_D_M 0.8518490.851849 0.873950.87395 0.8943260.894326 0.8777560.877756 0.8915040.891504 0.8837670.883767 A-7_1_I_D_MA-7_1_I_D_M A-7_2_I_D_MA-7_2_I_D_M A-7_3_I_D_MA-7_3_I_D_M A-7_4_I_D_MA-7_4_I_D_M A-7_5_I_D_MA-7_5_I_D_M A-7_6_I_D_MA-7_6_I_D_M 0.8518490.851849 0.8609030.860903 0.8881110.888111 0.888890.88889 0.9022530.902253 0.9061850.906185 A-8_1_I_D_MA-8_1_I_D_M A-8_2_I_D_MA-8_2_I_D_M A-8_3_I_D_MA-8_3_I_D_M A-8_4_I_D_MA-8_4_I_D_M A-8_5_I_D_MA-8_5_I_D_M A-8_6_I_D_MA-8_6_I_D_M 0.6984920.698492 0.8372750.837275 0.8328350.832835 0.7677810.767781 0.8049730.804973 0.7561870.756187 B-1_1_I_D_MB-1_1_I_D_M B-1_2_I_D_MB-1_2_I_D_M B-1_3_I_D_MB-1_3_I_D_M B-1_4_I_D_MB-1_4_I_D_M B-1_5_I_D_MB-1_5_I_D_M B-1_6_I_D_MB-1_6_I_D_M 0.3452310.345231 0.5988570.598857 0.6558710.655871 0.7377160.737716 0.8386560.838656 0.8933590.893359 B-2_1_I_D_MB-2_1_I_D_M B-2_2_I_D_MB-2_2_I_D_M B-2_3_I_D_MB-2_3_I_D_M B-2_4_I_D_MB-2_4_I_D_M B-2_5_I_D_MB-2_5_I_D_M B-2_6_I_D_MB-2_6_I_D_M 0.3458870.345887 0.7028560.702856 0.708730.70873 0.8294260.829426 0.8898510.889851 0.9113880.911388

B-3_1_I_D_MB-3_1_I_D_M B-3_2_I_D_MB-3_2_I_D_M B-3_3_I_D_MB-3_3_I_D_M B-3_4_I_D_MB-3_4_I_D_M B-3_5_I_D_MB-3_5_I_D_M B-3_6_I_D_MB-3_6_I_D_M 0.33180.3318 0.64680.6468 0.7131970.713197 0.7052590.705259 0.8930050.893005 0.8771130.877113 B-4_1_I_D_MB-4_1_I_D_M B-4_2_I_D_MB-4_2_I_D_M B-4_3_I_D_MB-4_3_I_D_M B-4_4_I_D_MB-4_4_I_D_M B-4_5_I_D_MB-4_5_I_D_M B-4_6_I_D_MB-4_6_I_D_M 0.2921720.292172 0.7329550.732955 0.6782730.678273 0.7392010.739201 0.8484890.848489 0.8292230.829223 B-5_1_I_D_MB-5_1_I_D_M B-5_2_I_D_MB-5_2_I_D_M B-5_3_I_D_MB-5_3_I_D_M B-5_4_I_D_MB-5_4_I_D_M B-5_5_I_D_MB-5_5_I_D_M B-5_6_I_D_MB-5_6_I_D_M 0.3084950.308495 0.7798280.779828 0.7247580.724758 0.760870.76087 0.9264330.926433 0.8903520.890352 B-6_1_I_D_MB-6_1_I_D_M B-6_2_I_D_MB-6_2_I_D_M B-6_3_I_D_MB-6_3_I_D_M B-6_4_I_D_MB-6_4_I_D_M B-6_5_I_D_MB-6_5_I_D_M B-6_6_I_D_MB-6_6_I_D_M 0.4431820.443182 0.6202920.620292 0.8227650.822765 0.7886440.788644 0.9274260.927426 0.9176310.917631 B-7_1_I_D_MB-7_1_I_D_M B-7_2_I_D_MB-7_2_I_D_M B-7_3_I_D_MB-7_3_I_D_M B-7_4_I_D_MB-7_4_I_D_M B-7_5_I_D_MB-7_5_I_D_M B-7_6_I_D_MB-7_6_I_D_M 0.4960280.496028 0.805420.80542 0.8376610.837661 0.862740.86274 0.9173960.917396 0.9174020.917402 B-8_1_I_D_MB-8_1_I_D_M B-8_2_I_D_MB-8_2_I_D_M B-8_3_I_D_MB-8_3_I_D_M B-8_4_I_D_MB-8_4_I_D_M B-8_5_I_D_MB-8_5_I_D_M B-8_6_I_D_MB-8_6_I_D_M 0.2478640.247864 0.6216260.621626 0.8037630.803763 0.6083890.608389 0.8928660.892866 0.8816860.881686

Table 24. Entropy

A-1_1_熵A-1_1_entropy A-1_2_熵A-1_2_Entropy A-1_3_熵A-1_3_Entropy A-1_4_熵A-1_4_Entropy A-1_5_熵A-1_5_Entropy A-1_6_熵A-1_6_Entropy 0.1473650.147365 0.2498420.249842 0.8099250.809925 0.6747050.674705 0.9910990.991099 0.7074310.707431 A-2_1_熵A-2_1_Entropy A-2_2_熵A-2_2_entropy A-2_3_熵A-2_3_Entropy A-2_4_熵A-2_4_Entropy A-2_5_熵A-2_5_Entropy A-2_6_熵A-2_6_Entropy 0.2576540.257654 0.4673250.467325 0.7446890.744689 0.6345340.634534 0.8414020.841402 0.7194840.719484 A-3_1_熵A-3_1_Entropy A-3_2_熵A-3_2_Entropy A-3_3_熵A-3_3_entropy A-3_4_熵A-3_4_Entropy A-3_5_熵A-3_5_Entropy A-3_6_熵A-3_6_Entropy 0.2576540.257654 0.783440.78344 0.9418540.941854 0.5776940.577694 0.866420.86642 0.6386370.638637 A-4_1_熵A-4_1_Entropy A-4_2_熵A-4_2_Entropy A-4_3_熵A-4_3_Entropy A-4_4_熵A-4_4_entropy A-4_5_熵A-4_5_Entropy A-4_6_熵A-4_6_Entropy 0.53310.5331 0.4350410.435041 0.8920070.892007 0.7376860.737686 0.9260490.926049 0.5270860.527086 A-5_1_熵A-5_1_Entropy A-5_2_熵A-5_2_Entropy A-5_3_熵A-5_3_Entropy A-5_4_熵A-5_4_Entropy A-5_5_熵A-5_5_entropy A-5_6_熵A-5_6_Entropy 0.2131010.213101 0.6738180.673818 0.6096550.609655 0.5698820.569882 0.4831880.483188 0.5426280.542628 A-6_1_熵A-6_1_Entropy A-6_2_熵A-6_2_Entropy A-6_3_熵A-6_3_Entropy A-6_4_熵A-6_4_Entropy A-6_5_熵A-6_5_Entropy A-6_6_熵A-6_6_entropy 0.1473650.147365 0.4027570.402757 0.5332420.533242 0.5426280.542628 0.4021090.402109 0.3707030.370703 A-7_1_熵A-7_1_Entropy A-7_2_熵A-7_2_Entropy A-7_3_熵A-7_3_Entropy A-7_4_熵A-7_4_Entropy A-7_5_熵A-7_5_Entropy A-7_6_熵A-7_6_Entropy

0.1473650.147365 0.4491940.449194 0.5049840.504984 0.4192450.419245 0.3394530.339453 0.3145770.314577 A-8_1_熵A-8_1_Entropy A-8_2_熵A-8_2_Entropy A-8_3_熵A-8_3_Entropy A-8_4_熵A-8_4_Entropy A-8_5_熵A-8_5_Entropy A-8_6_熵A-8_6_Entropy 0.4590460.459046 0.4857280.485728 0.5994590.599459 0.8272730.827273 0.634160.63416 0.8626120.862612 B-1_1_熵B-1_1_Entropy B-1_2_熵B-1_2_Entropy B-1_3_熵B-1_3_Entropy B-1_4_熵B-1_4_Entropy B-1_5_熵B-1_5_Entropy B-1_6_熵B-1_6_Entropy 1.1352931.135293 1.3726491.372649 0.9892720.989272 0.9076890.907689 0.6233990.623399 0.4077730.407773 B-2_1_熵B-2_1_Entropy B-2_2_熵B-2_2_Entropy B-2_3_熵B-2_3_Entropy B-2_4_熵B-2_4_Entropy B-2_5_熵B-2_5_Entropy B-2_6_熵B-2_6_Entropy 0.9602010.960201 1.0472551.047255 1.1029551.102955 0.806230.80623 0.4602220.460222 0.3394530.339453 B-3_1_熵B-3_1_Entropy B-3_2_熵B-3_2_Entropy B-3_3_熵B-3_3_Entropy B-3_4_熵B-3_4_Entropy B-3_5_熵B-3_5_Entropy B-3_6_熵B-3_6_Entropy 1.2267851.226785 1.2592161.259216 0.9448220.944822 1.0489461.048946 0.3648060.364806 0.3942970.394297 B-4_1_熵B-4_1_Entropy B-4_2_熵B-4_2_Entropy B-4_3_熵B-4_3_Entropy B-4_4_熵B-4_4_Entropy B-4_5_熵B-4_5_Entropy B-4_6_熵B-4_6_Entropy 1.2762761.276276 1.0921161.092116 1.0666061.066606 1.018841.01884 0.5027520.502752 0.5948690.594869 B-5_1_熵B-5_1_Entropy B-5_2_熵B-5_2_Entropy B-5_3_熵B-5_3_Entropy B-5_4_熵B-5_4_Entropy B-5_5_熵B-5_5_Entropy B-5_6_熵B-5_6_Entropy 1.3111381.311138 1.0983371.098337 0.7074740.707474 0.7697070.769707 0.3067650.306765 0.4048250.404825 B-6_1_熵B-6_1_Entropy B-6_2_熵B-6_2_Entropy B-6_3_熵B-6_3_Entropy B-6_4_熵B-6_4_Entropy B-6_5_熵B-6_5_Entropy B-6_6_熵B-6_6_Entropy 0.9671830.967183 1.216741.21674 0.7654650.765465 0.8333320.833332 0.3028270.302827 0.3618570.361857 B-7_1_熵B-7_1_Entropy B-7_2_熵B-7_2_Entropy B-7_3_熵B-7_3_Entropy B-7_4_熵B-7_4_Entropy B-7_5_熵B-7_5_Entropy B-7_6_熵B-7_6_Entropy 1.0106071.010607 1.0274961.027496 0.5855070.585507 0.6622550.662255 0.2779050.277905 0.3540450.354045 B-8_1_熵B-8_1_Entropy B-8_2_熵B-8_2_Entropy B-8_3_熵B-8_3_Entropy B-8_4_熵B-8_4_Entropy B-8_5_熵B-8_5_Entropy B-8_6_熵B-8_6_Entropy 1.258161.25816 1.2355781.235578 0.8079470.807947 0.9753530.975353 0.255310.25531 0.5994590.599459

Table 25. Correlation

A-1_1_CA-1_1_C A-1_2_CA-1_2_C A-1_3_CA-1_3_C A-1_4_CA-1_4_C A-1_5_CA-1_5_C A-1_6_CA-1_6_C 0.4534880.453488 0.6918890.691889 0.6117680.611768 0.71480.7148 0.624260.62426 0.7323090.732309 A-2_1_CA-2_1_C A-2_2_CA-2_2_C A-2_3_CA-2_3_C A-2_4_CA-2_4_C A-2_5_CA-2_5_C A-2_6_CA-2_6_C 0.4596530.459653 0.7358350.735835 0.7333020.733302 0.7153120.715312 0.6970150.697015 0.7328490.732849 A-3_1_CA-3_1_C A-3_2_CA-3_2_C A-3_3_CA-3_3_C A-3_4_CA-3_4_C A-3_5_CA-3_5_C A-3_6_CA-3_6_C 0.4958420.495842 0.6523040.652304 0.6537170.653717 0.678980.67898 0.6568040.656804 0.7466050.746605

A-4_1_CA-4_1_C A-4_2_CA-4_2_C A-4_3_CA-4_3_C A-4_4_CA-4_4_C A-4_5_CA-4_5_C A-4_6_CA-4_6_C 0.353450.35345 0.6516180.651618 0.6663320.666332 0.5994820.599482 0.6296910.629691 0.7429860.742986 A-5_1_CA-5_1_C A-5_2_CA-5_2_C A-5_3_CA-5_3_C A-5_4_CA-5_4_C A-5_5_CA-5_5_C A-5_6_CA-5_6_C 0.5957310.595731 0.7128440.712844 0.7219260.721926 0.7043750.704375 0.7562680.756268 0.7319470.731947 A-6_1_CA-6_1_C A-6_2_CA-6_2_C A-6_3_CA-6_3_C A-6_4_CA-6_4_C A-6_5_CA-6_5_C A-6_6_CA-6_6_C 0.7132270.713227 0.7749560.774956 0.8449110.844911 0.8371040.837104 0.840230.84023 0.8404480.840448 A-7_1_CA-7_1_C A-7_2_CA-7_2_C A-7_3_CA-7_3_C A-7_4_CA-7_4_C A-7_5_CA-7_5_C A-7_6_CA-7_6_C 0.7132270.713227 0.808980.80898 0.8543490.854349 0.8205230.820523 0.8645540.864554 0.8654750.865475 A-8_1_CA-8_1_C A-8_2_CA-8_2_C A-8_3_CA-8_3_C A-8_4_CA-8_4_C A-8_5_CA-8_5_C A-8_6_CA-8_6_C 0.5833630.583363 0.7329090.732909 0.7457910.745791 0.7232270.723227 0.7061810.706181 0.706560.70656 B-1_1_CB-1_1_C B-1_2_CB-1_2_C B-1_3_CB-1_3_C B-1_4_CB-1_4_C B-1_5_CB-1_5_C B-1_6_CB-1_6_C -0.06096-0.06096 0.028360.02836 0.1810450.181045 0.071720.07172 -0.04784-0.04784 -0.17716-0.17716 B-2_1_CB-2_1_C B-2_2_CB-2_2_C B-2_3_CB-2_3_C B-2_4_CB-2_4_C B-2_5_CB-2_5_C B-2_6_CB-2_6_C -0.11077-0.11077 -0.02076-0.02076 -0.04274-0.04274 -0.16053-0.16053 -0.17078-0.17078 -0.18233-0.18233 B-3_1_CB-3_1_C B-3_2_CB-3_2_C B-3_3_CB-3_3_C B-3_4_CB-3_4_C B-3_5_CB-3_5_C B-3_6_CB-3_6_C -0.08765-0.08765 -0.03503-0.03503 -0.05094-0.05094 -0.15471-0.15471 -0.0704-0.0704 -0.15323-0.15323 B-4_1_CB-4_1_C B-4_2_CB-4_2_C B-4_3_CB-4_3_C B-4_4_CB-4_4_C B-4_5_CB-4_5_C B-4_6_CB-4_6_C -0.0882-0.0882 -0.05643-0.05643 0.0253840.025384 0.1739750.173975 -0.14326-0.14326 -0.09191-0.09191 B-5_1_CB-5_1_C B-5_2_CB-5_2_C B-5_3_CB-5_3_C B-5_4_CB-5_4_C B-5_5_CB-5_5_C B-5_6_CB-5_6_C -0.12153-0.12153 -0.12012-0.12012 -0.12315-0.12315 0.0227570.022757 -0.19152-0.19152 -0.24968-0.24968 B-6_1_CB-6_1_C B-6_2_CB-6_2_C B-6_3_CB-6_3_C B-6_4_CB-6_4_C B-6_5_CB-6_5_C B-6_6_CB-6_6_C 0.1361080.136108 -0.03391-0.03391 -0.20318-0.20318 0.0279260.027926 -0.11888-0.11888 0.0347830.034783 B-7_1_CB-7_1_C B-7_2_CB-7_2_C B-7_3_CB-7_3_C B-7_4_CB-7_4_C B-7_5_CB-7_5_C B-7_6_CB-7_6_C 0.0643150.064315 -0.18657-0.18657 -0.19499-0.19499 -0.2193-0.2193 -0.12311-0.12311 -0.09605-0.09605 B-8_1_CB-8_1_C B-8_2_CB-8_2_C B-8_3_CB-8_3_C B-8_4_CB-8_4_C B-8_5_CB-8_5_C B-8_6_CB-8_6_C 0.1858170.185817 -0.05458-0.05458 -0.06841-0.06841 0.1200480.120048 0.0050320.005032 -0.03155-0.03155

Table 26 Mean of sum

[Table 26 (mean of sum) appears only as an image in the original publication (Figure BDA0000458431300000361); its numerical values are not reproduced in the text.]

Table 27 Mean of difference

Case      1          2          3          4          5          6
A-1       4.330357   2.138244   2.994348   2.142065   2.835198   1.997181
A-2       4.200893   2.104315   2.19344    2.239163   2.377385   2.019957
A-3       3.879464   2.75506    2.739019   2.507524   2.533449   2.016009
A-4       4.889881   2.667708   2.62626    3.174323   2.70422    2.090096
A-5       3.087798   2.303869   2.19344    2.302779   1.84492    2.148413
A-6       2.232143   1.689137   1.21389    1.342886   1.207465   1.282544
A-7       2.232143   1.553125   1.202153   1.374208   1.05587    1.044899
A-8       3.361607   2.006845   1.915359   2.28845    2.008073   2.340365
B-1       3.110119   1.439583   1.003132   0.857143   0.535334   0.376744
B-2       3.212798   0.959375   1.100518   0.656178   0.374198   0.299769
B-3       3.43006    1.245685   0.985383   1.114523   0.338087   0.364959
B-4       3.641369   0.976935   1.210397   0.853183   0.549405   0.552794
B-5       3.455357   0.912946   0.920111   0.751692   0.269593   0.430039
B-6       2.845238   1.385119   0.744168   0.758785   0.265551   0.302827
B-7       2.665179   1.012649   0.705357   0.521025   0.317952   0.29919
B-8       3.28125    1.248958   0.714754   1.215978   0.300066   0.368643

Table 28 Entropy of sum

Case      1          2          3          4          5          6
A-1       0.612069   1.011892   2.75211    2.559395   2.653717   2.357032
A-2       0.947376   1.724922   2.397386   2.264199   2.474822   2.136188
A-3       1.011892   2.539099   2.962306   1.771709   2.437167   2.382927
A-4       2.028047   1.531374   2.856791   2.269515   2.485508   1.92766
A-5       0.748327   2.132039   2.136188   2.055342   1.836225   2.264199
A-6       0.612069   1.571539   1.838792   2.058986   1.853822   1.836225
A-7       0.612069   1.911607   1.905091   1.727867   1.942689   1.612649
A-8       1.914239   1.671989   2.242653   2.382927   2.41185    2.593117
B-1       2.943925   3.07012    2.694398   2.619668   2.056985   1.938491
B-2       2.861919   2.483775   2.804868   2.191245   1.778759   1.726835
B-3       3.186176   2.732559   2.705314   2.602177   1.726835   1.726835
B-4       3.366478   2.684184   2.921308   2.722804   2.083337   2.235981
B-5       3.097308   2.721316   2.339401   2.260333   1.656175   1.911607
B-6       3.081967   2.726498   2.359926   2.417582   1.637968   1.809459
B-7       3.18791    2.696965   1.954305   2.050971   1.612649   1.72128
B-8       3.471159   2.899189   2.437167   2.71465    1.466858   1.825372

Table 29 Entropy of difference

Case      1          2          3          4          5          6
A-1       0.668564   1.311278   3          2.899397   2.875      2.477217
A-2       1.311278   1.621641   2.82782    2.655639   3.030639   2.423795
A-3       1.311278   3.125      2.95282    2.483459   2.521782   2.305037
A-4       2.271782   2          3.125      2.858459   2.5        2.125
A-5       0.993393   2.655639   2.483459   2.57782    2.555037   2.436278
A-6       0.668564   2.07782    2.298795   2.649397   2.405639   2.280639
A-7       0.668564   2.349602   2.375      2.030639   1.794737   2.07782
A-8       2.375      2.07782    2.75       2.75       2.780639   3.030639
B-1       3.155639   3.405639   2.521782   2.771782   2.521782   2.349602
B-2       3.32782    2.774397   3.25       2.474602   2.530639   2.20282
B-3       3.45282    3.108459   2.875      2.95282    2.07782    2.20282
B-4       3.45282    3          3.280639   2.646782   2.655639   2.608459
B-5       3.20282    3.024397   2.82782    2.477217   2.20282    2.548795
B-6       3.108459   2.95282    2.852217   2.852217   2.07782    2.271782
B-7       3.625      2.95282    2.20282    2.436278   2.305037   2.25
B-8       3.024397   3.07782    2.75       3          1.919737   2.375
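Tables 26–29 report the four sum/difference statistics named in the claims. Their formulas are not restated in the patent; the standard definitions they presumably follow use the sum and difference marginals of the normalized co-occurrence matrix $p(i,j)$:

$$p_{x+y}(k)=\sum_{\substack{i,j\\ i+j=k}} p(i,j), \qquad p_{x-y}(k)=\sum_{\substack{i,j\\ |i-j|=k}} p(i,j)$$

$$\text{Mean of sum}=\sum_k k\,p_{x+y}(k), \qquad \text{Mean of difference}=\sum_k k\,p_{x-y}(k)$$

$$\text{Entropy of sum}=-\sum_k p_{x+y}(k)\log p_{x+y}(k), \qquad \text{Entropy of difference}=-\sum_k p_{x-y}(k)\log p_{x-y}(k)$$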

Claims (3)

1. A multidimensional texture extraction method based on brain nuclear magnetic resonance images of a population, characterized in that: a region-growing method is used to segment the required regions out of the brain nuclear magnetic resonance images of the population.
2. The extraction method according to claim 1, characterized in that: a Curvelet transform method is used to extract the texture feature parameters of the required regions, wherein the population comprises an Alzheimer's disease patient group, a mild cognitive impairment patient group and a normal elderly group; the texture feature parameters of the required regions comprise entropy, grey-level mean, correlation, energy, homogeneity, variance, maximum probability, inverse difference, cluster tendency, contrast, mean of sum, mean of difference, entropy of sum and entropy of difference; and the required regions comprise two regions, the entorhinal cortex and the hippocampus.
3. The extraction method according to claim 1, characterized in that: a Contourlet transform method is used to extract the texture feature parameters of the required regions, wherein the population comprises an Alzheimer's disease patient group, a mild cognitive impairment patient group and a normal elderly group; the texture feature parameters of the required regions comprise entropy, grey-level mean, correlation, energy, homogeneity, variance, maximum probability, inverse difference, cluster tendency, contrast, mean of sum, mean of difference, entropy of sum and entropy of difference; and the required regions comprise two regions, the entorhinal cortex and the hippocampus.
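Illustrative sketches follow for the two operations named in the claims. First, the region-growing segmentation of claim 1: the sketch below is a minimal Python/numpy implementation for a single 2-D slice. The seed point, the intensity tolerance and the 4-connectivity are assumptions made for illustration; the patent does not specify them.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol=10.0):
    """Seeded region growing on a 2-D slice.

    A neighbouring pixel joins the region if its intensity differs from the
    running region mean by at most `tol`.
    """
    mask = np.zeros(image.shape, dtype=bool)
    mask[seed] = True
    region_sum, region_n = float(image[seed]), 1
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-connected neighbours
            rr, cc = r + dr, c + dc
            if (0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
                    and not mask[rr, cc]
                    and abs(float(image[rr, cc]) - region_sum / region_n) <= tol):
                mask[rr, cc] = True
                region_sum += float(image[rr, cc])
                region_n += 1
                queue.append((rr, cc))
    return mask
```

A call such as `region_grow(slice_array, seed=(120, 96), tol=15.0)` (seed coordinates hypothetical) would return a boolean mask for the grown region, for example around a seed placed inside the hippocampus.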
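Second, the texture feature parameters of claims 2 and 3. Apart from the grey-level mean, these are grey-level co-occurrence (GLCM) statistics, presumably computed on the subbands of the Curvelet or Contourlet decomposition of the segmented region. The numpy sketch below shows one plausible way to obtain the co-occurrence matrix and these statistics; the 16-level quantization, the (0, 1) displacement, the zero-based grey-level indexing and the cluster-tendency formula are assumptions, and the transforms themselves are not shown.

```python
import numpy as np

def glcm(img, mask, dy=0, dx=1, levels=16):
    """Normalized, symmetric grey-level co-occurrence matrix over a masked
    region for a single pixel displacement (dy, dx)."""
    edges = np.linspace(img.min(), img.max() + 1e-9, levels + 1)
    g = np.digitize(img, edges) - 1          # quantize intensities to 0 .. levels-1
    P = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows - dy):
        for c in range(cols - dx):
            if mask[r, c] and mask[r + dy, c + dx]:
                P[g[r, c], g[r + dy, c + dx]] += 1
    P = P + P.T                              # symmetrize
    total = P.sum()
    return P / total if total else P

def texture_features(P):
    """GLCM statistics corresponding to the parameters listed in the claims.
    Grey levels are indexed from 0, so absolute scales may differ from
    conventions that index from 1."""
    i, j = np.indices(P.shape)
    nz = P > 0
    mu = (i * P).sum()                       # row mean (= column mean, P symmetric)
    var = (((i - mu) ** 2) * P).sum()
    feats = {
        "entropy": -(P[nz] * np.log(P[nz])).sum(),
        "energy": (P ** 2).sum(),
        "contrast": (((i - j) ** 2) * P).sum(),
        "homogeneity": (P / (1.0 + (i - j) ** 2)).sum(),
        "inverse_difference": (P / (1.0 + np.abs(i - j))).sum(),
        "maximum_probability": P.max(),
        "variance": var,
        "correlation": (((i - mu) * (j - mu)) * P).sum() / var if var > 0 else 0.0,
        "cluster_tendency": (((i + j - 2 * mu) ** 2) * P).sum(),
    }
    # sum and difference marginals give the remaining four parameters
    levels = P.shape[0]
    p_sum = np.array([P[i + j == k].sum() for k in range(2 * levels - 1)])
    p_diff = np.array([P[np.abs(i - j) == k].sum() for k in range(levels)])
    ks, kd = np.arange(p_sum.size), np.arange(p_diff.size)
    feats["mean_of_sum"] = (ks * p_sum).sum()
    feats["mean_of_difference"] = (kd * p_diff).sum()
    feats["entropy_of_sum"] = -(p_sum[p_sum > 0] * np.log(p_sum[p_sum > 0])).sum()
    feats["entropy_of_difference"] = -(p_diff[p_diff > 0] * np.log(p_diff[p_diff > 0])).sum()
    return feats
```

Computing such a feature set once per decomposition subband would be consistent with the six indexed values (_1 … _6) reported per case in the tables above, though the patent text shown here does not state that mapping explicitly.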
CN201410023305.4A 2014-01-17 2014-01-17 Multidimensional vein extracting method based on brain nuclear magnetic resonance image Pending CN103793711A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201410023305.4A CN103793711A (en) 2014-01-17 2014-01-17 Multidimensional vein extracting method based on brain nuclear magnetic resonance image
PCT/CN2014/001130 WO2015106374A1 (en) 2014-01-17 2014-12-16 Multidimensional texture extraction method based on brain nuclear magnetic resonance images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410023305.4A CN103793711A (en) 2014-01-17 2014-01-17 Multidimensional vein extracting method based on brain nuclear magnetic resonance image

Publications (1)

Publication Number Publication Date
CN103793711A true CN103793711A (en) 2014-05-14

Family

ID=50669354

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410023305.4A Pending CN103793711A (en) 2014-01-17 2014-01-17 Multidimensional vein extracting method based on brain nuclear magnetic resonance image

Country Status (2)

Country Link
CN (1) CN103793711A (en)
WO (1) WO2015106374A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015106374A1 (en) * 2014-01-17 2015-07-23 首都医科大学 Multidimensional texture extraction method based on brain nuclear magnetic resonance images
WO2015106373A1 (en) * 2014-01-17 2015-07-23 首都医科大学 Method for establishing prediction model based on multidimensional texture of brain nuclear magnetic resonance images
CN104881680A (en) * 2015-05-25 2015-09-02 电子科技大学 Alzheimer's disease and mild cognitive impairment identification method based on two-dimension features and three-dimension features
CN106780372A (en) * 2016-11-30 2017-05-31 华南理工大学 A kind of weight nuclear norm magnetic resonance imaging method for reconstructing sparse based on Generalized Tree
CN109492700A (en) * 2018-11-21 2019-03-19 西安中科光电精密工程有限公司 A kind of Target under Complicated Background recognition methods based on multidimensional information fusion
CN110033019A (en) * 2019-03-06 2019-07-19 腾讯科技(深圳)有限公司 Method for detecting abnormality, device and the storage medium of human body
CN111739645A (en) * 2020-05-14 2020-10-02 上海依智医疗技术有限公司 Training method of immune-related pneumonia prediction model
CN111754598A (en) * 2020-06-27 2020-10-09 昆明理工大学 Transform Learning-Based Parallel MRI Reconstruction Method for Local Spatial Neighborhood
CN113962930A (en) * 2021-09-07 2022-01-21 北京邮电大学 Alzheimer's disease risk assessment model building method and electronic device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108241865B (en) * 2016-12-26 2021-11-02 哈尔滨工业大学 Multi-level quantitative staging method for liver fibrosis based on multi-scale and multi-subgraphs of ultrasound images
US20210228079A1 (en) 2018-03-07 2021-07-29 Institut National De La Sante Et De La Recherche Medicale (Inserm) Method for early prediction of neurodegenerative decline

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101971213A (en) * 2008-02-29 2011-02-09 新加坡科技研究局 A method and system for anatomy structure segmentation and modeling in an image
CN101853493B (en) * 2009-10-21 2013-06-19 首都医科大学 Method for extracting multi-dimensional texture of nodi from medical images
CN103793908A (en) * 2014-01-17 2014-05-14 首都医科大学 Method for constructing prediction model of multifunctional veins based on brain nuclear magnetic resonance image
CN103793711A (en) * 2014-01-17 2014-05-14 首都医科大学 Multidimensional vein extracting method based on brain nuclear magnetic resonance image

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015106373A1 (en) * 2014-01-17 2015-07-23 首都医科大学 Method for establishing prediction model based on multidimensional texture of brain nuclear magnetic resonance images
WO2015106374A1 (en) * 2014-01-17 2015-07-23 首都医科大学 Multidimensional texture extraction method based on brain nuclear magnetic resonance images
CN104881680A (en) * 2015-05-25 2015-09-02 电子科技大学 Alzheimer's disease and mild cognitive impairment identification method based on two-dimension features and three-dimension features
CN106780372A (en) * 2016-11-30 2017-05-31 华南理工大学 A kind of weight nuclear norm magnetic resonance imaging method for reconstructing sparse based on Generalized Tree
CN106780372B (en) * 2016-11-30 2019-06-18 华南理工大学 A Weighted Kernel Norm Magnetic Resonance Imaging Reconstruction Method Based on Generalized Tree Sparse
CN109492700B (en) * 2018-11-21 2020-09-08 西安中科光电精密工程有限公司 Complex background target identification method based on multi-dimensional information fusion
CN109492700A (en) * 2018-11-21 2019-03-19 西安中科光电精密工程有限公司 A kind of Target under Complicated Background recognition methods based on multidimensional information fusion
CN110033019A (en) * 2019-03-06 2019-07-19 腾讯科技(深圳)有限公司 Method for detecting abnormality, device and the storage medium of human body
CN110033019B (en) * 2019-03-06 2021-07-27 腾讯科技(深圳)有限公司 Method and device for detecting abnormality of human body part and storage medium
CN111739645A (en) * 2020-05-14 2020-10-02 上海依智医疗技术有限公司 Training method of immune-related pneumonia prediction model
CN111739645B (en) * 2020-05-14 2024-01-30 北京深睿博联科技有限责任公司 Training method of immune-related pneumonia prediction model
CN111754598A (en) * 2020-06-27 2020-10-09 昆明理工大学 Transform Learning-Based Parallel MRI Reconstruction Method for Local Spatial Neighborhood
CN111754598B (en) * 2020-06-27 2022-02-25 昆明理工大学 Transform Learning-Based Parallel MRI Reconstruction Method for Local Spatial Neighborhood
CN113962930A (en) * 2021-09-07 2022-01-21 北京邮电大学 Alzheimer's disease risk assessment model building method and electronic device
CN113962930B (en) * 2021-09-07 2022-09-09 北京邮电大学 Alzheimer disease risk assessment model establishing method and electronic equipment

Also Published As

Publication number Publication date
WO2015106374A1 (en) 2015-07-23

Similar Documents

Publication Publication Date Title
CN103793711A (en) Multidimensional vein extracting method based on brain nuclear magnetic resonance image
Zhang et al. Automatic skin lesion segmentation by coupling deep fully convolutional networks and shallow network with textons
CN113902761B (en) Knowledge distillation-based unsupervised segmentation method for lung disease focus
CN103793908A (en) Method for constructing prediction model of multifunctional veins based on brain nuclear magnetic resonance image
CN108428229A (en) It is a kind of that apparent and geometric properties lung's Texture Recognitions are extracted based on deep neural network
CN104881680A (en) Alzheimer's disease and mild cognitive impairment identification method based on two-dimension features and three-dimension features
CN111462116A (en) Multimodal parameter model optimization fusion method based on radiomics features
CN103400154B (en) A kind of based on the human motion recognition method having supervision Isometric Maps
CN115496771A (en) A Brain Tumor Segmentation Method Based on Brain 3D MRI Image Design
Yang et al. Skull sex estimation based on wavelet transform and Fourier transform
CN108549912A (en) A kind of medical image pulmonary nodule detection method based on machine learning
CN111340770B (en) A cancer prognostic model building method combining global weighted LBP and texture analysis
CN108319969B (en) A method and system for glioma survival prediction based on a sparse representation framework
CN105809175A (en) Encephaledema segmentation method and system based on support vector machine algorithm
CN104268873A (en) Breast tumor partition method based on nuclear magnetic resonance images
CN118262875A (en) Medical image diagnosis and contrast film reading method
CN101609485B (en) Medical Imaging Diagnosis System and Diagnosis Method Based on Migration Kernel Matching and Tracking
CN104281856A (en) Image preprocessing method and system for brain medical image classification
CN101853493B (en) Method for extracting multi-dimensional texture of nodi from medical images
CN105512670A (en) HRCT peripheral nerve cutting based on KECA feature dimension reduction and clustering
CN108090507A (en) A kind of medical imaging textural characteristics processing method based on integrated approach
Roy et al. Heterogeneity of human brain tumor with lesion identification, localization, and analysis from MRI
CN113610746A (en) Image processing method and device, computer equipment and storage medium
CN112614093A (en) Breast pathology image classification method based on multi-scale space attention network
Zeng et al. Abnormality detection via iterative deformable registration and basis-pursuit decomposition

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20140514