CN103793711A - Multidimensional texture extraction method based on brain nuclear magnetic resonance image - Google Patents
Multidimensional texture extraction method based on brain nuclear magnetic resonance image
- Publication number
- CN103793711A (application CN201410023305.4A)
- Authority
- CN
- China
- Prior art keywords
- entropy
- energy
- difference
- correlation
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/52—Scale-space analysis, e.g. wavelet analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
Abstract
The invention discloses a multidimensional texture extraction method based on brain nuclear magnetic resonance (MRI) images. Regions of interest (ROIs) in the brain MRI image are segmented with a region growing method, and the texture feature parameters of the ROIs are extracted with the Curvelet transform and the Contourlet transform. The subject population comprises an Alzheimer's disease patient group, a mild cognitive impairment patient group and a normal elderly group. The texture feature parameters of the ROIs comprise entropy, gray mean, correlation, energy, homogeneity, variance, maximum probability, inverse difference moment, cluster tendency, contrast, sum average, difference average, sum entropy and difference entropy. The ROIs comprise the entorhinal cortex and the hippocampus.
Description
Technical field:
The invention belongs to the field of medical technology and specifically relates to a multidimensional texture extraction method based on brain nuclear magnetic resonance imaging (MRI).
Background technology:
In the auxiliary diagnosis of early Alzheimer's disease (AD), characterizing the ROIs in MRI images (including the entorhinal cortex and hippocampus) is highly significant. However, MRI imaging alone can only use hippocampal atrophy as one index for distinguishing patients from normal subjects; a physician's interpretation of MRI images is influenced by subjective factors, lacks consistency, and makes it difficult to accurately evaluate the severity of a dementia patient's symptoms.
1. Existing image processing techniques include the following five:
1) region growing method (Region-growing Method):
The method exploits the local spatial information of an image and can effectively overcome the spatially discontinuous segmentations produced by other methods, but it has not yet been used for the processing of brain MRI images.
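To make the region growing step concrete, the following is a minimal illustrative sketch in Python (not the patent's own Matlab program): it grows a region from a seed pixel by adding 4-connected neighbours whose gray value stays within a threshold of the seed value. The image array, seed coordinates and the threshold of 35 (echoing the value used later in the embodiment) are assumptions for illustration.

```python
import numpy as np

def region_grow(image, seed, threshold=35):
    """Return a boolean mask of pixels 4-connected to `seed` whose gray value
    differs from the seed value by less than `threshold`."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = float(image[seed])
    stack = [seed]
    while stack:
        r, c = stack.pop()
        if mask[r, c]:
            continue
        mask[r, c] = True
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(float(image[nr, nc]) - seed_val) < threshold:
                    stack.append((nr, nc))
    return mask
```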
2) gray level co-occurrence matrix (GLCM):
Only a small number of related studies have extracted texture feature parameters with the gray level co-occurrence matrix method, which is far from sufficient for diagnosing early AD and MCI from brain MRI texture features.
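For reference, a hedged sketch of GLCM texture feature extraction with scikit-image (an assumed library choice; the patent does not specify it). `roi` is taken to be a 2-D uint8 array holding a segmented region of interest.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(roi):
    """Average a few standard GLCM statistics over four orientations."""
    glcm = graycomatrix(roi, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    return {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
```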
3) wavelet transformation (Wavelet Transformation):
Although the feature vectors formed by the wavelet transform can describe an image fairly accurately to a certain extent, using the wavelet transform to extract ROI texture features from images suffers from limited retrieval precision.
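As an illustration of the wavelet approach discussed above, a small sketch using PyWavelets (an assumed library choice, not the patent's program); it summarizes the detail subbands of a 2-D wavelet decomposition into simple statistics.

```python
import numpy as np
import pywt

def wavelet_features(roi, wavelet="db4", level=2):
    """Mean absolute value and standard deviation of each detail subband."""
    coeffs = pywt.wavedec2(roi.astype(float), wavelet=wavelet, level=level)
    feats = []
    for detail in coeffs[1:]:          # each entry is (cH, cV, cD) for one level
        for band in detail:
            feats.append(float(np.mean(np.abs(band))))
            feats.append(float(np.std(band)))
    return np.array(feats)
```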
4) second generation wavelet transform (Curvelet transform):
Continuing the development of the wavelet transform that matured in the late 1980s, Sweldens proposed the lifting scheme ("second generation" wavelets) in 1996, and the basis-function algorithms kept improving: E. J. Candès proposed the Ridgelet transform in 1998, and E. J. Candès and D. L. Donoho introduced the Curvelet transform in 1999, whose atoms are indexed by a scale 2^{-j}, an orientation angle θ_l and a position parameter, with r the radial variable of the transform; a fast discrete Curvelet transform followed in 2006. The second generation wavelet transform not only retains the multiscale advantage of the wavelet transform but is also anisotropic, so it can approximate singular curves well. It is better suited than the wavelet transform to analyzing curves and edge features in two-dimensional images, has higher approximation accuracy and better sparse representation ability, and can provide a more accurate multiresolution analysis of images than the wavelet transform.
5) Contourlet transform
The Contourlet transform inherits the anisotropic scaling relation of the Curvelet transform and is, in a sense, another realization of the Curvelet transform. Its basic idea is to first capture edge singular points with a wavelet-like multiscale decomposition, and then to aggregate singular points at nearby positions into contour segments according to directional information.
The Contourlet transform consists of two parts: a Laplacian pyramid (LP) filter structure and a two-dimensional directional filter bank (DFB). The LP decomposition first produces a low-pass, down-sampled approximation of the original signal and a difference image between the original image and the low-pass prediction; the resulting low-pass image is then decomposed again into the next level's low-pass and difference images, and this progressive filtering yields a multiresolution decomposition of the image. The DFB uses fan-shaped conjugate mirror filter banks to avoid modulating the input signal, and turns an l-level binary-tree-structured directional filter into a structure of 2^l parallel channels.
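To illustrate only the Laplacian pyramid (LP) stage described above, a hedged OpenCV sketch follows; the directional filter bank (DFB) stage is omitted, so this is not a complete Contourlet transform, and the use of OpenCV is an assumption for illustration.

```python
import cv2
import numpy as np

def laplacian_pyramid(image, levels=3):
    """Return the band-pass (prediction error) images plus the final low-pass image."""
    pyramid, current = [], image.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(current)                                   # low-pass + subsample
        up = cv2.pyrUp(down, dstsize=(current.shape[1], current.shape[0]))
        pyramid.append(current - up)                                  # error (band-pass) image
        current = down
    pyramid.append(current)                                           # coarsest low-pass image
    return pyramid
```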
The Contourlet transform is a new two-dimensional image representation with multiresolution, local positioning, multidirectionality, near-critical sampling and anisotropy; its basis functions are distributed over multiple scales and directions, and a small number of coefficients can effectively capture the edge contours in an image, which are exactly the principal features of the image.
However, when these new methods are applied to MRI images of different body parts, new algorithms must be constructed from the basis functions and suitable parameters must be chosen, so many theoretical questions remain worth studying. The Contourlet transform has been successfully applied to practical problems such as image fusion, but reports of its use for brain image texture feature extraction are extremely rare. In the literature consulted so far, only GLCM and the wavelet transform have been used to extract texture features from AD-group and normal-group brain MRI images and to build prediction models; comparing the two by diagnostic accuracy showed that modeling with GLCM-extracted texture outperformed the wavelet transform. No one has yet reported using the second generation wavelet (Curvelet) transform and the Contourlet transform to extract texture from AD brain MRI images. Therefore, the advantages of the above prior art need to be combined, their shortcomings overcome, and the image processing methods improved, in order to improve the diagnosis of early AD and MCI.
2. Commonly used prediction model:
Support vector machine (Support Vector Machine, SVM):
The support vector machine is a machine learning method built on the VC-dimension theory of statistical learning and the principle of structural risk minimization. Its mechanism is to find an optimal separating hyperplane that satisfies the classification requirements, so that the hyperplane maximizes the margin on both sides while guaranteeing classification accuracy.
In theory, the support vector machine achieves optimal classification of linearly separable data. Taking two-class classification as an example, given a training sample set (x_i, y_i), i = 1, 2, ..., l, with y_i ∈ {±1}, the hyperplane is written (w·x) + b = 0. For the classification surface to classify all samples correctly and possess a margin, it must satisfy the constraint y_i[(w·x_i) + b] ≥ 1, i = 1, 2, ..., l. The margin can then be computed as 2/||w||, so constructing the optimal hyperplane reduces to solving, under this constraint,

min (1/2)||w||².

To solve this constrained optimization problem, the Lagrange function

L(w, b, α) = (1/2)||w||² − Σ_{i=1}^{l} α_i { y_i[(w·x_i) + b] − 1 }

is introduced, where α_i ≥ 0 are the Lagrange multipliers. The solution of the constrained optimization problem is determined by the saddle point of the Lagrange function, at which the partial derivatives with respect to w and b vanish, so this QP problem is converted into the corresponding dual problem:

max Σ_{i=1}^{l} α_i − (1/2) Σ_{i=1}^{l} Σ_{j=1}^{l} α_i α_j y_i y_j (x_i·x_j), subject to Σ_{i=1}^{l} α_i y_i = 0 and α_i ≥ 0.

Solving for the optimal solution α* = (α_1*, ..., α_l*), the optimal weight vector w* and optimal bias b* are, respectively,

w* = Σ_{i=1}^{l} α_i* y_i x_i, b* = y_j − (w*·x_j) for any support vector x_j.

The optimal hyperplane is (w*·x) + b* = 0, and the optimal classification function is

f(x) = sgn[(w*·x) + b*].
For linearly inseparable data, the main idea of the SVM is to map the input vectors into a high-dimensional feature space and to construct the optimal classification surface there. The input x is mapped from the input space R^n to the feature space H by a transformation φ:

x → φ(x) = (φ_1(x), φ_2(x), ..., φ_l(x))^T.

Replacing the input vector x with the feature vector φ(x), the optimal classification function becomes

f(x) = sgn[ Σ_{i=1}^{l} α_i* y_i (φ(x_i)·φ(x)) + b* ].
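A hedged scikit-learn sketch of the SVM classifier just described (the patent presents the method mathematically and mentions Matlab elsewhere; the library, kernel and parameters here are assumptions). `X` stands for a matrix of texture feature parameters per ROI and `y` for the group labels.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_svm(X, y):
    """Standardize the texture features and fit a kernel SVM classifier."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    model.fit(X, y)
    return model

# Illustrative usage on placeholder data (not patent data):
# X = np.random.rand(60, 14); y = np.repeat([0, 1, 2], 20); clf = train_svm(X, y)
```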
Summary of the invention:
The object of the present invention is to provide a method for segmenting MRI images containing the lesions of MCI and early AD patients and the ROIs of the normal elderly population, and for extracting their texture features, and to build several prediction models, so as to detect MCI patients more effectively, diagnose early AD, and observe structural brain changes in the normal elderly population.
The present invention combines the advantages of the prior-art region growing method, gray level co-occurrence matrix, wavelet transform and related methods, improves on them, and uses the Curvelet transform and the Contourlet transform to extract the edge texture features of the ROIs; the texture extraction method is comprehensive and novel and achieves the object of the invention.
Through the following main technical route, the present invention establishes a method (the specific procedure is shown in Fig. 4) for segmenting MRI images containing the lesions of MCI and early AD patients and the ROIs of normal elderly subjects and for extracting their texture features, and accordingly builds prediction models for judging the character of the relevant ROIs:
1) establish an ROI image library;
2) segment the relevant ROIs from the images with the region growing method;
3) process the images with the Curvelet transform and the Contourlet transform and extract the following variables: entropy, gray mean (gray average), correlation, energy, homogeneity, variance, maximum probability, inverse difference moment, cluster tendency, contrast, sum average, difference average, sum entropy and difference entropy;
4) build an image feature parameter database from the variable data obtained in steps 2)-3);
5) build prediction models based on the Curvelet transform and on the Contourlet transform from the database of step 4); the methods for building the prediction models include the support vector machine;
6) verify the parameter data and samples obtained in step 5) repeatedly, revising the prediction models to obtain more accurate models;
7) compare the prediction effect of the models built in step 5) from the Curvelet transform and from the Contourlet transform (an illustrative sketch tying these steps together follows this list).
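The sketch below ties steps 2)-6) together under the same illustrative assumptions as the earlier snippets (`region_grow`, `glcm_features` and `train_svm` are the hypothetical helpers sketched above, not the patent's programs): it segments each image, builds the feature table, trains on roughly 80% of the samples and validates on the rest.

```python
import numpy as np

def build_and_validate(images, seeds, labels, test_fraction=0.2):
    """Segment ROIs, extract texture features, fit an SVM and report test accuracy."""
    feats = []
    for img, seed in zip(images, seeds):
        mask = region_grow(img, seed)                       # step 2): ROI segmentation
        roi = (img * mask).astype(np.uint8)                 # keep gray values inside the ROI
        feats.append(list(glcm_features(roi).values()))     # step 3): texture parameters
    X, y = np.array(feats), np.array(labels)                # step 4): feature database
    n_test = max(1, int(len(y) * test_fraction))
    model = train_svm(X[:-n_test], y[:-n_test])             # step 5): prediction model
    accuracy = float((model.predict(X[-n_test:]) == y[-n_test:]).mean())  # step 6): validation
    return model, accuracy
```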
At present there is no unified definition of image texture; it is generally considered that the texture features of an image describe the variation of gray level or color on an object's surface, a variation that is related to the intrinsic attributes of the object and is the repetition of certain texture primitives.
With the Curvelet transform and the Contourlet transform, the following texture feature parameters can be obtained:
Correlation: measures the gray-level correlation between pixels;
Energy (angular second moment): reflects the uniformity of the gray-level distribution and the fineness of the texture;
Maximum probability: the frequency of occurrence of the most prominent pixel pair;
Inverse difference moment: reflects the smoothness of the image;
Cluster tendency: measures the grouping of pixels with similar gray-level values;
Sum average (Sum-gray average) and difference average (Difference-gray average): give the mean of the gray levels in the image.
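For concreteness, a hedged sketch of several of the statistics listed above, computed from a normalized co-occurrence matrix `P`; the formulas follow the usual Haralick-style definitions that the text paraphrases, and the exact definitions used in the patent may differ.

```python
import numpy as np

def cooccurrence_stats(P):
    """Entropy, energy, maximum probability, inverse difference moment and
    cluster tendency of a (symmetric) co-occurrence matrix."""
    P = P / P.sum()
    i, j = np.indices(P.shape)
    mu = float((i * P).sum())                          # mean gray-level index
    return {
        "entropy": float(-np.sum(P[P > 0] * np.log2(P[P > 0]))),
        "energy": float(np.sum(P ** 2)),               # angular second moment
        "max_probability": float(P.max()),
        "inverse_difference_moment": float(np.sum(P / (1.0 + (i - j) ** 2))),
        "cluster_tendency": float(np.sum(((i + j - 2 * mu) ** 2) * P)),
    }
```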
Using the above method, the present invention has built an early AD prediction model whose judgment accuracy reached 100%.
The following is an example of ROI texture extraction; the steps are as follows:
1. Collect the original MRI images of AD, MCI and normal elderly subjects (the figures take the MRI image of a normal elderly subject as an example), see Fig. 1;
2. Segment the above images with the region growing method to obtain the images shown in Fig. 2 and Fig. 3; the segmentation uses a pre-written program and runs directly.
3. Extract texture feature parameters with the Curvelet transform and the Contourlet transform; each method has its corresponding program, which runs directly, and the extracted texture feature parameters are listed in Tables 2-29.
The results show:
Table 1: Analysis of the differences between orientations for the Curvelet transform and the Contourlet transform
As shown in Table 1, the Contourlet transform reflects the overall texture-value differences among the AD group, the MCI group and the normal group better than the Curvelet transform.
The ROIs were segmented with the region growing method; 80% of the data were chosen as training samples, and the remaining 20% of the data were used as validation samples to judge, with the validated model, whether the character of the ROIs was consistent with the pathological diagnosis. This proved that extracting brain MRI ROI texture with the Contourlet transform and building a prediction model for early AD achieved a sensitivity and specificity of 100%.
From the above data it can be concluded that segmenting brain MRI images with the region growing method and building a prediction model from texture feature parameters extracted by the Contourlet transform has a good effect on the auxiliary diagnosis of early AD.
Beneficial effect:
Applying the region growing method to the segmentation of brain MRI is an innovation of the present invention. Experiments prove that region-growing segmentation followed by texture extraction and modeling outperforms whole-image segmentation followed by texture extraction and modeling, and better preserves the edge information of the ROIs;
The Contourlet transform is an effective method for extracting the internal texture features of brain MRI images; it can extract 14 kinds of texture feature parameters and reflects the texture characteristics of the image more comprehensively.
Description of the drawings:
Fig. 1 is the original MRI image of a normal elderly subject, in which the ROIs are marked with black boxes;
Fig. 2 is the enlarged image of the left-brain ROIs obtained after segmenting Fig. 1 with the region growing method;
Fig. 3 is the enlarged image of the right-brain ROIs obtained after segmenting Fig. 1 with the region growing method;
Fig. 4 is the technical route of multidimensional texture extraction from brain MRI images by the method of the invention;
wherein 1 is the left hippocampus and entorhinal cortex region, and 2 is the right hippocampus and entorhinal cortex region.
Embodiment
The following example introduces a model built by the method of the invention for predicting AD from brain MRI images containing the relevant ROIs. It further illustrates the method of the invention, but does not limit its scope of application; in fact, the method can also be used to judge the character of other types of medical images.
Image source: brain MRI images of AD, MCI and normal elderly subjects shared on the ADNI website, each in .Nii format and read with the MRIcro software;
Method: program in Matlab, segment the ROIs in the above MRI images with the region growing method, and extract the texture feature parameters of the relevant ROIs with the Contourlet transform.
The following is an example of extracting texture feature parameters of brain MRI ROIs; the steps are as follows:
1. Collect 250 original brain MRI images in total, from 20 cases in the AD group, 20 cases in the MCI group and 20 cases in the normal group; the cases are 40-89 years old, with a mean age of 64 and a median age of 60.
2. Segment the images with the region growing method following the conventional procedure, using the pre-written program with the threshold set to 35, and run it directly. Fig. 1 is an example image to be segmented; the resulting images are shown in Fig. 2 and Fig. 3.
3. Extract texture feature parameters with the Curvelet transform and the Contourlet transform respectively; the extracted texture feature parameters are listed in Tables 2-29.
4. Extract 14 texture feature parameters from the brain MRI ROIs with the Curvelet transform and with the Contourlet transform respectively, and build prediction models.
5. Compare the prediction effect of the models built with the two texture extraction methods, the Curvelet transform and the Contourlet transform.
Using 200 brain MRI images from 12 cases as the training sample set and the remaining 50 brain MRI images from 3 cases as test samples for support vector machine classification and prediction, the prediction effect was: sensitivity = 100%; specificity = 100%; agreement rate = 100%.
The results show: the ROIs were segmented with the region growing method, 80% of the data were chosen as training samples and the remaining 20% as validation samples, and the validated model was used to judge whether the character of the ROIs was consistent with the pathological diagnosis; this proved that predicting early AD from texture feature parameters extracted from a small sample of brain MRI ROIs achieved a sensitivity of 100%.
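A hedged sketch of this validation step with scikit-learn (an assumed library choice): an 80/20 split followed by the sensitivity, specificity and agreement-rate figures quoted above; `X` and `y` stand for the texture feature matrix and labels (1 = AD), which are not supplied here.

```python
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def evaluate(X, y):
    """80/20 split, SVM prediction, and sensitivity/specificity/agreement on the test fold."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                              stratify=y, random_state=0)
    pred = SVC(kernel="rbf").fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred, labels=[0, 1]).ravel()
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "agreement_rate": (tp + tn) / len(y_te)}
```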
From the above data it can be concluded that segmenting brain MRI ROIs with the region growing method and building a prediction model from texture feature parameters extracted by the Contourlet transform has a good effect on the auxiliary diagnosis of early AD.
The variables extracted by the Curvelet transform are listed in Tables 2-15, and the variables extracted by the Contourlet transform in Tables 16-29 (taking as an example the texture values of the AD-group case numbered 136_s_0194).
Table 2 gray average
Table 3 standard deviation
Standard deviation 3 | Standard deviation 4 | Standard deviation 5 | Standard deviation 6 |
4.3847711 | 1.42498491 | 1.36818905 | 1.41228397 | 1.462426799 | 1.591283749 |
Standard deviation 7 | Standard deviation 8 | Standard deviation 9 | Standard deviation 10 | Standard deviation 11 | Standard deviation 12 |
1.32885786 | 1.9148351 | 1.41065529 | 1.55030154 | 1.367888924 | 1.405432669 |
Standard deviation 13 | Standard deviation 14 | Standard deviation 15 | Standard deviation 16 | Standard deviation 17 | Standard deviation 18 |
1.30044057 | 1.579707287 | 1.33576328 | 1.18453501 | 1.427709383 | 1.277003212 |
Standard deviation 19 | Standard deviation 20 | Standard deviation 21 | Standard deviation 22 | Standard deviation 23 | Standard deviation 24 |
1.7011195 | 1.443854262 | 1.38209543 | 1.57756675 | 1.392241419 | 1.647125047 |
Standard deviation 25 | Standard deviation 26 | Standard deviation 27 | Standard deviation 28 | Standard deviation 29 | Standard deviation 30 |
1.42259127 | 1.430469425 | 1.51028701 | 1.27215853 | 1.311299465 | 1.527251474 |
Standard deviation 31 | Standard deviation 32 | Standard deviation 33 | Standard deviation 34 | ? | ? |
1.04393555 | 1.52005899 | 1.7647135 | 0.86030865 | ? | ? |
Table 4 Clustering Tendency
Clustering Tendency 3 | Clustering Tendency 4 | Clustering Tendency 5 | Clustering Tendency 6 |
16.3606301 | 16.12439285 | 18.3695153 | 15.9238466 | 13.85499042 | 21.2569512 |
Clustering Tendency 7 | Clustering Tendency 8 | Clustering Tendency 9 | Clustering Tendency 10 | Clustering Tendency 11 | Clustering Tendency 12 |
18.661451 | 18.32069429 | 16.9243141 | 17.3187788 | 14.44784022 | 20.6990795 |
Clustering Tendency 13 | Clustering Tendency 14 | Clustering Tendency 15 | Clustering Tendency 16 | Clustering Tendency 17 | Clustering Tendency 18 |
11.959189 | 14.13415804 | 17.3955427 | 17.5325772 | 16.98576286 | 14.217201 |
Clustering Tendency 19 | Clustering Tendency 20 | Clustering Tendency 21 | Clustering Tendency 22 | Clustering Tendency 23 | Clustering Tendency 24 |
16.8114409 | 15.24536246 | 15.3914861 | 15.5040628 | 15.43258971 | 13.98916839 |
Clustering Tendency 25 | Clustering Tendency 26 | Clustering Tendency 27 | Clustering Tendency 28 | Clustering Tendency 29 | Clustering Tendency 30 |
16.331876 | 19.96832162 | 14.7059386 | 16.0655662 | 16.21379157 | 14.04231151 |
Clustering Tendency 31 | Clustering Tendency 32 | Clustering Tendency 33 | Clustering Tendency 34 | ? | ? |
18.2942587 | 16.86866386 | 16.5871928 | 16.2555715 | ? | ? |
Table 5 homogeneity degree
Table 6 maximum probability
Table 7 energy
Energy 3 | Energy 4 | Energy 5 | Energy 6 |
0.0498905 | 0.1160032 | 0.194533 | 0.173957 | 0.2182537 | 0.12101859 |
Energy 7 | Energy 8 | Energy 9 | Energy 10 | Energy 11 | Energy 12 |
0.1489191 | 0.0817874 | 0.091902 | 0.1012048 | 0.1858977 | 0.10748128 |
Energy 13 | Energy 14 | Energy 15 | Energy 16 | Energy 17 | Energy 18 |
0.345414 | 0.2464471 | 0.154017 | 0.1991889 | 0.1164763 | 0.12492103 |
Energy 19 | Energy 20 | Energy 21 | Energy 22 | Energy 23 | Energy 24 |
0.0749974 | 0.1107258 | 0.164832 | 0.1727838 | 0.1557092 | 0.14539656 |
Energy 25 | Energy 26 | Energy 27 | Energy 28 | Energy 29 | Energy 30 |
0.1146461 | 0.1371698 | 0.110386 | 0.1671202 | 0.4645303 | 0.29653236 |
Energy 31 | Energy 32 | Energy 33 | Energy 34 | ? | ? |
0.3057156 | 0.0939511 | 0.084532 | 0.5190262 | ? | ? |
Table 8 Moment of inertia (contrast)
Table 9 Inverse difference moment
Table 10 entropy
Entropy 3 | Entropy 4 | Entropy 5 | Entropy 6 |
1.22035151 | 1.029386659 | 0.82359602 | 0.99692385 | 0.874746461 | 1.07754384 |
Entropy 7 | Entropy 8 | Entropy 9 | Entropy 10 | Entropy 11 | Entropy 12 |
0.93065033 | 1.278111547 | 1.12947679 | 1.12073057 | 0.887906447 | 0.932229361 |
Entropy 13 | Entropy 14 | Entropy 15 | Entropy 16 | Entropy 17 | Entropy 18 |
0.82547592 | 0.948837401 | 0.85367573 | 0.83820976 | 0.894950428 | 1.001405253 |
Entropy 19 | Entropy 20 | Entropy 21 | Entropy 22 | Entropy 23 | Entropy 24 |
1.09676062 | 1.008200647 | 0.81700194 | 0.90675533 | 0.978592281 | 1.211409108 |
Entropy 25 | Entropy 26 | Entropy 27 | Entropy 28 | Entropy 29 | Entropy 30 |
1.1444441 | 0.98236792 | 1.07274302 | 0.89660664 | 0.630961017 | 0.800516014 |
Entropy 31 | Entropy 32 | Entropy 33 | Entropy 34 | ? | ? |
0.58298469 | 1.029947319 | 1.1837409 | 0.5506891 | ? | ? |
Table 11 Correlation
Correlation 3 | Correlation 4 | Correlation 5 | Correlation 6 |
0.9155325 | 0.0907211 | 0.025449 | -0.199867 | -0.036875 | -0.0596785 |
Correlation 7 | Correlation 8 | Correlation 9 | Correlation 10 | Correlation 11 | Correlation 12 |
-0.201381 | -0.014917 | -0.0417 | 0.0605341 | 0.0357446 | -0.1837574 |
Correlation 13 | Correlation 14 | Correlation 15 | Correlation 16 | Correlation 17 | Correlation 18 |
0.0062253 | -0.031564 | -0.19318 | 0.0178852 | 0.0717881 | 0.02910943 |
Correlation 19 | Correlation 20 | Correlation 21 | Correlation 22 | Correlation 23 | Correlation 24 |
0.0133719 | -0.211466 | 0.019258 | 0.016919 | -0.191599 | -0.00425 |
Correlation 25 | Correlation 26 | Correlation 27 | Correlation 28 | Correlation 29 | Correlation 30 |
-0.035369 | 0.0656479 | 0.010318 | -0.184086 | 0.0910757 | 0.04394031 |
Correlation 31 | Correlation 32 | Correlation 33 | Correlation 34 | ? | ? |
-0.222713 | -0.009445 | 0.084592 | -0.006333 | ? | ? |
Table 12 Sum average
Table 13 Difference average
Table 14 Sum entropy
Table 15 Difference entropy
Difference entropy 3 | Difference entropy 4 | Difference entropy 5 | Difference entropy 6 |
2.7717822 | 2.7334586 | 2.646782 | 2.852217 | 2.7806391 | 2.90563906 |
Difference entropy 7 | Difference entropy 8 | Difference entropy 9 | Difference entropy 10 | Difference entropy 11 | Difference entropy 12 |
2.852217 | 3.25 | 2.646782 | 2.977217 | 3 | 2.77439747 |
Difference entropy 13 | Difference entropy 14 | Difference entropy 15 | Difference entropy 16 | Difference entropy 17 | Difference entropy 18 |
2.375 | 2.7806391 | 2.774397 | 2.4746018 | 2.9056391 | 2.64678222 |
Difference entropy 19 | Difference entropy 20 | Difference entropy 21 | Difference entropy 22 | Difference entropy 23 | Difference entropy 24 |
3.0778195 | 2.6467822 | 2.608459 | 2.6467822 | 2.7743975 | 2.89939747 |
Difference entropy 25 | Difference entropy 26 | Difference entropy 27 | Difference entropy 28 | Difference entropy 29 | Difference entropy 30 |
2.6467822 | 2.7806391 | 2.774397 | 2.7743975 | 2.4362781 | 2.64939747 |
Difference entropy 31 | Difference entropy 32 | Difference entropy 33 | Difference entropy 34 | ? | ? |
2.5306391 | 2.7334586 | 3.024397 | 2.2169172 | ? | ? |
Table 16 gray average
Table 17 standard deviation
Table 18 Clustering Tendency
Table 19 homogeneity degree
A-1_1_H | A-1_2_H | A-1_3_H | A-1_4_H | A-1_5_H | A-1_6_H |
0.729353 | 0.85896 | 0.742591 | 0.811983 | 0.707436 | 0.820406 |
A-2_1_H | A-2_2_H | A-2_3_H | A-2_4_H | A-2_5_H | A-2_6_H |
0.673075 | 0.832321 | 0.804619 | 0.818736 | 0.765196 | 0.819104 |
A-3_1_H | A-3_2_H | A-3_3_H | A-3_4_H | A-3_5_H | A-3_6_H |
0.702889 | 0.770329 | 0.731664 | 0.812325 | 0.750113 | 0.823354 |
A-4_1_H | A-4_2_H | A-4_3_H | A-4_4_H | A-4_5_H | A-4_6_H |
0.630094 | 0.809826 | 0.765024 | 0.750887 | 0.722146 | 0.833379 |
A-5_1_H | A-5_2_H | A-5_3_H | A-5_4_H | A-5_5_H | A-5_6_H |
0.784226 | 0.802753 | 0.828898 | 0.824906 | 0.855482 | 0.832783 |
A-6_1_H | A-6_2_H | A-6_3_H | A-6_4_H | A-6_5_H | A-6_6_H |
0.860491 | 0.882292 | 0.902136 | 0.888263 | 0.899928 | 0.892365 |
A-7_1_H | A-7_2_H | A-7_3_H | A-7_4_H | A-7_5_H | A-7_6_H |
0.860491 | 0.872411 | 0.897529 | 0.897329 | 0.909925 | 0.913109 |
A-8_1_H | A-8_2_H | A-8_3_H | A-8_4_H | A-8_5_H | A-8_6_H |
0.720064 | 0.846891 | 0.845386 | 0.786799 | 0.819207 | 0.776603 |
B-1_1_H | B-1_2_H | B-1_3_H | B-1_4_H | B-1_5_H | B-1_6_H |
0.421003 | 0.63488 | 0.678265 | 0.758653 | 0.853534 | 0.902955 |
B-2_1_H | B-2_2_H | B-2_3_H | B-2_4_H | B-2_5_H | B-2_6_H |
0.413449 | 0.723771 | 0.737913 | 0.847828 | 0.900318 | 0.919516 |
B-3_1_H | B-3_2_H | B-3_3_H | B-3_4_H | B-3_5_H | B-3_6_H |
0.404989 | 0.67885 | 0.73714 | 0.734893 | 0.901268 | 0.88582 |
B-4_1_H | B-4_2_H | B-4_3_H | B-4_4_H | B-4_5_H | B-4_6_H |
0.379105 | 0.759241 | 0.706057 | 0.764712 | 0.862515 | 0.842974 |
B-5_1_H | B-5_2_H | B-5_3_H | B-5_4_H | B-5_5_H | B-5_6_H |
0.386228 | 0.803288 | 0.741632 | 0.777376 | 0.934154 | 0.900476 |
B-6_1_H | B-6_2_H | B-6_3_H | B-6_4_H | B-6_5_H | B-6_6_H |
0.492837 | 0.64838 | 0.84093 | 0.804451 | 0.934574 | 0.924924 |
B-7_1_H | B-7_2_H | B-7_3_H | B-7_4_H | B-7_5_H | B-7_6_H |
0.546176 | 0.825809 | 0.848521 | 0.875017 | 0.923975 | 0.925081 |
B-8_1_H | B-8_2_H | B-8_3_H | B-8_4_H | B-8_5_H | B-8_6_H |
0.341506 | 0.648599 | 0.822798 | 0.637463 | 0.89822 | 0.891833 |
Table 20 maximum probability
Table 21 energy
A-1_1_ energy | A-1_2_ energy | A-1_3_ energy | A-1_4_ energy | A-1_5_ energy | A-1_6_ energy |
0.329817 | 0.429658 | 0.269169 | 0.357082 | 0.234291 | 0.360617 |
A-2_1_ energy | A-2_2_ energy | A-2_3_ energy | A-2_4_ energy | A-2_5_ energy | A-2_6_ energy |
0.252431 | 0.346633 | 0.319042 | 0.347412 | 0.283623 | 0.357081 |
A-3_1_ energy | A-3_2_ energy | A-3_3_ energy | A-3_4_ energy | A-3_5_ energy | A-3_6_ energy |
0.253393 | 0.27921 | 0.260922 | 0.328612 | 0.279507 | 0.329821 |
A-4_1_ energy | A-4_2_ energy | A-4_3_ energy | A-4_4_ energy | A-4_5_ energy | A-4_6_ energy |
0.262432 | 0.326571 | 0.281119 | 0.268873 | 0.252395 | 0.329804 |
A-5_1_ energy | A-5_2_ energy | A-5_3_ energy | A-5_4_ energy | A-5_5_ energy | A-5_6_ energy |
0.301237 | 0.305192 | 0.347685 | 0.334801 | 0.378429 | 0.328888 |
A-6_1_ energy | A-6_2_ energy | A-6_3_ energy | A-6_4_ energy | A-6_5_ energy | A-6_6_ energy |
0.380532 | 0.397709 | 0.422509 | 0.390428 | 0.431073 | 0.399358 |
A-7_1_ energy | A-7_2_ energy | A-7_3_ energy | A-7_4_ energy | A-7_5_ energy | A-7_6_ energy |
0.380532 | 0.385016 | 0.413217 | 0.421572 | 0.4422 | 0.437127 |
A-8_1_ energy | A-8_2_ energy | A-8_3_ energy | A-8_4_ energy | A-8_5_ energy | A-8_6_ energy |
0.235615 | 0.358153 | 0.365301 | 0.285675 | 0.385935 | 0.285943 |
B-1_1_ energy | B-1_2_ energy | B-1_3_ energy | B-1_4_ energy | B-1_5_ energy | B-1_6_ energy |
0.040559 | 0.152371 | 0.145938 | 0.322172 | 0.562941 | 0.703012 |
B-2_1_ energy | B-2_2_ energy | B-2_3_ energy | B-2_4_ energy | B-2_5_ energy | B-2_6_ energy |
0.047743 | 0.224751 | 0.327096 | 0.587206 | 0.694283 | 0.746295 |
B-3_1_ energy | B-3_2_ energy | B-3_3_ energy | B-3_4_ energy | B-3_5_ energy | B-3_6_ energy |
0.036529 | 0.199446 | 0.271257 | 0.326123 | 0.673829 | 0.615674 |
B-4_1_ energy | B-4_2_ energy | B-4_3_ energy | B-4_4_ energy | B-4_5_ energy | B-4_6_ energy |
0.02904 | 0.37342 | 0.237587 | 0.371034 | 0.590966 | 0.509126 |
B-5_1_ energy | B-5_2_ energy | B-5_3_ energy | B-5_4_ energy | B-5_5_ energy | B-5_6_ energy |
0.028061 | 0.46962 | 0.284735 | 0.343905 | 0.79342 | 0.696714 |
B-6_1_ energy | B-6_2_ energy | B-6_3_ energy | B-6_4_ energy | B-6_5_ energy | B-6_6_ energy |
0.065538 | 0.159744 | 0.563571 | 0.382843 | 0.794607 | 0.754498 |
B-7_1_ energy | B-7_2_ energy | B-7_3_ energy | B-7_4_ energy | B-7_5_ energy | B-7_6_ energy |
0.113649 | 0.562378 | 0.457991 | 0.611939 | 0.732534 | 0.757845 |
B-8_1_ energy | B-8_2_ energy | B-8_3_ energy | B-8_4_ energy | B-8_5_ energy | B-8_6_ energy |
0.027273 | 0.133006 | 0.501218 | 0.126851 | 0.585713 | 0.642467 |
Table 22 Moment of inertia (contrast)
Table 23 Inverse difference moment
A-1_1_I_D_M | A-1_2_I_D_M | A-1_3_I_D_M | A-1_4_I_D_M | A-1_5_I_D_M | A-1_6_I_D_M |
0.712587 | 0.849413 | 0.720855 | 0.794639 | 0.679748 | 0.804419 |
A-2_1_I_D_M | A-2_2_I_D_M | A-2_3_I_D_M | A-2_4_I_D_M | A-2_5_I_D_M | A-2_6_I_D_M |
0.642687 | 0.818283 | 0.785718 | 0.803443 | 0.743536 | 0.803096 |
A-3_1_I_D_M | A-3_2_I_D_M | A-3_3_I_D_M | A-3_4_I_D_M | A-3_5_I_D_M | A-3_6_I_D_M |
0.687918 | 0.752437 | 0.708439 | 0.797982 | 0.727894 | 0.809125 |
A-4_1_I_D_M | A-4_2_I_D_M | A-4_3_I_D_M | A-4_4_I_D_M | A-4_5_I_D_M | A-4_6_I_D_M |
0.603872 | 0.796356 | 0.74549 | 0.729865 | 0.697074 | 0.819374 |
A-5_1_I_D_M | A-5_2_I_D_M | A-5_3_I_D_M | A-5_4_I_D_M | A-5_5_I_D_M | A-5_6_I_D_M |
0.772473 | 0.785752 | 0.813567 | 0.811672 | 0.844024 | 0.819457 |
A-6_1_I_D_M | A-6_2_I_D_M | A-6_3_I_D_M | A-6_4_I_D_M | A-6_5_I_D_M | A-6_6_I_D_M |
0.851849 | 0.87395 | 0.894326 | 0.877756 | 0.891504 | 0.883767 |
A-7_1_I_D_M | A-7_2_I_D_M | A-7_3_I_D_M | A-7_4_I_D_M | A-7_5_I_D_M | A-7_6_I_D_M |
0.851849 | 0.860903 | 0.888111 | 0.88889 | 0.902253 | 0.906185 |
A-8_1_I_D_M | A-8_2_I_D_M | A-8_3_I_D_M | A-8_4_I_D_M | A-8_5_I_D_M | A-8_6_I_D_M |
0.698492 | 0.837275 | 0.832835 | 0.767781 | 0.804973 | 0.756187 |
B-1_1_I_D_M | B-1_2_I_D_M | B-1_3_I_D_M | B-1_4_I_D_M | B-1_5_I_D_M | B-1_6_I_D_M |
0.345231 | 0.598857 | 0.655871 | 0.737716 | 0.838656 | 0.893359 |
B-2_1_I_D_M | B-2_2_I_D_M | B-2_3_I_D_M | B-2_4_I_D_M | B-2_5_I_D_M | B-2_6_I_D_M |
0.345887 | 0.702856 | 0.70873 | 0.829426 | 0.889851 | 0.911388 |
B-3_1_I_D_M | B-3_2_I_D_M | B-3_3_I_D_M | B-3_4_I_D_M | B-3_5_I_D_M | B-3_6_I_D_M |
0.3318 | 0.6468 | 0.713197 | 0.705259 | 0.893005 | 0.877113 |
B-4_1_I_D_M | B-4_2_I_D_M | B-4_3_I_D_M | B-4_4_I_D_M | B-4_5_I_D_M | B-4_6_I_D_M |
0.292172 | 0.732955 | 0.678273 | 0.739201 | 0.848489 | 0.829223 |
B-5_1_I_D_M | B-5_2_I_D_M | B-5_3_I_D_M | B-5_4_I_D_M | B-5_5_I_D_M | B-5_6_I_D_M |
0.308495 | 0.779828 | 0.724758 | 0.76087 | 0.926433 | 0.890352 |
B-6_1_I_D_M | B-6_2_I_D_M | B-6_3_I_D_M | B-6_4_I_D_M | B-6_5_I_D_M | B-6_6_I_D_M |
0.443182 | 0.620292 | 0.822765 | 0.788644 | 0.927426 | 0.917631 |
B-7_1_I_D_M | B-7_2_I_D_M | B-7_3_I_D_M | B-7_4_I_D_M | B-7_5_I_D_M | B-7_6_I_D_M |
0.496028 | 0.80542 | 0.837661 | 0.86274 | 0.917396 | 0.917402 |
B-8_1_I_D_M | B-8_2_I_D_M | B-8_3_I_D_M | B-8_4_I_D_M | B-8_5_I_D_M | B-8_6_I_D_M |
0.247864 | 0.621626 | 0.803763 | 0.608389 | 0.892866 | 0.881686 |
Table 24 Entropy
A-1_1_ entropy | A-1_2_ entropy | A-1_3_ entropy | A-1_4_ entropy | A-1_5_ entropy | A-1_6_ entropy |
0.147365 | 0.249842 | 0.809925 | 0.674705 | 0.991099 | 0.707431 |
A-2_1_ entropy | A-2_2_ entropy | A-2_3_ entropy | A-2_4_ entropy | A-2_5_ entropy | A-2_6_ entropy |
0.257654 | 0.467325 | 0.744689 | 0.634534 | 0.841402 | 0.719484 |
A-3_1_ entropy | A-3_2_ entropy | A-3_3_ entropy | A-3_4_ entropy | A-3_5_ entropy | A-3_6_ entropy |
0.257654 | 0.78344 | 0.941854 | 0.577694 | 0.86642 | 0.638637 |
A-4_1_ entropy | A-4_2_ entropy | A-4_3_ entropy | A-4_4_ entropy | A-4_5_ entropy | A-4_6_ entropy |
0.5331 | 0.435041 | 0.892007 | 0.737686 | 0.926049 | 0.527086 |
A-5_1_ entropy | A-5_2_ entropy | A-5_3_ entropy | A-5_4_ entropy | A-5_5_ entropy | A-5_6_ entropy |
0.213101 | 0.673818 | 0.609655 | 0.569882 | 0.483188 | 0.542628 |
A-6_1_ entropy | A-6_2_ entropy | A-6_3_ entropy | A-6_4_ entropy | A-6_5_ entropy | A-6_6_ entropy |
0.147365 | 0.402757 | 0.533242 | 0.542628 | 0.402109 | 0.370703 |
A-7_1_ entropy | A-7_2_ entropy | A-7_3_ entropy | A-7_4_ entropy | A-7_5_ entropy | A-7_6_ entropy |
0.147365 | 0.449194 | 0.504984 | 0.419245 | 0.339453 | 0.314577 |
A-8_1_ entropy | A-8_2_ entropy | A-8_3_ entropy | A-8_4_ entropy | A-8_5_ entropy | A-8_6_ entropy |
0.459046 | 0.485728 | 0.599459 | 0.827273 | 0.63416 | 0.862612 |
B-1_1_ entropy | B-1_2_ entropy | B-1_3_ entropy | B-1_4_ entropy | B-1_5_ entropy | B-1_6_ entropy |
1.135293 | 1.372649 | 0.989272 | 0.907689 | 0.623399 | 0.407773 |
B-2_1_ entropy | B-2_2_ entropy | B-2_3_ entropy | B-2_4_ entropy | B-2_5_ entropy | B-2_6_ entropy |
0.960201 | 1.047255 | 1.102955 | 0.80623 | 0.460222 | 0.339453 |
B-3_1_ entropy | B-3_2_ entropy | B-3_3_ entropy | B-3_4_ entropy | B-3_5_ entropy | B-3_6_ entropy |
1.226785 | 1.259216 | 0.944822 | 1.048946 | 0.364806 | 0.394297 |
B-4_1_ entropy | B-4_2_ entropy | B-4_3_ entropy | B-4_4_ entropy | B-4_5_ entropy | B-4_6_ entropy |
1.276276 | 1.092116 | 1.066606 | 1.01884 | 0.502752 | 0.594869 |
B-5_1_ entropy | B-5_2_ entropy | B-5_3_ entropy | B-5_4_ entropy | B-5_5_ entropy | B-5_6_ entropy |
1.311138 | 1.098337 | 0.707474 | 0.769707 | 0.306765 | 0.404825 |
B-6_1_ entropy | B-6_2_ entropy | B-6_3_ entropy | B-6_4_ entropy | B-6_5_ entropy | B-6_6_ entropy |
0.967183 | 1.21674 | 0.765465 | 0.833332 | 0.302827 | 0.361857 |
B-7_1_ entropy | B-7_2_ entropy | B-7_3_ entropy | B-7_4_ entropy | B-7_5_ entropy | B-7_6_ entropy |
1.010607 | 1.027496 | 0.585507 | 0.662255 | 0.277905 | 0.354045 |
B-8_1_ entropy | B-8_2_ entropy | B-8_3_ entropy | B-8_4_ entropy | B-8_5_ entropy | B-8_6_ entropy |
1.25816 | 1.235578 | 0.807947 | 0.975353 | 0.25531 | 0.599459 |
Table 25 Correlation
A-1_1_C | A-1_2_C | A-1_3_C | A-1_4_C | A-1_5_C | A-1_6_C |
0.453488 | 0.691889 | 0.611768 | 0.7148 | 0.62426 | 0.732309 |
A-2_1_C | A-2_2_C | A-2_3_C | A-2_4_C | A-2_5_C | A-2_6_C |
0.459653 | 0.735835 | 0.733302 | 0.715312 | 0.697015 | 0.732849 |
A-3_1_C | A-3_2_C | A-3_3_C | A-3_4_C | A-3_5_C | A-3_6_C |
0.495842 | 0.652304 | 0.653717 | 0.67898 | 0.656804 | 0.746605 |
A-4_1_C | A-4_2_C | A-4_3_C | A-4_4_C | A-4_5_C | A-4_6_C |
0.35345 | 0.651618 | 0.666332 | 0.599482 | 0.629691 | 0.742986 |
A-5_1_C | A-5_2_C | A-5_3_C | A-5_4_C | A-5_5_C | A-5_6_C |
0.595731 | 0.712844 | 0.721926 | 0.704375 | 0.756268 | 0.731947 |
A-6_1_C | A-6_2_C | A-6_3_C | A-6_4_C | A-6_5_C | A-6_6_C |
0.713227 | 0.774956 | 0.844911 | 0.837104 | 0.84023 | 0.840448 |
A-7_1_C | A-7_2_C | A-7_3_C | A-7_4_C | A-7_5_C | A-7_6_C |
0.713227 | 0.80898 | 0.854349 | 0.820523 | 0.864554 | 0.865475 |
A-8_1_C | A-8_2_C | A-8_3_C | A-8_4_C | A-8_5_C | A-8_6_C |
0.583363 | 0.732909 | 0.745791 | 0.723227 | 0.706181 | 0.70656 |
B-1_1_C | B-1_2_C | B-1_3_C | B-1_4_C | B-1_5_C | B-1_6_C |
-0.06096 | 0.02836 | 0.181045 | 0.07172 | -0.04784 | -0.17716 |
B-2_1_C | B-2_2_C | B-2_3_C | B-2_4_C | B-2_5_C | B-2_6_C |
-0.11077 | -0.02076 | -0.04274 | -0.16053 | -0.17078 | -0.18233 |
B-3_1_C | B-3_2_C | B-3_3_C | B-3_4_C | B-3_5_C | B-3_6_C |
-0.08765 | -0.03503 | -0.05094 | -0.15471 | -0.0704 | -0.15323 |
B-4_1_C | B-4_2_C | B-4_3_C | B-4_4_C | B-4_5_C | B-4_6_C |
-0.0882 | -0.05643 | 0.025384 | 0.173975 | -0.14326 | -0.09191 |
B-5_1_C | B-5_2_C | B-5_3_C | B-5_4_C | B-5_5_C | B-5_6_C |
-0.12153 | -0.12012 | -0.12315 | 0.022757 | -0.19152 | -0.24968 |
B-6_1_C | B-6_2_C | B-6_3_C | B-6_4_C | B-6_5_C | B-6_6_C |
0.136108 | -0.03391 | -0.20318 | 0.027926 | -0.11888 | 0.034783 |
B-7_1_C | B-7_2_C | B-7_3_C | B-7_4_C | B-7_5_C | B-7_6_C |
0.064315 | -0.18657 | -0.19499 | -0.2193 | -0.12311 | -0.09605 |
B-8_1_C | B-8_2_C | B-8_3_C | B-8_4_C | B-8_5_C | B-8_6_C |
0.185817 | -0.05458 | -0.06841 | 0.120048 | 0.005032 | -0.03155 |
Table 26 Sum average
Table 27 Difference average
A-1_1_D-m | A-1_2_D-m | A-1_3_D-m | A-1_4_D-m | A-1_5_D-m | A-1_6_D-m |
4.330357 | 2.138244 | 2.994348 | 2.142065 | 2.835198 | 1.997181 |
A-2_1_D-m | A-2_2_D-m | A-2_3_D-m | A-2_4_D-m | A-2_5_D-m | A-2_6_D-m |
4.200893 | 2.104315 | 2.19344 | 2.239163 | 2.377385 | 2.019957 |
A-3_1_D-m | A-3_2_D-m | A-3_3_D-m | A-3_4_D-m | A-3_5_D-m | A-3_6_D-m |
3.879464 | 2.75506 | 2.739019 | 2.507524 | 2.533449 | 2.016009 |
A-4_1_D-m | A-4_2_D-m | A-4_3_D-m | A-4_4_D-m | A-4_5_D-m | A-4_6_D-m |
4.889881 | 2.667708 | 2.62626 | 3.174323 | 2.70422 | 2.090096 |
A-5_1_D-m | A-5_2_D-m | A-5_3_D-m | A-5_4_D-m | A-5_5_D-m | A-5_6_D-m |
3.087798 | 2.303869 | 2.19344 | 2.302779 | 1.84492 | 2.148413 |
A-6_1_D-m | A-6_2_D-m | A-6_3_D-m | A-6_4_D-m | A-6_5_D-m | A-6_6_D-m |
2.232143 | 1.689137 | 1.21389 | 1.342886 | 1.207465 | 1.282544 |
A-7_1_D-m | A-7_2_D-m | A-7_3_D-m | A-7_4_D-m | A-7_5_D-m | A-7_6_D-m |
2.232143 | 1.553125 | 1.202153 | 1.374208 | 1.05587 | 1.044899 |
A-8_1_D-m | A-8_2_D-m | A-8_3_D-m | A-8_4_D-m | A-8_5_D-m | A-8_6_D-m |
3.361607 | 2.006845 | 1.915359 | 2.28845 | 2.008073 | 2.340365 |
B-1_1_D-m | B-1_2_D-m | B-1_3_D-m | B-1_4_D-m | B-1_5_D-m | B-1_6_D-m |
3.110119 | 1.439583 | 1.003132 | 0.857143 | 0.535334 | 0.376744 |
B-2_1_D-m | B-2_2_D-m | B-2_3_D-m | B-2_4_D-m | B-2_5_D-m | B-2_6_D-m |
3.212798 | 0.959375 | 1.100518 | 0.656178 | 0.374198 | 0.299769 |
B-3_1_D-m | B-3_2_D-m | B-3_3_D-m | B-3_4_D-m | B-3_5_D-m | B-3_6_D-m |
3.43006 | 1.245685 | 0.985383 | 1.114523 | 0.338087 | 0.364959 |
B-4_1_D-m | B-4_2_D-m | B-4_3_D-m | B-4_4_D-m | B-4_5_D-m | B-4_6_D-m |
3.641369 | 0.976935 | 1.210397 | 0.853183 | 0.549405 | 0.552794 |
B-5_1_D-m | B-5_2_D-m | B-5_3_D-m | B-5_4_D-m | B-5_5_D-m | B-5_6_D-m |
3.455357 | 0.912946 | 0.920111 | 0.751692 | 0.269593 | 0.430039 |
B-6_1_D-m | B-6_2_D-m | B-6_3_D-m | B-6_4_D-m | B-6_5_D-m | B-6_6_D-m |
2.845238 | 1.385119 | 0.744168 | 0.758785 | 0.265551 | 0.302827 |
B-7_1_D-m | B-7_2_D-m | B-7_3_D-m | B-7_4_D-m | B-7_5_D-m | B-7_6_D-m |
2.665179 | 1.012649 | 0.705357 | 0.521025 | 0.317952 | 0.29919 |
B-8_1_D-m | B-8_2_D-m | B-8_3_D-m | B-8_4_D-m | B-8_5_D-m | B-8_6_D-m |
3.28125 | 1.248958 | 0.714754 | 1.215978 | 0.300066 | 0.368643 |
Table 28 Sum entropy
A-1_1_S-E | A-1_2_S-E | A-1_3_S-E | A-1_4_S-E | A-1_5_S-E | A-1_6_S-E |
0.612069 | 1.011892 | 2.75211 | 2.559395 | 2.653717 | 2.357032 |
A-2_1_S-E | A-2_2_S-E | A-2_3_S-E | A-2_4_S-E | A-2_5_S-E | A-2_6_S-E |
0.947376 | 1.724922 | 2.397386 | 2.264199 | 2.474822 | 2.136188 |
A-3_1_S-E | A-3_2_S-E | A-3_3_S-E | A-3_4_S-E | A-3_5_S-E | A-3_6_S-E |
1.011892 | 2.539099 | 2.962306 | 1.771709 | 2.437167 | 2.382927 |
A-4_1_S-E | A-4_2_S-E | A-4_3_S-E | A-4_4_S-E | A-4_5_S-E | A-4_6_S-E |
2.028047 | 1.531374 | 2.856791 | 2.269515 | 2.485508 | 1.92766 |
A-5_1_S-E | A-5_2_S-E | A-5_3_S-E | A-5_4_S-E | A-5_5_S-E | A-5_6_S-E |
0.748327 | 2.132039 | 2.136188 | 2.055342 | 1.836225 | 2.264199 |
A-6_1_S-E | A-6_2_S-E | A-6_3_S-E | A-6_4_S-E | A-6_5_S-E | A-6_6_S-E |
0.612069 | 1.571539 | 1.838792 | 2.058986 | 1.853822 | 1.836225 |
A-7_1_S-E | A-7_2_S-E | A-7_3_S-E | A-7_4_S-E | A-7_5_S-E | A-7_6_S-E |
0.612069 | 1.911607 | 1.905091 | 1.727867 | 1.942689 | 1.612649 |
A-8_1_S-E | A-8_2_S-E | A-8_3_S-E | A-8_4_S-E | A-8_5_S-E | A-8_6_S-E |
1.914239 | 1.671989 | 2.242653 | 2.382927 | 2.41185 | 2.593117 |
B-1_1_S-E | B-1_2_S-E | B-1_3_S-E | B-1_4_S-E | B-1_5_S-E | B-1_6_S-E |
2.943925 | 3.07012 | 2.694398 | 2.619668 | 2.056985 | 1.938491 |
B-2_1_S-E | B-2_2_S-E | B-2_3_S-E | B-2_4_S-E | B-2_5_S-E | B-2_6_S-E |
2.861919 | 2.483775 | 2.804868 | 2.191245 | 1.778759 | 1.726835 |
B-3_1_S-E | B-3_2_S-E | B-3_3_S-E | B-3_4_S-E | B-3_5_S-E | B-3_6_S-E |
3.186176 | 2.732559 | 2.705314 | 2.602177 | 1.726835 | 1.726835 |
B-4_1_S-E | B-4_2_S-E | B-4_3_S-E | B-4_4_S-E | B-4_5_S-E | B-4_6_S-E |
3.366478 | 2.684184 | 2.921308 | 2.722804 | 2.083337 | 2.235981 |
B-5_1_S-E | B-5_2_S-E | B-5_3_S-E | B-5_4_S-E | B-5_5_S-E | B-5_6_S-E |
3.097308 | 2.721316 | 2.339401 | 2.260333 | 1.656175 | 1.911607 |
B-6_1_S-E | B-6_2_S-E | B-6_3_S-E | B-6_4_S-E | B-6_5_S-E | B-6_6_S-E |
3.081967 | 2.726498 | 2.359926 | 2.417582 | 1.637968 | 1.809459 |
B-7_1_S-E | B-7_2_S-E | B-7_3_S-E | B-7_4_S-E | B-7_5_S-E | B-7_6_S-E |
3.18791 | 2.696965 | 1.954305 | 2.050971 | 1.612649 | 1.72128 |
B-8_1_S-E | B-8_2_S-E | B-8_3_S-E | B-8_4_S-E | B-8_5_S-E | B-8_6_S-E |
3.471159 | 2.899189 | 2.437167 | 2.71465 | 1.466858 | 1.825372 |
Table 29 Difference entropy
A-1_1_D-E | A-1_2_D-E | A-1_3_D-E | A-1_4_D-E | A-1_5_D-E | A-1_6_D-E |
0.668564 | 1.311278 | 3 | 2.899397 | 2.875 | 2.477217 |
A-2_1_D-E | A-2_2_D-E | A-2_3_D-E | A-2_4_D-E | A-2_5_D-E | A-2_6_D-E |
1.311278 | 1.621641 | 2.82782 | 2.655639 | 3.030639 | 2.423795 |
A-3_1_D-E | A-3_2_D-E | A-3_3_D-E | A-3_4_D-E | A-3_5_D-E | A-3_6_D-E |
1.311278 | 3.125 | 2.95282 | 2.483459 | 2.521782 | 2.305037 |
A-4_1_D-E | A-4_2_D-E | A-4_3_D-E | A-4_4_D-E | A-4_5_D-E | A-4_6_D-E |
2.271782 | 2 | 3.125 | 2.858459 | 2.5 | 2.125 |
A-5_1_D-E | A-5_2_D-E | A-5_3_D-E | A-5_4_D-E | A-5_5_D-E | A-5_6_D-E |
0.993393 | 2.655639 | 2.483459 | 2.57782 | 2.555037 | 2.436278 |
A-6_1_D-E | A-6_2_D-E | A-6_3_D-E | A-6_4_D-E | A-6_5_D-E | A-6_6_D-E |
0.668564 | 2.07782 | 2.298795 | 2.649397 | 2.405639 | 2.280639 |
A-7_1_D-E | A-7_2_D-E | A-7_3_D-E | A-7_4_D-E | A-7_5_D-E | A-7_6_D-E |
0.668564 | 2.349602 | 2.375 | 2.030639 | 1.794737 | 2.07782 |
A-8_1_D-E | A-8_2_D-E | A-8_3_D-E | A-8_4_D-E | A-8_5_D-E | A-8_6_D-E |
2.375 | 2.07782 | 2.75 | 2.75 | 2.780639 | 3.030639 |
B-1_1_D-E | B-1_2_D-E | B-1_3_D-E | B-1_4_D-E | B-1_5_D-E | B-1_6_D-E |
3.155639 | 3.405639 | 2.521782 | 2.771782 | 2.521782 | 2.349602 |
B-2_1_D-E | B-2_2_D-E | B-2_3_D-E | B-2_4_D-E | B-2_5_D-E | B-2_6_D-E |
3.32782 | 2.774397 | 3.25 | 2.474602 | 2.530639 | 2.20282 |
B-3_1_D-E | B-3_2_D-E | B-3_3_D-E | B-3_4_D-E | B-3_5_D-E | B-3_6_D-E |
3.45282 | 3.108459 | 2.875 | 2.95282 | 2.07782 | 2.20282 |
B-4_1_D-E | B-4_2_D-E | B-4_3_D-E | B-4_4_D-E | B-4_5_D-E | B-4_6_D-E |
3.45282 | 3 | 3.280639 | 2.646782 | 2.655639 | 2.608459 |
B-5_1_D-E | B-5_2_D-E | B-5_3_D-E | B-5_4_D-E | B-5_5_D-E | B-5_6_D-E |
3.20282 | 3.024397 | 2.82782 | 2.477217 | 2.20282 | 2.548795 |
B-6_1_D-E | B-6_2_D-E | B-6_3_D-E | B-6_4_D-E | B-6_5_D-E | B-6_6_D-E |
3.108459 | 2.95282 | 2.852217 | 2.852217 | 2.07782 | 2.271782 |
B-7_1_D-E | B-7_2_D-E | B-7_3_D-E | B-7_4_D-E | B-7_5_D-E | B-7_6_D-E |
3.625 | 2.95282 | 2.20282 | 2.436278 | 2.305037 | 2.25 |
B-8_1_D-E | B-8_2_D-E | B-8_3_D-E | B-8_4_D-E | B-8_5_D-E | B-8_6_D-E |
3.024397 | 3.07782 | 2.75 | 3 | 1.919737 | 2.375 |
Claims (3)
1. A multidimensional texture extraction method based on brain nuclear magnetic resonance images of a population, characterized in that a region growing method is used to segment the required regions from the brain nuclear magnetic resonance images of the population.
2. The extraction method according to claim 1, characterized in that the Curvelet transform method is used to extract the texture feature parameters of the required regions, wherein the population comprises an Alzheimer's disease patient group, a mild cognitive impairment patient group and a normal elderly group; the texture feature parameters of the required regions comprise entropy, gray mean, correlation, energy, homogeneity, variance, maximum probability, inverse difference moment, cluster tendency, contrast, sum average, difference average, sum entropy and difference entropy; and the required regions comprise the entorhinal cortex and the hippocampus.
3. The extraction method according to claim 1, characterized in that the Contourlet transform method is used to extract the texture feature parameters of the required regions, wherein the population comprises an Alzheimer's disease patient group, a mild cognitive impairment patient group and a normal elderly group; the texture feature parameters of the required regions comprise entropy, gray mean, correlation, energy, homogeneity, variance, maximum probability, inverse difference moment, cluster tendency, contrast, sum average, difference average, sum entropy and difference entropy; and the required regions comprise the entorhinal cortex and the hippocampus.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410023305.4A CN103793711A (en) | 2014-01-17 | 2014-01-17 | Multidimensional vein extracting method based on brain nuclear magnetic resonance image |
PCT/CN2014/001130 WO2015106374A1 (en) | 2014-01-17 | 2014-12-16 | Multidimensional texture extraction method based on brain nuclear magnetic resonance images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410023305.4A CN103793711A (en) | 2014-01-17 | 2014-01-17 | Multidimensional vein extracting method based on brain nuclear magnetic resonance image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103793711A true CN103793711A (en) | 2014-05-14 |
Family
ID=50669354
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410023305.4A Pending CN103793711A (en) | 2014-01-17 | 2014-01-17 | Multidimensional vein extracting method based on brain nuclear magnetic resonance image |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN103793711A (en) |
WO (1) | WO2015106374A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108241865B (en) * | 2016-12-26 | 2021-11-02 | 哈尔滨工业大学 | Ultrasound image-based multi-scale and multi-subgraph hepatic fibrosis multistage quantitative staging method |
WO2019170711A1 (en) | 2018-03-07 | 2019-09-12 | Institut National De La Sante Et De La Recherche Medicale (Inserm) | Method for early prediction of neurodegenerative decline |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2277148A4 (en) * | 2008-02-29 | 2012-08-22 | Agency Science Tech & Res | A method and system for anatomy structure segmentation and modeling in an image |
CN101853493B (en) * | 2009-10-21 | 2013-06-19 | 首都医科大学 | Method for extracting multi-dimensional texture of nodi from medical images |
CN103793711A (en) * | 2014-01-17 | 2014-05-14 | 首都医科大学 | Multidimensional vein extracting method based on brain nuclear magnetic resonance image |
CN103793908A (en) * | 2014-01-17 | 2014-05-14 | 首都医科大学 | Method for constructing prediction model of multifunctional veins based on brain nuclear magnetic resonance image |
-
2014
- 2014-01-17 CN CN201410023305.4A patent/CN103793711A/en active Pending
- 2014-12-16 WO PCT/CN2014/001130 patent/WO2015106374A1/en active Application Filing
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015106373A1 (en) * | 2014-01-17 | 2015-07-23 | 首都医科大学 | Method for establishing prediction model based on multidimensional texture of brain nuclear magnetic resonance images |
WO2015106374A1 (en) * | 2014-01-17 | 2015-07-23 | 首都医科大学 | Multidimensional texture extraction method based on brain nuclear magnetic resonance images |
CN104881680A (en) * | 2015-05-25 | 2015-09-02 | 电子科技大学 | Alzheimer's disease and mild cognitive impairment identification method based on two-dimension features and three-dimension features |
CN106780372A (en) * | 2016-11-30 | 2017-05-31 | 华南理工大学 | A kind of weight nuclear norm magnetic resonance imaging method for reconstructing sparse based on Generalized Tree |
CN106780372B (en) * | 2016-11-30 | 2019-06-18 | 华南理工大学 | A kind of weight nuclear norm magnetic resonance imaging method for reconstructing sparse based on Generalized Tree |
CN109492700B (en) * | 2018-11-21 | 2020-09-08 | 西安中科光电精密工程有限公司 | Complex background target identification method based on multi-dimensional information fusion |
CN109492700A (en) * | 2018-11-21 | 2019-03-19 | 西安中科光电精密工程有限公司 | A kind of Target under Complicated Background recognition methods based on multidimensional information fusion |
CN110033019A (en) * | 2019-03-06 | 2019-07-19 | 腾讯科技(深圳)有限公司 | Method for detecting abnormality, device and the storage medium of human body |
CN110033019B (en) * | 2019-03-06 | 2021-07-27 | 腾讯科技(深圳)有限公司 | Method and device for detecting abnormality of human body part and storage medium |
CN111739645A (en) * | 2020-05-14 | 2020-10-02 | 上海依智医疗技术有限公司 | Training method of immune-related pneumonia prediction model |
CN111739645B (en) * | 2020-05-14 | 2024-01-30 | 北京深睿博联科技有限责任公司 | Training method of immune-related pneumonia prediction model |
CN111754598A (en) * | 2020-06-27 | 2020-10-09 | 昆明理工大学 | Local space neighborhood parallel magnetic resonance imaging reconstruction method based on transformation learning |
CN111754598B (en) * | 2020-06-27 | 2022-02-25 | 昆明理工大学 | Local space neighborhood parallel magnetic resonance imaging reconstruction method based on transformation learning |
CN113962930A (en) * | 2021-09-07 | 2022-01-21 | 北京邮电大学 | Alzheimer disease risk assessment model establishing method and electronic equipment |
CN113962930B (en) * | 2021-09-07 | 2022-09-09 | 北京邮电大学 | Alzheimer disease risk assessment model establishing method and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
WO2015106374A1 (en) | 2015-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103793711A (en) | Multidimensional vein extracting method based on brain nuclear magnetic resonance image | |
Montaha et al. | BreastNet18: a high accuracy fine-tuned VGG16 model evaluated using ablation study for diagnosing breast cancer from enhanced mammography images | |
US9881374B2 (en) | Method for establishing prediction model based on multidimensional texture of brain nuclear magnetic resonance images | |
Vaishnavee et al. | An automated MRI brain image segmentation and tumor detection using SOM-clustering and Proximal Support Vector Machine classifier | |
CN110534195B (en) | Alzheimer disease detection method based on data space transformation | |
CN104881680A (en) | Alzheimer's disease and mild cognitive impairment identification method based on two-dimension features and three-dimension features | |
Macin et al. | An accurate multiple sclerosis detection model based on exemplar multiple parameters local phase quantization: ExMPLPQ | |
DE102008060789A1 (en) | System and method for unmonitored detection and Gleason grading for a prostate cancer preparation (whole-mount) using NIR fluorescence | |
CN104346617A (en) | Cell detection method based on sliding window and depth structure extraction features | |
Somasundaram et al. | Brain segmentation in magnetic resonance human head scans using multi-seeded region growing | |
Yang et al. | Skull sex estimation based on wavelet transform and Fourier transform | |
Sharan et al. | Encoder modified U-net and feature pyramid network for multi-class segmentation of cardiac magnetic resonance images | |
Lahmiri | Features extraction from high frequency domain for retina digital images classification | |
Deshmukh et al. | Study of different brain tumor MRI image segmentation techniques | |
Almalki et al. | Impact of image enhancement module for analysis of mammogram images for diagnostics of breast cancer | |
Sathish et al. | Exponential cuckoo search algorithm to radial basis neural network for automatic classification in MRI images | |
CN105512670A (en) | HRCT peripheral nerve cutting based on KECA feature dimension reduction and clustering | |
Rasheed et al. | Recognizing brain tumors using adaptive noise filtering and statistical features | |
CN114492519A (en) | Lung ultrasonic special sign B-line identification and classification method based on ultrasonic echo radio frequency signals | |
Kumaraswamy et al. | Automatic prostate segmentation of magnetic resonance imaging using Res-Net | |
CN104331864B (en) | Based on the processing of the breast image of non-down sampling contourlet and the significant model of vision | |
CN105260609A (en) | Method and apparatus storing medical images | |
Aggarwal et al. | 3d discrete wavelet transform for computer aided diagnosis of A lzheimer's disease using t1‐weighted brain MRI | |
Mahdi | Computer aided diagnosis system for breast cancer using ID3 and SVM based on slantlet transform | |
CN106570880A (en) | Brain tissue MRI image segmentation method based on fuzzy clustering and Markov random field |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20140514 |
|
RJ01 | Rejection of invention patent application after publication |